Governed Integration of LLM Wiki, UAI AI Memory, UAI Project Handoff, and Cursor
Metadata
| Field | Value |
|---|---|
| Source site | llmwikis.org |
| Source URL | https://llmwikis.org/ |
| Canonical AIWikis URL | https://aiwikis.org/llmwikis/uai-system/files/raw-system-archives-llmwikis-source-site-report-preservation-2026-05-01-63ebca3b/ |
| Source reference | raw/system-archives/llmwikis/source-site-report-preservation/2026-05-01/agent-file-handoff/Archive/2026-05-01/Improvement/llmwikis-integration-promoted/Governed Integration of LLM Wiki, UAI AI Memory, UAI Project Handoff, and Cursor.md |
| File type | md |
| Content category | memory-file |
| Last fetched | 2026-05-06T17:58:24.5168382Z |
| Last changed | 2026-05-01T18:17:43.7481597Z |
| Content hash | sha256:63ebca3ba315e2186103ad25ce56e537ae4ba8268ff26a1d9f3afd9ce21b0bdc |
| Import status | unchanged |
| Raw source layer | data/sources/llmwikis/raw-system-archives-llmwikis-source-site-report-preservation-2026-05-01-agent-file-handoff-archi-63ebca3ba315.md |
| Normalized source layer | data/normalized/llmwikis/raw-system-archives-llmwikis-source-site-report-preservation-2026-05-01-agent-file-handoff-archi-63ebca3ba315.txt |
Current File Content
Structure Preview
- Governed Integration of LLM Wiki, UAI AI Memory, UAI Project Handoff, and Cursor
- Executive summary
- System profiles
- Interfaces and data contracts
- APIs and protocols
- Data models
- Authentication and authorization
- Hosting and deployment options
- Architecture and data flow
- Operational considerations
- Synchronization and conflict resolution
- Latency and scalability
- Failure modes and recovery
- Integration options and recommendation
- Implementation plan
- Recommended target architecture
- Discover the current public UAIX machine surface
- Validate a UAI-1 packet
- Milestones and deliverables
- Open questions and limitations
Raw Version
This public page shows a bounded preview of a large source file. The complete source remains in the raw and normalized source layers named in metadata, with the SHA-256 hash above for verification.
- Source characters: 32627
- Preview characters: 11483
# Governed Integration of LLM Wiki, UAI AI Memory, UAI Project Handoff, and Cursor
## Executive summary
This report evaluates how to combine four distinct but complementary systems: **LLM Wiki / LLMWikis.org** as the durable, reviewed knowledge layer; **UAI AI Memory** as the compact, portable operating-memory packet; **UAI Project Handoff** as the repository takeover pattern built around `AGENTS.md`, `readme.human`, and `.uai` files; and the **Cursor** coding agent as the active execution surface for capture, retrieval, synthesis, and presentation. UAIX’s own guidance is explicit that LLM Wiki and AI Memory solve different problems: the wiki should remain broad and background-oriented, while AI Memory bundles remain compact and decisive; Project Handoff is the transfer-focused subtype of AI Memory for work that must move safely between humans, teams, vendors, or agents.
The key architectural conclusion is that **these systems should not be tightly merged into one automatically synchronized memory plane**. UAIX says wiki memory is “background until reviewed and promoted,” and LLMWikis likewise recommends staged ingest, visible contradictions, and lint/review before durable updates. That makes a **middleware-centered architecture with event-driven jobs** the best fit: Cursor reads a small UAI package first, escalates to the LLM Wiki when deeper context is needed, stages proposed changes, and promotes only reviewed facts into AI Memory / Project Handoff artifacts; public interoperability evidence is then expressed as UAI-1 records and validated through UAIX’s published routes.
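The layered read order described above can be sketched as a small decision routine. This is a minimal illustration of the intended escalation pattern, not any published UAIX or LLMWikis API; the function and layer names are hypothetical.

```python
# Illustrative sketch of the layered read order: compact UAI packet first,
# escalation to the wiki only when the packet cannot answer the question.
# All names here are hypothetical; nothing is a published UAIX/LLMWikis API.

def choose_context_layers(needs_background: bool,
                          packet_loaded: bool) -> list[str]:
    """Return the context layers an agent should load, cheapest first."""
    layers = []
    if not packet_loaded:
        layers.append("uai_memory_packet")   # compact, decisive current state
    if needs_background:
        layers.append("llm_wiki_pages")      # durable, reviewed background
    return layers

# A routine task needs only the packet; a deep architecture question escalates.
assert choose_context_layers(False, False) == ["uai_memory_packet"]
assert choose_context_layers(True, True) == ["llm_wiki_pages"]
```

The point of the sketch is that wiki loads are an explicit escalation step, never the default, which keeps prompts small and keeps the wiki in its "background until reviewed and promoted" role.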
On interfaces, the strongest official machine surface is **UAIX’s public UAI-1 REST layer**, which publishes discovery, schemas, examples, OpenAPI, validation, mock-exchange, and conformance-pack routes. LLMWikis.org, by contrast, currently presents itself as a handbook, starter bundle, and governance model rather than a public live API or public MCP service. Cursor is the most automation-ready runtime in this set: it supports local and cloud agents, MCP servers, cloud automations, a Cloud Agents API, a TypeScript SDK, plugin manifests, hooks, and extension APIs for programmatic MCP and plugin registration.
One naming ambiguity should be noted. I did not locate a primary documentation source for a separate official product called **“Curser”** in the reviewed primary sources. The official coding-agent documentation available is for **Cursor** at cursor.com, covering agents, MCP, cloud agents, automations, plugins, SDK, and enterprise automation. This report therefore treats “Curser coding agent” as **Cursor**, which is the highest-confidence interpretation from the official materials reviewed.
## System profiles
The four systems have different responsibilities and should be modeled that way in the integration.
| System | Primary role | Official source | Core data shape | Best use |
|---|---|---|---|---|
| **LLM Wiki / LLMWikis.org** | Durable, reviewed organizational knowledge system for humans and agents. | LLMWikis handbook pages and starter guidance. | Markdown pages with frontmatter such as `title`, `owner`, `status`, `last_reviewed`, `review_cycle`, `sensitivity`, `agent_use`, and related links; plus navigational files like `README`, `INDEX`, `TRUST_MODEL`, `CHANGELOG`, `index.md`, and `log.md`. | Long-lived research, policy background, architecture knowledge, decision trails, onboarding, and source summaries. |
| **UAI AI Memory** | Portable, file-based operating context bundle. | UAIX AI Memory canonical page and Package Wizard. | Manifested bundle with fields such as `bundle_id`, lifecycle, trust boundary notes, included files, template IDs, and checksums; generated outputs can include a startup packet, system profile, receiver brief, manifest overlay, and optional `LLM_WIKI_MEMORY_PLAN.md`. | Compact current state, constraints, owners, next actions, and task-ready context across sessions and systems. |
| **UAI Project Handoff** | Repository takeover / transfer-of-ownership pattern. | UAIX Project Handoff canonical page and AGENTS.md spec background. | Root `AGENTS.md`, `readme.human`, and `.uai` records; minimum bundle includes `.uai/context.uai`, `.uai/stack.uai`, and `.uai/constraints.uai`; explicit `@uai[...]` load references. | Safe transfer of project responsibility between agents, teams, vendors, or companies. |
| **Cursor** | Active coding/automation runtime. | Cursor docs, Cloud Agents docs, SDK docs, plugin docs. | Agent runs, MCP server configs in `.cursor/mcp.json` or `~/.cursor/mcp.json`, plugin manifests, hooks, cloud runs, automations, and SDK-managed run events/artifacts. | Capturing repository context, querying external tools, staging edits, producing presentations/pages, and automating promotion pipelines. |
The most important conceptual boundary is this: **LLM Wiki is the durable knowledge base; AI Memory is the portable working packet; Project Handoff is the transfer packet; Cursor is the execution engine**. That division is not just convenient; it matches the official guidance on both UAIX and LLMWikis.
## Interfaces and data contracts
### APIs and protocols
**UAIX / UAI-1** is the most formal protocol surface in scope. UAIX describes UAI-1 as the public envelope, trust declaration, and evidence layer for AI-to-AI exchange, positioned beside MCP, A2A, orchestration, and OpenAPI rather than as a replacement for them. The current public release includes six profiles: `uai.intent.request.v1`, `uai.intent.response.v1`, `uai.capability.statement.v1`, `uai.error.v1`, `uai.conformance.result.v1`, and `uai.task.status.v1`. It also publishes a field registry, transport bindings, trust channels, an error registry, conformance levels, examples, validator behavior, and an OpenAPI surface.
UAIX’s public REST surface includes `catalog`, `discovery`, `schemas`, `registry`, `field-registry`, `transport-bindings`, `trust-channels`, `conformance-levels`, `error-registry`, `examples`, `validate`, `adoption-kit`, `mock-exchange`, `openapi.json`, `conformance-pack`, and `roadmap`. The docs explicitly instruct implementers to start from the catalog/discovery routes and then move to OpenAPI, validation, mock exchange, and conformance pack instead of reverse-engineering the public WordPress REST surface.
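Following that guidance, an implementer would walk the published route names in order. A minimal sketch, assuming a placeholder base URL (substitute the real UAIX host) and using only Python's standard library; the URL layout is an assumption, since the report names routes but not their exact paths:

```python
import json
import urllib.request

# Assumed placeholder host; replace with the real UAIX base URL before use.
BASE = "https://example-uaix-host/api/uai1"

# Route names as published; walk catalog/discovery first, per the docs.
ROUTES = ["catalog", "discovery", "schemas", "openapi.json",
          "validate", "mock-exchange", "conformance-pack"]

def route_url(name: str) -> str:
    """Build the URL for one of the published UAI-1 route names."""
    return f"{BASE}/{name}"

def validate_packet(packet: dict) -> bytes:
    """POST a packet to the validate route (network call, illustrative only)."""
    req = urllib.request.Request(
        route_url("validate"),
        data=json.dumps(packet).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

assert route_url("catalog") == f"{BASE}/catalog"
```

Keeping the route list in one place makes it easy to swap in the paths the real catalog/discovery responses advertise instead of hard-coding them.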
**LLMWikis.org** does not currently present a comparable public execution API. The handbook explicitly says its current limits include **no open editing, no public MCP, no certification, and no multilingual support**. The tooling guidance also says that protocol integrations may exist privately or in the future, but LLMWikis “does not currently claim public MCP server support.” In practical terms, the official LLM Wiki surface is a **content model, governance model, and folder/schema convention**, not a hosted write/read service contract.
**Project Handoff** is similarly file- and repo-centric rather than network-centric. Its operative protocol is local loading of `AGENTS.md`, `readme.human`, and `.uai` files, with the preferred include syntax `@uai[path]`. The Project Handoff page also distinguishes two `.uai` record profiles: a Markdown context profile for readable repository knowledge, and a JSON information profile intended for stricter machine records with fields such as `schemaVersion`, `name`, `version`, `provenance`, `links`, and optional `checksum` and `signature`. citeturn14view1turn14view2turn15view5
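A JSON information record using those field names might be built as follows. This is a hypothetical sketch: the field names come from the Project Handoff page, but the value formats and the checksum convention (sha256 over canonical JSON) are assumptions, not mandated by the spec.

```python
import hashlib
import json

# Hypothetical minimal `.uai` JSON information record; field names are from
# the Project Handoff profile, values and formats are illustrative.
record = {
    "schemaVersion": "1",
    "name": "context",
    "version": "0.1.0",
    "provenance": {"author": "handoff-team", "created": "2026-05-01"},
    "links": ["AGENTS.md", "readme.human"],
}

# The optional checksum can bind the record body; sha256 over a canonical
# (sorted-keys) JSON serialization is one reasonable convention.
body = json.dumps(record, sort_keys=True).encode("utf-8")
record["checksum"] = "sha256:" + hashlib.sha256(body).hexdigest()

assert record["checksum"].startswith("sha256:")
assert len(record["checksum"]) == len("sha256:") + 64
```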
**Cursor** provides the runtime integration surface you would actually use. Officially documented capabilities include local and cloud agents, MCP client support, cloud automations, a run-based Cloud Agents REST API, a TypeScript SDK, plugin manifests, hooks, and extension APIs. Cursor’s MCP docs show server config fields such as `command`, `args`, `env`, `envFile`, `url`, and `headers`, as well as project-level `.cursor/mcp.json` and global `~/.cursor/mcp.json`. Cursor also documents extension APIs for programmatic registration of MCP servers and plugin paths, which is particularly relevant if you want a standardized UAIX/LLM Wiki bridge distributed as an internal extension or plugin.
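Using those documented config fields, a project-level `.cursor/mcp.json` registering a bridge server might look like this; the server name, command path, and environment variable are illustrative placeholders, not part of any shipped tool:

```json
{
  "mcpServers": {
    "uaix-wiki-bridge": {
      "command": "node",
      "args": ["./tools/uaix-bridge/index.js"],
      "env": {
        "UAIX_BASE_URL": "https://example-uaix-host/api/uai1"
      }
    }
  }
}
```

Checking this file into the repository makes the bridge available to every agent run against that project, while `~/.cursor/mcp.json` would make it global to the machine.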
### Data models
At the UAI-1 message layer, the shared envelope keeps `uai_version`, `profile`, `message_id`, `source`, `target`, `conversation`, `delivery`, `trust`, `body`, `provenance`, `integrity`, and `extensions` explicit. UAIX stresses that every public packet should preserve identity/direction, workflow continuity, trust context, business meaning, and auditability on the record.
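A skeleton of that envelope, with every listed field present, can be written as a plain dictionary. The field names are from the report; all values are placeholders, and real packets would follow the published schemas and field registry rather than this sketch.

```python
# Skeleton of the shared UAI-1 envelope; field names are from the spec
# summary above, every value is a placeholder.
envelope = {
    "uai_version": "1",
    "profile": "uai.intent.request.v1",
    "message_id": "msg-0001",
    "source": {"id": "agent-a"},        # identity/direction
    "target": {"id": "agent-b"},
    "conversation": {"id": "conv-42"},  # workflow continuity
    "delivery": {},
    "trust": {},                        # trust context
    "body": {},                         # business meaning
    "provenance": {},                   # auditability
    "integrity": {},
    "extensions": {},
}

REQUIRED = {"uai_version", "profile", "message_id", "source", "target",
            "conversation", "delivery", "trust", "body", "provenance",
            "integrity", "extensions"}
assert set(envelope) == REQUIRED
```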
At the **AI Memory** layer, the data model is the bundle manifest plus the file deck. The published Project AI Memory starter manifest includes `bundle_id`, `name`, `description`, `intended_use_case`, `lifecycle`, `download_filename`, `trust_boundary_notes`, `included_files`, `shared_files`, `bundle_specific_files`, `optional_sections`, `overlays`, and per-file metadata including `template_id`, source IDs, byte counts, and `sha256` hashes. That is a strong fit for checksum-based drift detection and reproducible packaging.
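Checksum-based drift detection against such a manifest is straightforward: recompute each bundled file's sha256 and compare it to the recorded hash. The sketch below simplifies the manifest to a path-to-hash mapping rather than the full per-file metadata shape.

```python
import hashlib
from pathlib import Path

# Sketch of checksum-based drift detection: recompute each bundled file's
# sha256 and compare against the hash recorded in the manifest.
# `manifest` maps bundle-relative paths to expected hex digests (a
# simplification of the published per-file metadata).
def detect_drift(bundle_dir: Path, manifest: dict[str, str]) -> list[str]:
    drifted = []
    for rel_path, expected in manifest.items():
        actual = hashlib.sha256((bundle_dir / rel_path).read_bytes()).hexdigest()
        if actual != expected:
            drifted.append(rel_path)
    return drifted
```

Running this in a scheduled job (or a Cursor automation) turns silent divergence between a packaged bundle and its source files into an explicit, reviewable event.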
At the **Project Handoff** layer, the data model is slightly richer at the repository boundary: the minimum useful bundle includes `AGENTS.md`, `readme.human`, `.uai/context.uai`, `.uai/stack.uai`, and `.uai/constraints.uai`, with additional typed artifacts like `HANDOFF_BRIEF.md`, `.uai/architecture.uai`, `.uai/progress.uai`, `.uai/operations.uai`, `.uai/test-plan.uai`, and `.uai/decisions.uai` available in the Project Handoff starter. The official intent is to keep takeover context both human-readable and machine-loadable.
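That minimum bundle lends itself to a simple completeness check before any handoff is accepted. The helper below is illustrative, not an official tool; only the file list comes from the Project Handoff page.

```python
from pathlib import Path

# The minimum Project Handoff bundle as named by the spec.
MINIMUM_BUNDLE = [
    "AGENTS.md",
    "readme.human",
    ".uai/context.uai",
    ".uai/stack.uai",
    ".uai/constraints.uai",
]

def missing_handoff_files(repo_root: Path) -> list[str]:
    """Return the minimum-bundle files absent from the repository root."""
    return [p for p in MINIMUM_BUNDLE if not (repo_root / p).is_file()]
```

Wiring this into CI or a pre-handoff hook makes "the bundle is incomplete" a blocking, machine-checkable condition rather than a reviewer's judgment call.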
At the **LLM Wiki** layer, the formally recommended page metadata includes `title`, `owner`, `status`, `last_reviewed`, `review_cycle`, `audience`, `sensitivity`, `agent_use`, `human_review_required_for_updates`, and `related`. The wiki is also structured around distinct lifecycle states such as raw source, source summary, semantic page, procedural page, contradiction record, and archive. This data model is suitable for deep memory, but intentionally heavier than the compact AI Memory / Project Handoff packet.
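A wiki page using the recommended metadata fields might open with frontmatter like this; the field names are from the handbook, while all the values are illustrative:

```yaml
---
title: Payment service architecture
owner: platform-team
status: reviewed
last_reviewed: 2026-04-15
review_cycle: quarterly
audience: engineering
sensitivity: internal
agent_use: allowed
human_review_required_for_updates: true
related:
  - payment-service-decisions.md
---
```

Because `status`, `last_reviewed`, and `human_review_required_for_updates` are machine-readable, a promotion pipeline can refuse to treat any page as durable memory until those fields pass lint.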
### Authentication and authorization
Why This File Exists
This is a memory-system evidence file from llmwikis.org. It is shown here because AIWikis.org is demonstrating the real source files that make the UAIX / LLM Wiki memory system work, not only summarizing those systems after the fact.
Role
This file is memory-system evidence. It records source history, archive transfer, intake disposition, or another piece of provenance that should be retrievable without becoming an unsupported public claim.
Structure
The file is structured around these visible headings: Governed Integration of LLM Wiki, UAI AI Memory, UAI Project Handoff, and Cursor; Executive summary; System profiles; Interfaces and data contracts; APIs and protocols; Data models; Authentication and authorization; Hosting and deployment options. Those headings are retrieval anchors: a crawler or LLM can decide whether the file is relevant before reading every line.
Prompt-Size And Retrieval Benefit
Keeping this material in a separate file reduces prompt pressure because an agent can load this exact unit only when its role, source site, category, or hash is relevant. The surrounding index pages point to it, while this page preserves the full content for audit and exact recall.
How To Use It
- Humans should read the metadata first, then inspect the raw content when they need exact wording or provenance.
- LLMs and agents should use the source site, category, hash, headings, and related files to decide whether this file belongs in the active prompt.
- Crawlers should treat the AIWikis page as transparent evidence and follow the source URL/source reference for authority boundaries.
- Future maintainers should regenerate this page whenever the source hash changes, then review the explanation if the role or structure changed.
Update Requirements
When this source file changes, update the raw source layer, normalized source layer, hash history, this rendered page, generated explanation, source-file inventory, changed-files report, and any source-section index that links to it.
Related Pages
Provenance And History
- Current observation: 2026-05-06T17:58:24.5168382Z
- Source origin: current-source-workspace
- Retrieval method: local-source-workspace
- Duplicate group: sfg-263 (primary)
- Historical hash records are stored in `data/hashes/source-file-history.jsonl`.
Machine-Readable Metadata
{
"title": "Governed Integration of LLM Wiki, UAI AI Memory, UAI Project Handoff, and Cursor",
"source_site": "llmwikis.org",
"source_url": "https://llmwikis.org/",
"canonical_url": "https://aiwikis.org/llmwikis/uai-system/files/raw-system-archives-llmwikis-source-site-report-preservation-2026-05-01-63ebca3b/",
"source_reference": "raw/system-archives/llmwikis/source-site-report-preservation/2026-05-01/agent-file-handoff/Archive/2026-05-01/Improvement/llmwikis-integration-promoted/Governed Integration of LLM Wiki, UAI AI Memory, UAI Project Handoff, and Cursor.md",
"file_type": "md",
"content_category": "memory-file",
"content_hash": "sha256:63ebca3ba315e2186103ad25ce56e537ae4ba8268ff26a1d9f3afd9ce21b0bdc",
"last_fetched": "2026-05-06T17:58:24.5168382Z",
"last_changed": "2026-05-01T18:17:43.7481597Z",
"import_status": "unchanged",
"duplicate_group_id": "sfg-263",
"duplicate_role": "primary",
"related_files": [
],
"generated_explanation": true,
"explanation_last_generated": "2026-05-06T17:58:24.5168382Z"
}