LlmWikis Project Context
LlmWikis is a WordPress Studio workspace for building LlmWikis.org, a public handbook for creating, structuring, maintaining, and auditing personal and team LLM Wikis.
Metadata
| Field | Value |
|---|---|
| Source site | aiwikis.org |
| Source URL | https://aiwikis.org/ |
| Canonical AIWikis URL | https://aiwikis.org/files/aiwikis/raw-system-archives-llmwikis-recent-work-sweep-2026-05-03-uai-context-ua-524b2a65/ |
| Source reference | raw/system-archives/llmwikis/recent-work-sweep/2026-05-03/.uai/context.uai |
| File type | uai |
| Content category | uai-system |
| Last fetched | 2026-05-03T02:48:13.1276041Z |
| Last changed | 2026-05-03T02:24:42.9230622Z |
| Content hash | sha256:524b2a659ec595367591aabcd59501ba98f92cb321f4b6cdc6b476a1048a116d |
| Import status | new |
| Raw source layer | data/sources/aiwikis/raw-system-archives-llmwikis-recent-work-sweep-2026-05-03-uai-context-uai-524b2a659ec5.uai |
| Normalized source layer | data/normalized/aiwikis/raw-system-archives-llmwikis-recent-work-sweep-2026-05-03-uai-context-uai-524b2a659ec5.txt |
Current File Content
Structure Preview
- LlmWikis Project Context
- What This Is
- Source Inputs
- Primary Goal
- Audience
- Current Public Truth
- Success Criteria
Raw Version
Local absolute paths are redacted in this public view. The source hash and source-side raw layer are based on the unredacted source file.
---
uaix: "1.0"
type: context
title: "LlmWikis Project Context"
created: "2026-04-26"
updated: "2026-05-03"
author: "LlmWikis maintainers"
version: 17
---
# LlmWikis Project Context
## What This Is
LlmWikis is a WordPress Studio workspace for building LlmWikis.org, a public handbook for creating, structuring, maintaining, and auditing personal and team LLM Wikis.
The prelaunch production posture is handbook-first. LlmWikis teaches the reusable LLM Wiki pattern: immutable `raw/` sources, a compiled markdown `wiki/`, a compact schema file such as `AGENTS.md` or `CLAUDE.md`, deterministic `index.md` and `log.md` navigation, and the `ingest` / `query` / `lint` operating loop. UAIX remains the canonical UAI-1 standards publication site, and UAIX/UAI content now belongs to the non-normative protocol case-study track.
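The layered layout above lends itself to a mechanical lint pass. The path names (`raw/`, `wiki/`, `AGENTS.md`, `index.md`, `log.md`) come from this document; the check logic below is an illustrative sketch, not LlmWikis tooling.

```python
from pathlib import Path

# Entries an LLM Wiki workspace is expected to contain, per the
# pattern described above. The list itself is an assumption for
# illustration; real tooling may check more.
REQUIRED = ["raw", "wiki", "AGENTS.md", "wiki/index.md", "wiki/log.md"]

def lint_layout(root: str) -> list[str]:
    """Return the layout entries missing from an LLM Wiki workspace."""
    base = Path(root)
    return [entry for entry in REQUIRED if not (base / entry).exists()]
```

A wrapper like this would be one concrete form of the `lint` step in the `ingest` / `query` / `lint` loop: run it before a session and refuse to proceed while the returned list is non-empty.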
## Source Inputs
The former repository `docs/` Markdown planning set has been migrated and retired. Durable source value now lives in:
- `.uai/strategy.uai` for internal strategy, source-document disposition, and roadmap gates
- `.uai/progress.uai`, `.uai/public-surface.uai`, `.uai/architecture.uai`, `.uai/stack.uai`, and `.uai/operations.uai` for current project state
- Public handbook routes for LLM Wiki architecture, operations, navigation, schema engineering, examples, and templates
- Promoted public integration routes for using LLM Wiki with Codex, using LLM Wiki with agents and orchestration layers, aligning LLM Wiki with UAI AI Memory and UAI Project Handoff, and review-gated publication
- Internal improvement/backlog state for article-quality audits, UI/UX fixes, SEO reports, and other site-improvement work
- `/reports/using-llm-wiki-with-uai/` for the public non-normative UAI integration report
- The public UAIX Project Handoff pattern used by `[local path redacted]`
- Root `readme.human` as the human-facing Project Handoff companion that tells people what the AI sees, protects, and needs clarified before work is steered
- Chat-start local file intake through `agent-file-handoff/Content/` for candidate public/editorial material and `agent-file-handoff/Improvement/` for audits, QA findings, SEO reports, bug notes, and site-fix work; `.uai/file-handoff.uai`, `.uai/intake-index.uai`, and `scripts/Invoke-UaiFileIntake.ps1` keep both buckets visible during AGENTS.md load. Already-dispositioned files belong in `agent-file-handoff/Archive/`, which routine AI intake ignores unless a human explicitly names an archived file or moves it back into an active bucket. Any `needs-agent-review` files must be inspected, summarized, and dispositioned before unrelated broad work, while watcher/daemon, queue-folder, manifest, always-on background processes, and bucket-local README instructions are discouraged so the pattern stays portable.
- AIWikis long-term memory for cold context that should not be loaded by default. The 2026-05-01 LlmWikis pre-slim handoff snapshot is preserved in `[local path redacted]` with compiled retrieval nodes in AIWikis `wiki/`.
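The chat-start intake rule above (scan the two active buckets, never touch `Archive` unless a human names a file) can be sketched in a few lines. The bucket paths come from this document; the sweep logic is a hypothetical illustration, not the behavior of `scripts/Invoke-UaiFileIntake.ps1`.

```python
from pathlib import Path

# Buckets swept at chat start, per the intake pattern above.
ACTIVE_BUCKETS = ["agent-file-handoff/Content", "agent-file-handoff/Improvement"]
# Archive is deliberately absent: routine intake ignores it unless a human
# explicitly names an archived file or moves it back into an active bucket.

def sweep_intake(root: str) -> dict[str, list[str]]:
    """List files waiting in each active bucket; Archive is never scanned."""
    base = Path(root)
    return {
        bucket: sorted(p.name for p in (base / bucket).glob("*") if p.is_file())
        for bucket in ACTIVE_BUCKETS
        if (base / bucket).is_dir()
    }
```

Keeping the sweep stateless and on-demand matches the stated preference against watchers, daemons, and always-on background processes: the portable version is just a function an agent calls once per session.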
## Primary Goal
Make LlmWikis.org the practical public reference for LLM Wiki creation while keeping source-backed coverage, contributor guardrails, crawler discovery, and UAIX authority boundaries clear.
## Audience
- Personal knowledge-base builders
- Team documentation maintainers
- Developers building AI-maintained markdown knowledge bases
- Technical writers and documentation architects designing AI-readable docs
- Governance and safety reviewers who need source-backed pages and review states
- Tool builders mapping protocols, registries, and agent workflows as case studies
## Current Public Truth
- LlmWikis is now handbook-first for LLM Wiki creation.
- LlmWikis.org is being prepared for first public launch; do not treat obsolete prelaunch routes as public SEO history.
- LlmWikis public copy should not attribute the LLM Wiki concept by creator name; it may link directly to the foundational idea file as a reference.
- LlmWikis is not the UAI-1 canonical standards site.
- UAIX.org is canonical for UAI-1 specs, schemas, registry, validator behavior, roadmap, governance, and Project Handoff guidance.
- AIWikis is cold memory and provenance for processed source material; it is not the LlmWikis handbook source and not UAIX authority.
- LlmWikis operates as a human-readable, non-normative explainer and catalog surface.
- Staged improvement reports are not public truth by default; they become public copy only after source review, authority checks, sensitive-data review, targeted verification, and explicit promotion.
- Live benchmark integrations, automated literature ingestion, public MCP access, broad public editing, monetization, grants, and multilingual support are planned or candidate work, not current support.
- Public agentic orchestration and support-escalation guidance is current as a handbook pattern for working around existing runtimes. It does not mean LlmWikis ships a public MCP server, A2A implementation, write API, trace exporter, live eval integration, managed runtime, official adapter, SDK, CLI, certification, support service, or endorsement.
## Success Criteria
- The site loads locally and presents a coherent launch-ready public surface.
- Readers can tell what LlmWikis is, what UAIX owns, and what remains future work.
- The home page exposes Start Here, Architecture, Operations, Navigation, Schema Engineering, Examples, and Protocols and Case Studies paths.
- The public site includes Reports for long-form analysis, while article-quality audits and other site-improvement reports stay internal unless deliberately converted into reader-facing content.
- Readers can learn the three-layer architecture, ingest/query/lint loop, and index.md/log.md navigation without private guidance.
- AI crawlers can find the root `llms.txt` content map.
- A new AI can load `AGENTS.md`, `readme.human`, and `.uai` files and work without private chat history.
- Files dropped by humans, other AI systems, or adjacent tools are processed into local intake records during AGENTS.md-triggered chat-start intake and reviewed immediately before unrelated broad AI work, without becoming public content automatically.
- Hot handoff context stays concise enough for routine agent pickup, while old history remains recoverable from AIWikis cold memory.
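The `llms.txt` criterion above can be pictured with a minimal content map. The section names echo the home-page paths listed earlier; the URLs and descriptions are hypothetical placeholders, not published LlmWikis.org routes.

```text
# LlmWikis

> A public handbook for creating, structuring, maintaining, and auditing
> personal and team LLM Wikis.

## Start Here
- [Start Here](https://llmwikis.org/start-here/): orientation for new readers

## Architecture
- [Three-Layer Architecture](https://llmwikis.org/architecture/): raw sources, compiled wiki, schema file

## Operations
- [Ingest / Query / Lint](https://llmwikis.org/operations/): the routine operating loop
```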
Why This File Exists
This is a UAI AI Memory handoff file from aiwikis.org. It is shown here because AIWikis.org is demonstrating the real source files that make the UAIX / LLM Wiki memory system work, not only summarizing those systems after the fact.
Role
This .uai file is a compact context packet. It keeps one kind of durable project truth separate from the rest of the archive so an agent can load the topic it needs without pulling the whole project history into prompt context.
Structure
The file is structured around these visible headings: LlmWikis Project Context; What This Is; Source Inputs; Primary Goal; Audience; Current Public Truth; Success Criteria. Those headings are retrieval anchors: a crawler or LLM can decide whether the file is relevant before reading every line.
Prompt-Size And Retrieval Benefit
Keeping this material in a separate file reduces prompt pressure because an agent can load this exact unit only when its role, source site, category, or hash is relevant. The surrounding index pages point to it, while this page preserves the full content for audit and exact recall.
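The load-only-when-relevant decision described above can be sketched as a small predicate. The field names mirror this page's machine-readable metadata; the matching rule is an illustrative assumption, not a documented AIWikis algorithm.

```python
# Decide whether a context packet belongs in the active prompt by
# matching query terms against its retrieval anchors: title,
# content category, and visible headings.
def is_relevant(meta: dict, query_terms: set[str]) -> bool:
    """Return True if any query term appears in the packet's anchors."""
    haystack = " ".join([
        meta.get("title", ""),
        meta.get("content_category", ""),
        " ".join(meta.get("headings", [])),
    ]).lower()
    return any(term.lower() in haystack for term in query_terms)
```

An agent that gates every candidate file through a check like this loads only the units whose anchors match its current task, which is the prompt-pressure benefit the paragraph above describes.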
How To Use It
- Humans should read the metadata first, then inspect the raw content when they need exact wording or provenance.
- LLMs and agents should use the source site, category, hash, headings, and related files to decide whether this file belongs in the active prompt.
- Crawlers should treat the AIWikis page as transparent evidence and follow the source URL/source reference for authority boundaries.
- Future maintainers should regenerate this page whenever the source hash changes, then review the explanation if the role or structure changed.
Update Requirements
When this source file changes, update the raw source layer, normalized source layer, hash history, this rendered page, generated explanation, source-file inventory, changed-files report, and any source-section index that links to it.
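The hash-history step in the update flow above can be sketched as follows. The `data/hashes/source-file-history.jsonl` path comes from this page; the record fields and change-detection rule are illustrative assumptions.

```python
import hashlib
import json
import time
from pathlib import Path

def record_hash(source: Path, history: Path) -> str:
    """Compute the source hash and append a history record if it changed."""
    digest = "sha256:" + hashlib.sha256(source.read_bytes()).hexdigest()
    last = None
    if history.exists():
        lines = history.read_text().splitlines()
        if lines:
            last = json.loads(lines[-1]).get("content_hash")
    if digest != last:
        # Append-only JSONL keeps the full hash history recoverable.
        entry = {
            "source": str(source),
            "content_hash": digest,
            "observed": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        }
        with history.open("a") as fh:
            fh.write(json.dumps(entry) + "\n")
    return digest
```

Running this before regeneration gives a cheap trigger: when the returned digest differs from the page's recorded `content_hash`, the rendered page, generated explanation, and linking indexes are due for an update.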
Related Pages
Provenance And History
- Current observation: 2026-05-03T02:48:13.1276041Z
- Source origin: current-source-workspace
- Retrieval method: local-source-workspace
- Duplicate group: sfg-150 (primary)
- Historical hash records are stored in `data/hashes/source-file-history.jsonl`.
Machine-Readable Metadata
{
  "title": "LlmWikis Project Context",
  "source_site": "aiwikis.org",
  "source_url": "https://aiwikis.org/",
  "canonical_url": "https://aiwikis.org/files/aiwikis/raw-system-archives-llmwikis-recent-work-sweep-2026-05-03-uai-context-ua-524b2a65/",
  "source_reference": "raw/system-archives/llmwikis/recent-work-sweep/2026-05-03/.uai/context.uai",
  "file_type": "uai",
  "content_category": "uai-system",
  "content_hash": "sha256:524b2a659ec595367591aabcd59501ba98f92cb321f4b6cdc6b476a1048a116d",
  "last_fetched": "2026-05-03T02:48:13.1276041Z",
  "last_changed": "2026-05-03T02:24:42.9230622Z",
  "import_status": "new",
  "duplicate_group_id": "sfg-150",
  "duplicate_role": "primary",
  "related_files": [],
  "generated_explanation": true,
  "explanation_last_generated": "2026-05-03T02:48:13.1276041Z"
}