AIWikis.org

LLMWikis.org Source Memory Guide

This page explains how AIWikis.org should remember LLMWikis.org in detail without turning AIWikis into the handbook source.

Metadata

| Field | Value |
| --- | --- |
| Source site | aiwikis.org |
| Source URL | https://aiwikis.org/ |
| Canonical AIWikis URL | https://aiwikis.org/files/aiwikis/content-pages-019-llmwikis-source-memory-guide-md-d298e5ef/ |
| Source reference | content/pages/019-llmwikis-source-memory-guide.md |
| File type | md |
| Content category | public-content |
| Last fetched | 2026-05-03T02:48:13.1276041Z |
| Last changed | 2026-05-03T02:34:56.3075156Z |
| Content hash | sha256:d298e5ef07ccd106613695877e896c087077f8392f0b14b2990ca213dda2c8b8 |
| Import status | changed |
| Raw source layer | data/sources/aiwikis/content-pages-019-llmwikis-source-memory-guide-md-d298e5ef07cc.md |
| Normalized source layer | data/normalized/aiwikis/content-pages-019-llmwikis-source-memory-guide-md-d298e5ef07cc.txt |

Current File Content

Structure Preview

  • LLMWikis.org Source Memory Guide
  • What LLMWikis Owns
  • Public Handbook Shape
  • LLM Wiki Operating Pattern
  • Hot And Cold Memory Lesson
  • Source-Site Outcome Routing
  • Report Preservation Audit
  • Recent Agentic Work Sweep
  • Starter Bundle Boundary
  • Relationship To UAIX
  • What AIWikis Preserves
  • Update Triggers
  • Do Not Claim
  • AI-Agent Use

Raw Version

Local absolute paths are redacted in this public view. The source hash and source-side raw layer are based on the unredacted source file.

---
title: "LLMWikis.org Source Memory Guide"
slug: "llmwikis/source-memory-guide"
status: "working-draft"
trust_level: "reviewed"
source_status: "AIWikis guide to LLMWikis.org source memory; LLMWikis.org remains the handbook source"
audience: "humans, ai-agents, maintainers"
---

# LLMWikis.org Source Memory Guide

This page explains how AIWikis.org should remember LLMWikis.org in detail without turning AIWikis into the handbook source.

LLMWikis.org remains the handbook source for LLM Wiki structure, starter templates, metadata, trust labels, governance, source policy, AI-agent operating rules, build process, and practical guidance. AIWikis documents the pattern in use, preserves source-file evidence, records dogfood lessons, and connects LLM Wiki practice to UAIX AI Memory and UAI Project Handoff boundaries.

## What LLMWikis Owns

LLMWikis.org explains how to build and operate an LLM Wiki: a source-governed knowledge system that keeps raw inputs, compiled wiki pages, metadata, indexes, logs, and agent rules organized for humans and AI agents.

Its public handbook role includes:

- What an LLM Wiki is and why it exists.
- How to structure `raw/`, `wiki/`, `config/`, indexes, logs, and metadata.
- How source policy, trust labels, review state, privacy posture, and governance work.
- How AI agents should read, ingest, query, lint, and preserve evidence.
- How starter bundles should be shaped and attributed.
- How LLM Wikis differ from RAG systems, broad AI Memory, and compact handoff packets.

AIWikis should explain and demonstrate these patterns, but source claims about the pattern should point back to LLMWikis.org.

## Public Handbook Shape

Use LLMWikis.org first when a claim needs handbook authority.

| Need | Canonical LLMWikis Route |
| --- | --- |
| Concept | `https://llmwikis.org/what-is-an-llm-wiki/` |
| Why it matters | `https://llmwikis.org/why-llm-wikis/` |
| Build sequence | `https://llmwikis.org/how-to-build-an-llm-wiki/` |
| Structure | `https://llmwikis.org/llm-wiki-structure/` |
| Starter bundle | `https://llmwikis.org/starter-template/` |
| Content types | `https://llmwikis.org/content-types/` |
| Trust model | `https://llmwikis.org/trust-model/` |
| Metadata | `https://llmwikis.org/metadata-standard/` |
| AI-agent rules | `https://llmwikis.org/for-ai-agents/` |
| LLM Wiki versus RAG | `https://llmwikis.org/llm-wiki-vs-rag/` |
| LLM Wiki versus AI Memory | `https://llmwikis.org/llm-wiki-vs-ai-memory/` |
| Checklist | `https://llmwikis.org/checklist/` |
| Maturity model | `https://llmwikis.org/maturity-model/` |
| Security and privacy | `https://llmwikis.org/security-and-privacy/` |
| Roadmap | `https://llmwikis.org/roadmap/` |

The handbook also includes operational routes for architecture, ingest, query, lint, navigation, schema engineering, examples, and using LLM Wiki with UAI.

## LLM Wiki Operating Pattern

AIWikis should remember the LLM Wiki pattern as an operating loop, not just a documentation folder.

The core loop is:

1. Preserve raw source inputs with provenance.
2. Compile reviewed wiki nodes with metadata, trust labels, and citations.
3. Maintain indexes and logs so humans and AI agents can retrieve small slices.
4. Lint structure, links, metadata, and graph records.
5. Promote only compact active truth into handoff packets.
6. Keep raw archives and long reasoning history out of public pages until reviewed.

AIWikis dogfoods this with `raw/`, `wiki/`, `config/`, `config/graph.jsonld`, `tools/wiki-lint.ps1`, source-side nodes, generated reports, and public-safe reviewed pages.
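Step 4 of the loop (lint structure, links, metadata) can be sketched as a minimal front-matter check. This is a hypothetical illustration: the required field names are assumptions drawn from this page's own frontmatter, and the real linter is `tools/wiki-lint.ps1`, not this snippet.

```python
import pathlib

# Assumed required front-matter keys; the actual LLMWikis metadata standard governs these.
REQUIRED_FIELDS = {"title", "slug", "status", "trust_level"}

def lint_wiki_page(path: pathlib.Path) -> list[str]:
    """Return lint errors for one compiled wiki node's YAML front matter."""
    text = path.read_text(encoding="utf-8")
    if not text.startswith("---"):
        return [f"{path.name}: missing front matter block"]
    # Front matter is the text between the first two '---' delimiters.
    front = text.split("---", 2)[1]
    keys = {line.split(":", 1)[0].strip() for line in front.splitlines() if ":" in line}
    return [
        f"{path.name}: missing required field '{field}'"
        for field in sorted(REQUIRED_FIELDS - keys)
    ]
```

A lint pass like this runs over every file in `wiki/` before promotion, so humans and agents can trust that retrieval metadata is present on each node.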

## Hot And Cold Memory Lesson

The 2026-05-01 LlmWikis memory reorganization turned the handbook advice back on the source site. Long progress history and pre-slim handoff snapshots were copied to AIWikis raw memory first, then LlmWikis active handoff files were reduced to current operating truth.

Use that as the source-side rule:

- Keep `AGENTS.md`, `readme.human`, and loaded `.uai` files short enough for routine agent pickup.
- Keep historical build notes, old package lessons, pre-slim snapshots, and long route-strategy background in AIWikis unless a current task needs the original wording.
- Promote facts back into LlmWikis only through reviewed public handbook copy, `.uai` state, code, tests, release notes, package evidence, or roadmap state.

The transfer evidence is `raw/system-archives/2026-05-01-llmwikis-internal-memory-reorg.json`. The compiled retrieval nodes are `wiki/sources/llmwikis-internal-memory-reorg-2026-05-01.md`, `wiki/concepts/llmwikis-active-context-diet.md`, and `wiki/logs/llmwikis-internal-memory-reorg-2026-05-01.md`.

## Source-Site Outcome Routing

LLMWikis.org Improvement processing does not automatically publish AIWikis.org changes.

The 2026-05-01 integration reports exposed this gap. LLMWikis correctly turned the reports into staged review artifacts, but AIWikis needed a separate outcome-routing pass before public AIWikis pages, long-memory records, or package evidence changed.

Use the routing rule:

1. LLMWikis processes reports and records dispositions.
2. LLMWikis keeps staged drafts review-gated until promoted.
3. AIWikis records a public-safe outcome only when the work affects cross-site memory or the human explicitly asks for AIWikis visibility.
4. AIWikis preserves raw or staged source artifacts only with explicit acceptance for long memory.
5. AIWikis does not treat staged LLMWikis drafts as handbook truth until LLMWikis promotes them through its normal content path.

Read Source-Site Outcome Routing (source-relative: ./031-source-site-outcome-routing.md) for the public workflow.

## Report Preservation Audit

The 2026-05-01 source-site report preservation audit copied 19 LlmWikis archive files into AIWikis raw memory under `raw/system-archives/llmwikis/source-site-report-preservation/2026-05-01/`. The transfer manifest is `raw/system-archives/2026-05-01-source-site-report-preservation.json`.

This audit covers original integration reports, Codex completion and promotion reports, the publication plan, processed inventory, and staged-site drafts from `agent-file-handoff/Archive/2026-05-01/Improvement/llmwikis-integration-promoted/`.

Future agents should use `wiki/sources/source-site-report-preservation-2026-05-01.md`, `wiki/concepts/source-site-report-preservation.md`, and the Intake Outcome Ledger before opening raw reports. The raw files are provenance, not handbook truth until LLMWikis promotes reviewed facts through its normal content path.

## Recent Agentic Work Sweep

The 2026-05-03 recent-work sweep copied 25 LlmWikis current-source and processed-input files into AIWikis raw memory under `raw/system-archives/llmwikis/recent-work-sweep/2026-05-03/`. The transfer manifest is `raw/system-archives/2026-05-03-llmwikis-recent-work-sweep.json`.

This sweep covers the May 2-3 LlmWikis outcomes that were not covered by the older May 1 report preservation:

- the processed `Where we go from here LLMWikis.md` roadmap input
- `/guides/llm-wiki-agentic-orchestration/`
- the compact Agentic Orchestration Mode on `/for-ai-agents/`
- starter-bundle `ORCHESTRATION_RUNBOOK.md`, `TASK_PACKET_TEMPLATE.md`, and `SUPPORT_ESCALATION_CHECKLIST.md`
- route, discovery, test, and handoff-state updates around the agentic guidance

Future agents should use `wiki/sources/llmwikis-recent-work-sweep-2026-05-03.md`, `wiki/concepts/llmwikis-agentic-orchestration-support-current-state.md`, and `wiki/logs/llmwikis-recent-work-sweep-2026-05-03.md` before opening the copied source files. The raw files are provenance, not handbook truth, and they go stale if LLMWikis.org later changes.

The dogfood lesson is explicit: preserving old reports is not enough when later source-site work changes implementation files, tests, public routes, discovery files, starter assets, or support-boundary language. Those changes need a current-source digest when AIWikis is expected to retain long memory.

## Starter Bundle Boundary

LLMWikis.org publishes `llm-wiki-starter-bundle.zip` from its own source package process.

AIWikis may consume, observe, reference, and document that bundle. It must not relabel the bundle as an AIWikis-owned download, copy it into AIWikis packages as a local product, or imply LLMWikis.org has endorsed an AIWikis implementation beyond the source evidence.

This distinction matters because package names become authority signals. AIWikis should be a transparent consumer and dogfood site, not an upstream package publisher for the LLMWikis handbook.

## Relationship To UAIX

LLM Wiki and UAI AI Memory overlap but do different jobs.

LLMWikis.org teaches the deeper long-memory knowledge system. UAIX.org defines UAI-1 and the portable AI Memory / Project Handoff exchange layer. AIWikis shows how both can coexist:

- Use compact UAIX AI Memory packets when a receiving human or AI needs fast startup and transfer-of-ownership context.
- Use an LLM Wiki when the project needs deeper source history, contradictions, decisions, archive lineage, and long-lived retrieval.
- Use optional `LLM_WIKI_MEMORY_PLAN.md` guidance only when a project deliberately chooses LLM Wiki long memory.
- Keep UAIX source claims on UAIX.org and LLM Wiki handbook claims on LLMWikis.org.

This relationship is complementary, not hierarchical.

## What AIWikis Preserves

AIWikis preserves LLMWikis memory through several layers:

- `/llmwikis/` gives the source-system entry route.
- `/llmwikis/public/` summarizes public handbook content.
- `/llmwikis/uai-system/` exposes current source-system files used for handoff and memory.
- `/llmwikis/uai-system/files/...` gives individual source-file pages with raw public content, explanation, provenance, update history, and machine-readable metadata.
- `/llmwikis/provenance/` explains where imported evidence came from.
- `/concepts/llm-wiki/`, `/files/`, and `/reports/` support retrieval by concept, file, and evidence type.
- `raw/system-archives/llmwikis/` stores already-dispositioned archive memory after explicit transfer.
- `wiki/` and `config/graph.jsonld` hold AIWikis source-side memory about how the pattern behaved in practice.

This lets AIWikis document LLMWikis.org in detail while still sending authority-sensitive claims back to the handbook.

## Update Triggers

Update this AIWikis memory whenever LLMWikis changes:

- Starter bundle structure, file names, or package publishing policy.
- Trust labels, metadata standard, source policy, governance, or review rules.
- AI-agent rules, ingest, query, lint, index, log, or graph guidance.
- Agentic orchestration guidance, task packets, support escalation, or starter-bundle agent files.
- Security and privacy recommendations.
- LLM Wiki versus AI Memory or LLM Wiki with UAI relationship pages.
- Roadmap language separating launch-ready guidance from planned work.
- Public support boundaries around live benchmarks, automated ingestion, public MCP surfaces, contributor programs, grants, memberships, multilingual expansion, or hosted services.
- Processed Improvement packages when the human expects AIWikis to show the outcome; these need explicit outcome routing before they become AIWikis public memory.
- Processed LlmWikis archive reports when the human expects AIWikis to preserve the original report wording for long-term memory, not only staged summaries.

## Do Not Claim

AIWikis must not claim:

- It is the canonical LLMWikis handbook.
- It owns or republishes `llm-wiki-starter-bundle.zip` as an AIWikis product.
- Its local `raw/`, `wiki/`, and `config/` choices are universal LLMWikis requirements.
- LLMWikis.org defines UAI-1 or replaces UAIX.org.
- Planned LLMWikis features are already live.
- AIWikis archive copies are fresher or more authoritative than current LLMWikis.org pages.

When in doubt, AIWikis should say "LLMWikis.org remains the handbook source" and link back to the source route.

## AI-Agent Use

When an AI agent needs LLM Wiki truth, it should:

1. Start at LLMWikis.org for handbook claims.
2. Use AIWikis to inspect applied source-file evidence, provenance, archive outcomes, and implementation lessons.
3. Prefer the smallest route that answers the question.
4. Keep public pages, compact handoff packets, and deep source-side memory separate.
5. Preserve the distinction between optional LLM Wiki long memory and UAIX AI Memory exchange packets.

AIWikis is strongest when it answers: what does the handbook say, where is the source, how did the pattern behave in a real site, what evidence exists, and what remains future or out of scope?

Why This File Exists

This is a public content source file from aiwikis.org. It is shown here because AIWikis.org is demonstrating the real source files that make the UAIX / LLM Wiki memory system work, not only summarizing those systems after the fact.

Role

This file is a focused source unit. Its path, headings, and metadata give an agent a retrieval handle that is smaller than loading the entire site or repository.

Structure

The file is structured around these visible headings: LLMWikis.org Source Memory Guide; What LLMWikis Owns; Public Handbook Shape; LLM Wiki Operating Pattern; Hot And Cold Memory Lesson; Source-Site Outcome Routing; Report Preservation Audit; Recent Agentic Work Sweep. Those headings are retrieval anchors: a crawler or LLM can decide whether the file is relevant before reading every line.

Prompt-Size And Retrieval Benefit

Keeping this material in a separate file reduces prompt pressure because an agent can load this exact unit only when its role, source site, category, or hash is relevant. The surrounding index pages point to it, while this page preserves the full content for audit and exact recall.

How To Use It

  • Humans should read the metadata first, then inspect the raw content when they need exact wording or provenance.
  • LLMs and agents should use the source site, category, hash, headings, and related files to decide whether this file belongs in the active prompt.
  • Crawlers should treat the AIWikis page as transparent evidence and follow the source URL/source reference for authority boundaries.
  • Future maintainers should regenerate this page whenever the source hash changes, then review the explanation if the role or structure changed.

Update Requirements

When this source file changes, update the raw source layer, normalized source layer, hash history, this rendered page, generated explanation, source-file inventory, changed-files report, and any source-section index that links to it.
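The hash-driven update trigger can be sketched as follows. This is a sketch under assumptions: the `sha256:` prefix matches the content-hash format shown on this page, and `record_hash` mirrors the idea of appending to `data/hashes/source-file-history.jsonl`, but the real pipeline's field names and layout may differ.

```python
import hashlib
import json
import pathlib

def file_hash(path: pathlib.Path) -> str:
    """Hash a source file in the 'sha256:<hex>' format the metadata records."""
    return "sha256:" + hashlib.sha256(path.read_bytes()).hexdigest()

def needs_update(source: pathlib.Path, recorded_hash: str) -> bool:
    """True when the source file no longer matches its recorded content hash."""
    return file_hash(source) != recorded_hash

def record_hash(history: pathlib.Path, source_reference: str, digest: str) -> None:
    """Append one hash observation to a JSONL history file (hypothetical record shape)."""
    with history.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps({"source": source_reference, "hash": digest}) + "\n")
```

When `needs_update` returns true, the maintainer regenerates the rendered page and its explanation, then re-records the new hash so the history preserves the full lineage.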

Related Pages

Provenance And History

  • Current observation: 2026-05-03T02:48:13.1276041Z
  • Source origin: current-source-workspace
  • Retrieval method: local-source-workspace
  • Duplicate group: sfg-397 (primary)
  • Historical hash records are stored in data/hashes/source-file-history.jsonl.

Machine-Readable Metadata

{
    "title":  "LLMWikis.org Source Memory Guide",
    "source_site":  "aiwikis.org",
    "source_url":  "https://aiwikis.org/",
    "canonical_url":  "https://aiwikis.org/files/aiwikis/content-pages-019-llmwikis-source-memory-guide-md-d298e5ef/",
    "source_reference":  "content/pages/019-llmwikis-source-memory-guide.md",
    "file_type":  "md",
    "content_category":  "public-content",
    "content_hash":  "sha256:d298e5ef07ccd106613695877e896c087077f8392f0b14b2990ca213dda2c8b8",
    "last_fetched":  "2026-05-03T02:48:13.1276041Z",
    "last_changed":  "2026-05-03T02:34:56.3075156Z",
    "import_status":  "changed",
    "duplicate_group_id":  "sfg-397",
    "duplicate_role":  "primary",
    "related_files":  [],
    "generated_explanation":  true,
    "explanation_last_generated":  "2026-05-03T02:48:13.1276041Z"
}