
Source Memory Operations Playbook

This playbook tells future humans and AI agents how to keep AIWikis.org useful as the long-term memory for UAIX.org, LLMWikis.org, and Protocol5.com.

Metadata

| Field | Value |
| --- | --- |
| Source site | aiwikis.org |
| Source URL | https://aiwikis.org/ |
| Canonical AIWikis URL | https://aiwikis.org/files/aiwikis/content-pages-028-source-memory-operations-playbook-md-10e808c3/ |
| Source reference | content/pages/028-source-memory-operations-playbook.md |
| File type | md |
| Content category | public-content |
| Last fetched | 2026-05-03T02:48:13.1276041Z |
| Last changed | 2026-05-03T02:40:08.5194341Z |
| Content hash | sha256:10e808c31ea6bc83549601d840ae4d9d7c8005af23890400170ce2b450b02bc7 |
| Import status | changed |
| Raw source layer | data/sources/aiwikis/content-pages-028-source-memory-operations-playbook-md-10e808c31ea6.md |
| Normalized source layer | data/normalized/aiwikis/content-pages-028-source-memory-operations-playbook-md-10e808c31ea6.txt |

Current File Content

Structure Preview

  • Source Memory Operations Playbook
  • Start Of Work
  • Source Change Workflow
  • Source-Site Outcome Routing Workflow
  • UAIX-Specific Checks
  • LLMWikis-Specific Checks
  • Protocol5-Specific Checks
  • Archive Consolidation Workflow
  • Public Page Workflow
  • Generated Evidence Workflow
  • Handoff Export Workflow
  • Package Workflow
  • Definition Of Done
  • Stop And Ask

Raw Version

Local absolute paths are redacted in this public view. The source hash and source-side raw layer are based on the unredacted source file.

---
title: "Source Memory Operations Playbook"
slug: "source-memory-operations-playbook"
status: "working-draft"
trust_level: "reviewed"
source_status: "AIWikis operating playbook for maintaining cross-site source memory"
audience: "humans, ai-agents, maintainers"
---

# Source Memory Operations Playbook

This playbook tells future humans and AI agents how to keep AIWikis.org useful as the long-term memory for UAIX.org, LLMWikis.org, and Protocol5.com.

The goal is not to collect more files. The goal is to preserve source truth, explain it clearly, keep provenance visible, reduce prompt load, and prevent unsupported claims from leaking into public pages or packages.

## Start Of Work

Every broad AIWikis work session should begin with:

1. Read `AGENTS.md`.
2. Read `readme.human`.
3. Run `scripts/Invoke-UaiFileIntake.ps1`.
4. Load the `.uai` files listed in `AGENTS.md`.
5. Inspect every `needs-agent-review` file before starting unrelated broad work.
6. Name the source authority for the task before editing.
7. Decide which public routes, source-side records, generated reports, and package checks need to change.

If a task touches UAIX, LLMWikis, or Protocol5 claims, read the relevant source memory guide before editing public copy.
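
The session start above can be sketched as a short command sequence. The intake script path and invocation style come from this playbook; the `needs-agent-review` search pattern is an assumption about how those files are flagged, not a documented convention.

```powershell
# Hedged sketch of steps 3 and 5: run intake, then surface files flagged for review.
# The review-flag search pattern is an assumption.
powershell -NoProfile -ExecutionPolicy Bypass -File scripts/Invoke-UaiFileIntake.ps1
Get-ChildItem -Recurse -File |
    Select-String -Pattern 'needs-agent-review' -List |
    ForEach-Object { $_.Path }
```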

## Source Change Workflow

Use this workflow when UAIX.org, LLMWikis.org, or Protocol5.com changes.

1. Identify the source-system owner.
2. Read the current source route or local source workspace.
3. Decide whether the change affects public claims, package ownership, source-file evidence, archive memory, roadmap state, or support boundaries.
4. Update the matching source memory guide if the change affects durable interpretation.
5. Run `tools/sync-source-files.ps1 -DryRun`.
6. Run `tools/sync-source-files.ps1` when the dry run is acceptable.
7. Inspect `reports/source-file-inventory.md`, `reports/changed-files.md`, `reports/generated-explanations.md`, and `reports/final-content-map.md`.
8. Update `.uai/progress.uai`, `.uai/decisions.uai`, docs, lessons, recommendations, and roadmap when durable memory changes.
9. Run `tools/smoke-test.ps1`.
10. Package only when the work should refresh deployable artifacts or release evidence.
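
Steps 5, 6, and 9 can be sketched as commands. All script paths come from the workflow itself; the invocation style mirrors the fenced examples later in this playbook.

```powershell
# Dry run first, inspect the reports it writes, then apply and smoke-test.
powershell -NoProfile -ExecutionPolicy Bypass -File tools/sync-source-files.ps1 -DryRun
Get-Content reports/changed-files.md   # confirm the diff is expected before the real run
powershell -NoProfile -ExecutionPolicy Bypass -File tools/sync-source-files.ps1
powershell -NoProfile -ExecutionPolicy Bypass -File tools/smoke-test.ps1
```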

## Source-Site Outcome Routing Workflow

Use this workflow when a source repository processes intake and the human expects AIWikis.org to visibly change.

1. Identify the source-site result: active report, staged draft, reviewed edit, archived record, package output, or deployment claim.
2. Confirm whether the result is source-site truth, candidate guidance, long-memory evidence, or blocked speculation.
3. Compare the original processed source files against existing AIWikis raw-memory manifests. If the human expects final long memory, preserve originals, not only staged summaries.
4. Compare current implementation, test, discovery, starter, route, package, and support-boundary files against the older manifests. If newer source-state work changed, create a current-source digest with copied files and checksums.
5. Update the Intake Outcome Ledger or Source-Site Outcome Routing (source-relative: ./031-source-site-outcome-routing.md) so the public-safe answer says what happened.
6. Update source memory guides, the atlas, this playbook, coverage, claim boundaries, roadmap, lessons, recommendation adjustments, changelog, `.uai` records, and wiki memory when durable operating truth changed.
7. Preserve staged, archived, or current-source artifacts under `raw/system-archives/` only when explicitly accepted for AIWikis long memory.
8. Run source sync when current source evidence or generated file pages should change.
9. Run wiki lint when `raw/`, `wiki/`, or `config/graph.jsonld` changed.
10. Run handoff export when `.uai`, `AGENTS.md`, `readme.human`, recommendation docs, or export-eligible wiki nodes changed.
11. Run smoke checks after public page, generated route, package-visible, or source-memory navigation changes.
12. Do not deploy unless the human explicitly authorizes the production workflow.

If AIWikis does not change, record that answer in the completion report instead of leaving the expectation implicit.
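
Step 4's current-source digest can be sketched as a checksum manifest. The digest folder and manifest paths are hypothetical placeholders; only the "copied files and checksums" requirement comes from the workflow above.

```powershell
# Hedged sketch of step 4: checksum every file in a current-source digest folder.
# 'digests/uaix-current' and the manifest path are hypothetical.
Get-ChildItem 'digests/uaix-current' -Recurse -File |
    ForEach-Object {
        $h = (Get-FileHash $_.FullName -Algorithm SHA256).Hash.ToLower()
        "sha256:$h  $($_.FullName)"
    } |
    Set-Content 'digests/uaix-current.checksums.txt'
```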

## UAIX-Specific Checks

For UAIX changes, verify:

- UAIX.org remains canonical for UAI-1, AI Memory, Project Handoff, Agent File Handoff, validator, conformance, governance, roadmap, and support boundaries.
- The AI Memory Package Wizard is described as a populated operating-profile generator when relevant.
- Generated wizard outputs are named accurately: system profile, receiver brief, startup packet, overlay JSON, package model JSON, optional LLM Wiki memory plan, and starter ZIP links.
- Current support is separated from hosted upload/import validation, automatic repository writes, automatic LLM Wiki sync, SDK, CLI, certification, endorsement, and deployment claims.
- `uai1-project-handoff.zip` stays UAIX-owned even when mirrored for AIWikis dogfood install.

## LLMWikis-Specific Checks

For LLMWikis changes, verify:

- LLMWikis.org remains the handbook source for LLM Wiki structure, starter templates, metadata, trust labels, governance, source policy, and agent rules.
- `llm-wiki-starter-bundle.zip` stays LLMWikis-owned.
- Local AIWikis `raw/`, `wiki/`, and `config/` choices are framed as dogfood implementation, not universal requirements.
- Launch-ready pages are separated from planned live benchmarks, automated ingestion, public MCP surfaces, contributor programs, grants, memberships, multilingual expansion, or hosted services.
- The relationship to UAIX is described as complementary: LLM Wiki for deeper memory, UAI AI Memory for portable exchange and handoff packets.
- Staged LLMWikis Improvement outputs are described as candidate guidance until promoted by LLMWikis review.

## Protocol5-Specific Checks

For Protocol5 changes, verify:

- Protocol5.com remains the parent public brand for Mathematics and the UAI .NET Hub.
- Protocol5.com owns implementation package distribution, NuGet artifacts, starter ZIPs, route contracts, compatibility mirrors, and ASP.NET support notes.
- UAIX.org remains normative for UAI standards, schemas, registry, validator behavior, governance, roadmap, changelog, and conformance.
- Protocol5 package mirrors are labeled as implementation support or compatibility views, not standards authority.
- Prime and Fibonacci generated route trees stay out of AIWikis source sync unless a human explicitly asks for compatibility analysis.
- Protocol5 package existence is not described as certification, endorsement, conformance approval, or production deployment.

## Archive Consolidation Workflow

Use this only for already-dispositioned archive material or explicit human requests.

1. Confirm the source file is in a source-site archive or has a clear disposition.
2. Copy it into `raw/system-archives/{site}/` without overwriting existing records.
3. Record transfer evidence and checksums.
4. Add source-side wiki nodes when the material contains durable lessons, decisions, contradictions, or roadmap implications.
5. Publish only public-safe summaries.
6. Update the Intake Outcome Ledger.
7. Refresh active intake so the index returns to zero pending files.
8. Leave the source archive in place unless a human explicitly asks for source-side cleanup after transfer evidence and discoverability are verified.

Do not publish raw archive content directly as public pages unless it has been reviewed and rewritten as public-safe guidance.
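
Steps 2 and 3 can be sketched as follows. The source and destination paths are hypothetical placeholders; the no-overwrite rule and checksum recording follow the workflow above.

```powershell
# Hedged sketch: copy into raw/system-archives/{site}/ without overwriting,
# then record a SHA-256 checksum as transfer evidence. Paths are hypothetical.
$src  = 'incoming/uaix/archived-record.md'
$dest = 'raw/system-archives/uaix/archived-record.md'
if (Test-Path $dest) { throw "Refusing to overwrite existing archive record: $dest" }
Copy-Item -Path $src -Destination $dest
$hash = (Get-FileHash -Path $dest -Algorithm SHA256).Hash
"sha256:$($hash.ToLower())  $dest"   # evidence line to record alongside the transfer
```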

## Public Page Workflow

For reviewed public pages:

1. Keep pages focused enough that an agent can load them without wasting context.
2. Put source authority near the top.
3. Use tables for route maps, artifact ownership, and claim boundaries.
4. Link to generated file pages or indexes when exact evidence matters.
5. Link to source sites when authority matters.
6. Put "Do Not Claim" or blocked-support language near high-risk topics.
7. Update Home, Start Here, Roadmap, docs, and `.uai` state when adding a first-class page.

The page should answer one durable retrieval question. If it answers five unrelated questions, split it.

## Generated Evidence Workflow

Transparent source-file sync is how AIWikis proves the memory layer is current.

Run:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/sync-source-files.ps1 -DryRun
powershell -NoProfile -ExecutionPolicy Bypass -File tools/sync-source-files.ps1
```

Then inspect:

- `reports/source-file-inventory.md`
- `reports/changed-files.md`
- `reports/generated-explanations.md`
- `reports/duplicate-resolution.md`
- `reports/broken-links.md`
- `reports/final-content-map.md`
- `data/hashes/source-file-history.jsonl`
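
The last item above is a JSON Lines file; a hedged sketch of inspecting its most recent entries (the path comes from this playbook, the record shape is whatever the sync tool writes):

```powershell
# View the newest hash-history records as objects.
Get-Content data/hashes/source-file-history.jsonl -Tail 3 |
    ForEach-Object { $_ | ConvertFrom-Json }
```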

Generated pages may expose current source-file content and metadata, but they must redact local absolute Windows paths from public HTML.
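
The redaction rule above can be sketched as a filter. The drive-letter regex and the replacement token are assumptions; the playbook only states that local absolute Windows paths must not reach public HTML.

```powershell
# Hedged sketch: strip local absolute Windows paths (e.g. C:\Users\...) from rendered HTML.
# The regex, replacement token, and page path are assumptions.
$page     = 'public/files/example-page.html'
$html     = Get-Content $page -Raw
$redacted = $html -replace '[A-Za-z]:\\[^\s"<>]*', '[redacted-local-path]'
Set-Content $page -Value $redacted
```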

## Handoff Export Workflow

When `.uai`, `AGENTS.md`, `readme.human`, recommendation docs, or export-eligible wiki nodes change, run:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/generate-handoff-export.ps1
```

The export is local dogfood evidence. It does not become UAIX certification, hosted generation support, public validation, SDK, CLI, endorsement, or UAI-1 conformance.

## Package Workflow

Run package builds when deployable artifacts or release evidence should change.

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/package.ps1
```

Package checks should confirm:

- Theme ZIP root is `aiwikis-authority`.
- Plugin ZIP root is `aiwikis-core`.
- Reviewed public Markdown pages are included under `aiwikis-core/data/pages`.
- Obsolete `aiwikis-starter-pack*.zip` files are absent.
- Source-package ownership labels remain intact.
- No secrets, local private paths, raw private material, `.git`, `node_modules`, `.env`, cache folders, or build junk enter ZIPs.
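
The first check can be sketched with .NET's ZIP API. The ZIP filename is hypothetical; the expected root folder name comes from the list above, and the same pattern applies to the plugin ZIP.

```powershell
# Hedged sketch: assert a ZIP has the single root folder 'aiwikis-authority'.
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip   = [System.IO.Compression.ZipFile]::OpenRead('dist/aiwikis-authority-theme.zip')
$roots = $zip.Entries.FullName |
    ForEach-Object { ($_ -split '/')[0] } |
    Sort-Object -Unique
$zip.Dispose()
if (($roots -join ',') -ne 'aiwikis-authority') {
    throw "Unexpected ZIP root(s): $($roots -join ', ')"
}
```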

## Definition Of Done

A deeper AIWikis memory change is done when:

- The right source authority is named.
- Public pages are updated or deliberately left alone with a reason.
- Generated source-file evidence is refreshed when current files changed.
- `.uai` files capture compact durable truth.
- Lessons, recommendation adjustments, changelog, and roadmap are aligned when policy or workflow changed.
- Smoke tests pass.
- Package artifacts are rebuilt when deployment-ready content changed.
- Any skipped check has a clear reason.
- Unsupported claims are absent.

## Stop And Ask

Stop for human review before:

- Deploying to live AIWikis.org.
- Changing DNS.
- Assuming WordPress admin credentials.
- Publishing raw private or archive material.
- Changing legal, privacy, security, production, or source-governance policy.
- Claiming certification, endorsement, hosted validation, automatic repository writes, automatic LLM Wiki sync, SDK, CLI, or live source-site authority.

AIWikis is allowed to be ambitious in memory quality. It is not allowed to be casual with authority.

Why This File Exists

This is a public content source file from aiwikis.org. It is shown here because AIWikis.org is demonstrating the real source files that make the UAIX / LLM Wiki memory system work, not only summarizing those systems after the fact.

Role

This file is a focused source unit. Its path, headings, and metadata give an agent a retrieval handle that is smaller than loading the entire site or repository.

Structure

The file is structured around these visible headings: Source Memory Operations Playbook; Start Of Work; Source Change Workflow; Source-Site Outcome Routing Workflow; UAIX-Specific Checks; LLMWikis-Specific Checks; Protocol5-Specific Checks; Archive Consolidation Workflow. Those headings are retrieval anchors: a crawler or LLM can decide whether the file is relevant before reading every line.

Prompt-Size And Retrieval Benefit

Keeping this material in a separate file reduces prompt pressure because an agent can load this exact unit only when its role, source site, category, or hash is relevant. The surrounding index pages point to it, while this page preserves the full content for audit and exact recall.

How To Use It

  • Humans should read the metadata first, then inspect the raw content when they need exact wording or provenance.
  • LLMs and agents should use the source site, category, hash, headings, and related files to decide whether this file belongs in the active prompt.
  • Crawlers should treat the AIWikis page as transparent evidence and follow the source URL/source reference for authority boundaries.
  • Future maintainers should regenerate this page whenever the source hash changes, then review the explanation if the role or structure changed.

Update Requirements

When this source file changes, update the raw source layer, normalized source layer, hash history, this rendered page, generated explanation, source-file inventory, changed-files report, and any source-section index that links to it.

Related Pages

Provenance And History

  • Current observation: 2026-05-03T02:48:13.1276041Z
  • Source origin: current-source-workspace
  • Retrieval method: local-source-workspace
  • Duplicate group: sfg-034 (primary)
  • Historical hash records are stored in data/hashes/source-file-history.jsonl.

Machine-Readable Metadata

```json
{
    "title": "Source Memory Operations Playbook",
    "source_site": "aiwikis.org",
    "source_url": "https://aiwikis.org/",
    "canonical_url": "https://aiwikis.org/files/aiwikis/content-pages-028-source-memory-operations-playbook-md-10e808c3/",
    "source_reference": "content/pages/028-source-memory-operations-playbook.md",
    "file_type": "md",
    "content_category": "public-content",
    "content_hash": "sha256:10e808c31ea6bc83549601d840ae4d9d7c8005af23890400170ce2b450b02bc7",
    "last_fetched": "2026-05-03T02:48:13.1276041Z",
    "last_changed": "2026-05-03T02:40:08.5194341Z",
    "import_status": "changed",
    "duplicate_group_id": "sfg-034",
    "duplicate_role": "primary",
    "related_files": [],
    "generated_explanation": true,
    "explanation_last_generated": "2026-05-03T02:48:13.1276041Z"
}
```