aiWikis.org

Operations

Metadata

Source site: aiwikis.org
Source URL: https://aiwikis.org/
Canonical AIWikis URL: https://aiwikis.org/files/aiwikis/uai-operations-uai-2d14c83d/
Source reference: .uai/operations.uai
File type: uai
Content category: uai-system
Last fetched: 2026-05-03T02:48:13.1276041Z
Last changed: 2026-05-03T02:37:49.3535159Z
Content hash: sha256:2d14c83d530e4395f7c62a311129b37df7897c97c69fd4843e902fbad5a7e54f
Import status: changed
Raw source layer: data/sources/aiwikis/uai-operations-uai-2d14c83d530e.uai
Normalized source layer: data/normalized/aiwikis/uai-operations-uai-2d14c83d530e.txt

Current File Content

Structure Preview

  • Operations
  • Routine Intake
  • Cross-Site Archive Pull
  • Source-Site Outcome Routing
  • Deep Cognitive Archive Work
  • Intake Outcome Ledger Work
  • Roadmap Work
  • Source Memory Guide Work
  • Cross-Site Atlas And Playbook Work
  • Coverage And Claim Boundary Work
  • Source Refresh
  • Recovered Content Reprocessing
  • Transparent Source File Sync
  • Wiki Lint
  • Project Handoff Export
  • Package Build
  • Deployment Boundary

Raw Version

---
uai: "1.0"
type: operations
project: "AIWikis.org"
updated: "2026-05-01"
status: active
---

# Operations

## Routine Intake

Run:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File scripts/Invoke-UaiFileIntake.ps1
```

Review every `needs-agent-review` file before starting unrelated broad work. Move already-dispositioned source files to `agent-file-handoff/Archive/`.

## Cross-Site Archive Pull

When the human asks AIWikis to absorb processed UAIX.org, LLMWikis.org, or Protocol5.com archive material:

1. Confirm the source archive path resolves under the source site's `agent-file-handoff/Archive/`.
2. Copy source-site archive files into `raw/system-archives/{site}/agent-file-handoff/Archive/` or a dated preservation folder without overwriting existing files.
3. Write or update a transfer manifest under `raw/system-archives/` with source paths, destination paths, file counts, and checksums.
4. Compile public-safe summaries into `wiki/`, `content/pages/`, lessons, recommendation adjustments, and `config/graph.jsonld`.
5. Add or update an Intake Outcome Ledger entry that names what was read, what changed, what was deferred or blocked, where the raw file lives, and where the reviewed documentation can be viewed.
6. Refresh AIWikis intake so active buckets return to 0 files.
7. Leave source-site archive copies in place unless the human explicitly asks for source-side cleanup after transfer evidence and discoverability are verified.
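The repo's actual tooling is PowerShell; as a hedged illustration only, steps 2 and 3 (copy without overwriting, then record a checksummed transfer manifest) might be sketched in Python like this. The manifest shape and function names are assumptions, not the real script's conventions.

```python
import hashlib
import json
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the sha256 hex digest of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def pull_archive(src_dir: Path, dest_dir: Path, manifest_path: Path) -> dict:
    """Copy files from src_dir into dest_dir, skipping names that already
    exist, and write a transfer manifest with paths, counts, and checksums."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    entries = []
    for src in sorted(src_dir.rglob("*")):
        if not src.is_file():
            continue
        dest = dest_dir / src.relative_to(src_dir)
        if dest.exists():          # never overwrite existing archive files
            continue
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)
        entries.append({
            "source": str(src),
            "destination": str(dest),
            "sha256": sha256_of(dest),
        })
    manifest = {"file_count": len(entries), "files": entries}
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest
```

Running the same pull twice copies nothing the second time, which is the non-overwriting property step 2 asks for.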

This is a source-side memory operation. It does not make AIWikis the canonical source for UAIX or LlmWikis public claims.

## Source-Site Outcome Routing

When UAIX.org, LLMWikis.org, Protocol5.com, or another source repository processes intake and the human expects AIWikis visibility:

1. Confirm the source-site result: active report, staged draft, reviewed edit, archive transfer, package output, or deployment claim.
2. Preserve the boundary between source-site staged work and AIWikis public truth.
3. Check whether original processed source files have a raw AIWikis manifest before relying on staged artifacts, hot-context summaries, or public-safe summaries.
4. Check whether current implementation, discovery, starter, test, package, route, or support-boundary files changed after the original reports were preserved. If so, create a current-source digest with copied files and checksums before calling AIWikis memory complete.
5. Update `content/pages/016-intake-outcome-ledger.md` or `content/pages/031-source-site-outcome-routing.md` with a public-safe outcome.
6. Update the relevant source guide, Cross-Site Memory Atlas, Source Memory Operations Playbook, Memory Coverage Matrix, Claim Boundary Register, Roadmap, lessons, recommendation adjustments, changelog, `.uai` state, and wiki memory when durable operating truth changed.
7. Copy staged, archived, or current-source artifacts into `raw/system-archives/` only when explicitly accepted for AIWikis long memory.
8. Run source sync, wiki lint, handoff export, smoke, and package checks according to the files changed.
9. If AIWikis did not change, record that fact in the completion report so the gap is visible.

This routing step is required for source-site staged outcomes. It does not create automatic publication, automatic repository writes, automatic LLM Wiki sync, hosted validation, public MCP, SDK, CLI, certification, endorsement, deployment, or source-site authority.

## Deep Cognitive Archive Work

When an intake file changes the reasoning model rather than just one public page:

1. Preserve the original in `raw/system-archives/` or `raw/`.
2. Move the active intake file to `agent-file-handoff/Archive/` after disposition.
3. Add or update a `wiki/sources/` proxy for source summary and support boundaries.
4. Add or update a `wiki/concepts/` node for stable conclusions, rejected alternatives, contradictions, and modelling insufficiency.
5. Update `wiki/index.md`, `config/graph.jsonld`, public-safe `content/pages/` summaries, lessons, recommendation adjustments, and changelog.
6. Promote only compact active truth into `.uai` files; do not let `.uai` become the full archive.

## Intake Outcome Ledger Work

When a dropped file is processed into long-term memory:

1. Confirm the active source file has an explicit disposition.
2. Preserve raw transfer evidence and checksums.
3. Add the public-safe result to `content/pages/016-intake-outcome-ledger.md` or a reviewed successor page.
4. Add or update source/concept/log wiki nodes for durable retrieval.
5. Keep automatic sync, hosted import, certification, endorsement, and source-site authority claims out of current copy unless source evidence exists.

## Roadmap Work

When roadmap truth changes:

1. Update `content/pages/017-roadmap.md`.
2. Align `.uai/progress.uai`, `.uai/decisions.uai`, lessons, recommendation adjustments, and `docs/content-map.md`.
3. Keep current, next, later, and blocked claims separate.
4. Preserve UAIX and LLMWikis source authority boundaries.
5. Run targeted smoke checks so the page remains packaged and discoverable.

## Source Memory Guide Work

When UAIX.org, LLMWikis.org, or Protocol5.com source-system truth changes:

1. Update `content/pages/018-uaix-source-memory-guide.md`, `content/pages/019-llmwikis-source-memory-guide.md`, or `content/pages/030-protocol5-source-memory-guide.md`.
2. Preserve source authority: UAIX.org stays canonical for UAI-1 and AI Memory; LLMWikis.org stays canonical for LLM Wiki handbook guidance; Protocol5.com stays the source for Protocol5 parent-brand, Mathematics, UAI .NET Hub, route-contract, and package-distribution claims.
3. Align home, Start Here, Roadmap, docs, `.uai/progress.uai`, `.uai/decisions.uai`, lessons, and recommendation adjustments when the guide changes durable memory.
4. Run `tools/sync-source-files.ps1 -DryRun`, then `tools/sync-source-files.ps1` when source-file evidence should be refreshed.
5. Run `tools/smoke-test.ps1` so the source memory guides remain package-visible and boundary-preserving.

## Cross-Site Atlas And Playbook Work

When cross-site operating truth changes:

1. Update `content/pages/027-cross-site-memory-atlas.md` for authority routing, artifact ownership, evidence flow, memory layers, and claim-routing changes.
2. Update `content/pages/028-source-memory-operations-playbook.md` for intake, source sync, archive consolidation, handoff export, packaging, or stop-condition changes.
3. Align Home, Start Here, Roadmap, lessons, recommendation adjustments, docs, `.uai/progress.uai`, and `.uai/decisions.uai`.
4. Keep UAIX.org, LLMWikis.org, and Protocol5.com source authority boundaries explicit.
5. Run `tools/smoke-test.ps1`, and run `tools/package.ps1` when the change should refresh deployable artifacts.

## Coverage And Claim Boundary Work

When memory coverage or support-claim state changes:

1. Update `content/pages/029-memory-coverage-matrix.md` for covered, partially covered, source-only, and blocked domains.
2. Update `content/pages/029-claim-boundary-register.md` for safe, source-routed, dogfood, planned, and blocked claims.
3. Align Roadmap, Source Memory Operations Playbook, `.uai/progress.uai`, `.uai/decisions.uai`, lessons, recommendation adjustments, and docs.
4. Preserve the rule that coverage does not equal source authority.
5. Run `tools/smoke-test.ps1`, and run `tools/package.ps1` when package-visible copy or release evidence should refresh.

## Source Refresh

Run:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/fetch-sources.ps1
```

Public source fetch failures should be recorded in `sources/SOURCE_MANIFEST.json` instead of breaking the whole build.
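The fail-soft rule above can be sketched as follows. This is an illustrative Python sketch, not the `tools/fetch-sources.ps1` implementation, and the manifest field names are assumptions: a failed fetch becomes a recorded entry rather than an exception that aborts the build.

```python
import json
from pathlib import Path

def fetch_all(urls, fetch, manifest_path: Path):
    """Fetch each URL with the supplied fetch callable; on failure,
    record the error in the manifest and continue with the next source."""
    results = []
    for url in urls:
        try:
            body = fetch(url)
            results.append({"url": url, "status": "ok", "bytes": len(body)})
        except Exception as exc:   # record the failure, do not break the build
            results.append({"url": url, "status": "failed", "error": str(exc)})
    manifest_path.write_text(json.dumps({"sources": results}, indent=2))
    return results
```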

## Recovered Content Reprocessing

Run a dry run first:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/reprocess-content.ps1 -DryRun
```

Then run the import when the plan is acceptable:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/reprocess-content.ps1
```

The script preserves raw recovered inputs under `sources/recovered/raw/`, generates compact public summary pages only for recovered groups, refreshes recovery reports under `reports/`, and avoids overwriting hand-authored pages unless they carry the generator marker.
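The generator-marker guard described above can be sketched like this. The marker string and function name are hypothetical assumptions for illustration; the real script's marker may differ. The point is the invariant: a page is only replaced when it is absent or carries the marker.

```python
from pathlib import Path

# Assumed marker; the actual generator marker used by the tooling may differ.
GENERATOR_MARKER = "<!-- generated: reprocess-content -->"

def safe_write(page: Path, new_text: str) -> bool:
    """Write new_text only if the page is absent or machine-generated.
    Returns False (and leaves the file untouched) for hand-authored pages."""
    if page.exists() and GENERATOR_MARKER not in page.read_text():
        return False
    page.write_text(new_text)
    return True
```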

## Transparent Source File Sync

Run a dry run first:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/sync-source-files.ps1 -DryRun
```

Then run the import when the plan is acceptable:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/sync-source-files.ps1
```

The script scans the current UAIX.org, LLMWikis.org, Protocol5.com, and AIWikis.org source workspaces plus source-package mirrors, then:

1. Preserves exact source-side files under `data/sources/`.
2. Writes normalized comparison text under `data/normalized/`.
3. Appends observed hashes to `data/hashes/source-file-history.jsonl`.
4. Generates focused file pages under `/uaix/uai-system/files/...`, `/llmwikis/uai-system/files/...`, `/protocol5/uai-system/files/...`, and `/files/...`.
5. Refreshes source-file reports under `reports/`.

After UAIX AI Memory Wizard changes, confirm the refreshed UAIX pages still expose generated system profile, receiver brief, startup packet, optional LLM Wiki memory-plan, and protocol-output language without turning AIWikis into the authority. After Protocol5 changes, confirm the refreshed pages preserve route contracts, package artifacts, .NET implementation boundaries, and the rule that UAIX.org remains normative for UAI-1.
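The append-only hash history can be sketched as below. This is an illustrative Python sketch with assumed field names, not the sync script itself; the design point is that each observation is one JSON line appended to the file, so earlier records are never rewritten.

```python
import hashlib
import json
from pathlib import Path

def record_observation(history: Path, source_ref: str,
                       content: bytes, observed_at: str) -> None:
    """Append one observed-hash record to the JSONL history file."""
    record = {
        "source_reference": source_ref,
        "content_hash": "sha256:" + hashlib.sha256(content).hexdigest(),
        "observed_at": observed_at,
    }
    with history.open("a") as fh:   # append-only: prior history is preserved
        fh.write(json.dumps(record) + "\n")
```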

Public generated pages redact local absolute Windows paths while preserving unredacted source-side hashes and raw files outside public HTML.
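The redaction rule can be sketched as follows. The regex and replacement token are illustrative assumptions, not the script's actual behavior: drive-letter absolute paths are dropped from public output while hash strings pass through untouched.

```python
import re

# Matches Windows drive-letter absolute paths such as C:\Users\dev\repo\file.uai
WINDOWS_PATH = re.compile(r"[A-Za-z]:\\[^\s\"'<>|]*")

def redact_paths(text: str) -> str:
    """Replace local absolute Windows paths with a redaction token."""
    return WINDOWS_PATH.sub("[redacted-local-path]", text)
```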

## Wiki Lint

Run:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/wiki-lint.ps1
```

This checks the source-side `wiki/` layer for required frontmatter, hard and soft line caps, broken wikilinks, orphan pages, and parseable `config/graph.jsonld`.
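Two of those checks, broken wikilinks and orphan pages, can be sketched in Python as a hedged illustration (the real linter is PowerShell, and the `[[target]]` / `[[target|label]]` syntax is assumed from common wiki conventions): a link is broken when its target page does not exist, and a page is an orphan when nothing links to it.

```python
import re

# Matches [[target]] and [[target|display label]] wikilinks.
LINK = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]*)?\]\]")

def lint(pages: dict) -> tuple:
    """pages maps page name -> body text.
    Returns (broken_links, orphans): links whose target page is missing,
    and pages that no other page links to."""
    linked = set()
    broken = []
    for name, body in pages.items():
        for target in LINK.findall(body):
            linked.add(target)
            if target not in pages:
                broken.append((name, target))
    orphans = sorted(set(pages) - linked)
    return broken, orphans
```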

## Project Handoff Export

Run:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/generate-handoff-export.ps1
```

This builds `dist/handoff/aiwikis-project-handoff/` and `dist/handoff/aiwikis-project-handoff.zip` from root handoff files, `.uai` state, recommendation docs, active intake scaffolding, and selected reviewed wiki nodes marked `handoff_export: include`. It is local dogfood evidence, not a public hosted generator or certification claim.

## Package Build

Run:

```powershell
powershell -NoProfile -ExecutionPolicy Bypass -File tools/package.ps1
```

This writes ZIP artifacts, `release-manifest.json`, and `SHA256SUMS.txt` to the publish folder.
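A consumer-side check of those artifacts against `SHA256SUMS.txt` might look like the sketch below, assuming the conventional `<hexdigest>  <filename>` line format; this is an illustrative verifier, not part of the repo's tooling.

```python
import hashlib
from pathlib import Path

def verify_sums(sums_file: Path) -> list:
    """Return the names of artifacts whose hash does not match the manifest.
    Artifacts are resolved relative to the manifest's own directory."""
    mismatches = []
    for line in sums_file.read_text().splitlines():
        if not line.strip():
            continue
        expected, name = line.split(maxsplit=1)
        artifact = sums_file.parent / name.strip()
        actual = hashlib.sha256(artifact.read_bytes()).hexdigest()
        if actual != expected:
            mismatches.append(name.strip())
    return mismatches
```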

`aiwikis-core.zip` includes reviewed Markdown pages under `data/pages`. The plugin renders those files directly at public routes, so deployment no longer requires WordPress Page seeding, activation imports, uploaded-plugin repair, or a Tools > AIWikis import step.

Package builds copy the generated local handoff export into release evidence so recommendation implementation can be audited alongside package contents.

## Deployment Boundary

Do not deploy, change DNS, assume WordPress credentials, or overwrite production content from this workspace.

Why This File Exists

This is a UAI AI Memory handoff file from aiwikis.org. It is shown here because AIWikis.org is demonstrating the real source files that make the UAIX / LLM Wiki memory system work, not only summarizing those systems after the fact.

Role

This .uai file is a compact operations packet. It keeps one kind of durable project truth separate from the rest of the archive so an agent can load the topic it needs without pulling the whole project history into prompt context.

Structure

The file is structured around these visible headings: Operations; Routine Intake; Cross-Site Archive Pull; Source-Site Outcome Routing; Deep Cognitive Archive Work; Intake Outcome Ledger Work; Roadmap Work; Source Memory Guide Work. Those headings are retrieval anchors: a crawler or LLM can decide whether the file is relevant before reading every line.

Prompt-Size And Retrieval Benefit

Keeping this material in a separate file reduces prompt pressure because an agent can load this exact unit only when its role, source site, category, or hash is relevant. The surrounding index pages point to it, while this page preserves the full content for audit and exact recall.

How To Use It

  • Humans should read the metadata first, then inspect the raw content when they need exact wording or provenance.
  • LLMs and agents should use the source site, category, hash, headings, and related files to decide whether this file belongs in the active prompt.
  • Crawlers should treat the AIWikis page as transparent evidence and follow the source URL/source reference for authority boundaries.
  • Future maintainers should regenerate this page whenever the source hash changes, then review the explanation if the role or structure changed.

Update Requirements

When this source file changes, update the raw source layer, normalized source layer, hash history, this rendered page, generated explanation, source-file inventory, changed-files report, and any source-section index that links to it.

Related Pages

Provenance And History

  • Current observation: 2026-05-03T02:48:13.1276041Z
  • Source origin: current-source-workspace
  • Retrieval method: local-source-workspace
  • Duplicate group: sfg-074 (primary)
  • Historical hash records are stored in data/hashes/source-file-history.jsonl.

Machine-Readable Metadata

{
  "title": "Operations",
  "source_site": "aiwikis.org",
  "source_url": "https://aiwikis.org/",
  "canonical_url": "https://aiwikis.org/files/aiwikis/uai-operations-uai-2d14c83d/",
  "source_reference": ".uai/operations.uai",
  "file_type": "uai",
  "content_category": "uai-system",
  "content_hash": "sha256:2d14c83d530e4395f7c62a311129b37df7897c97c69fd4843e902fbad5a7e54f",
  "last_fetched": "2026-05-03T02:48:13.1276041Z",
  "last_changed": "2026-05-03T02:37:49.3535159Z",
  "import_status": "changed",
  "duplicate_group_id": "sfg-074",
  "duplicate_role": "primary",
  "related_files": [],
  "generated_explanation": true,
  "explanation_last_generated": "2026-05-03T02:48:13.1276041Z"
}