AIWikis.org Human Briefing
AIWikis.org is a WordPress publication project for dogfooding LLM Wiki, UAIX, source-governance, source-package, and agent-handoff guidance.
Metadata
| Field | Value |
|---|---|
| Source site | aiwikis.org |
| Source URL | https://aiwikis.org/ |
| Canonical AIWikis URL | https://aiwikis.org/files/aiwikis/readme-human-9264b0cb/ |
| Source reference | readme.human |
| File type | human |
| Content category | uai-system |
| Last fetched | 2026-05-03T02:48:13.1276041Z |
| Last changed | 2026-05-03T02:37:02.4268898Z |
| Content hash | sha256:9264b0cb6ffbfbc5362403bba51ccb345e9b9ad46605e3e7def734fa4d67d9c0 |
| Import status | changed |
| Raw source layer | data/sources/aiwikis/readme-human-9264b0cb6ffb.human |
| Normalized source layer | data/normalized/aiwikis/readme-human-9264b0cb6ffb.txt |
Current File Content
Structure Preview
- AIWikis.org Human Briefing
- What The AI Should Protect
- What Humans Need To Review
- How The AI Reads This Project
- Build Outputs
Raw Version
Local absolute paths are redacted in this public view. The source hash and source-side raw layer are based on the unredacted source file.
# AIWikis.org Human Briefing
AIWikis.org is a WordPress publication project for dogfooding LLM Wiki, UAIX, source-governance, source-package, and agent-handoff guidance.
The first packaging passes exposed two useful dogfood findings: package checks must inspect actual files such as `agent-file-handoff/` and `.uai` records, and consumer sites must not relabel source-published packages as local products. AIWikis consumes the LlmWikis `llm-wiki-starter-bundle.zip` and the UAIX `uai1-project-handoff.zip`; it should not publish `aiwikis-starter-pack.zip`. For wp-admin install convenience, `AIWikis-Publish` may mirror the UAIX-owned `uai1-project-handoff.zip` as a separate plugin artifact while preserving UAIX publisher labels.
The site should help humans and AI agents understand how to build and operate source-governed AI-readable knowledge systems. It must attribute and link back to the source projects it applies:
- LLMWikis.org for LLM Wiki handbook patterns, starter templates, trust labels, metadata, source policy, governance, and agent rules.
- UAIX.org for UAI-1, Project Handoff, AGENTS.md plus `.uai` context patterns, validator/conformance material, implementation boundaries, and public evidence surfaces.
AIWikis now also has a source-side file-native wiki skeleton: `raw/` for immutable source inputs, `wiki/` for compiled Markdown nodes, and `config/` for autonomous wiki rules and graph output. This layer is project source infrastructure, not public upload content.
AIWikis also has a repeatable recovered-content pipeline in `tools/reprocess-content.ps1`. It crawls public source/discovery/package surfaces, scans local processed archives, preserves raw `.md` and `.uai` evidence under `sources/recovered/raw/`, generates compact public summaries and indexes under `content/pages/`, and records the recovery audit trail under `reports/`.
AIWikis also has a transparent current source-file sync in `tools/sync-source-files.ps1`. It generates the `/uaix/`, `/llmwikis/`, and `/protocol5/` route split, individual current source-file pages, source-side raw and normalized layers under `data/`, hash history, generated explanations, and reports. Public generated pages redact local absolute Windows paths, while the source-side hash/history layer still tracks the unredacted current files.
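The redaction split described above, where public generated pages scrub local absolute Windows paths while the source-side layer still hashes the unredacted file, can be sketched in Python. The actual tool is a PowerShell script; the regex, function names, and sample path here are illustrative assumptions, not the script's real implementation.

```python
import hashlib
import re

def sha256_label(text: str) -> str:
    # Hash the unredacted text so the source-side history keeps tracking the real file.
    return "sha256:" + hashlib.sha256(text.encode("utf-8")).hexdigest()

# Hypothetical pattern: any absolute Windows path such as C:\Users\me\project\file.txt.
WINDOWS_PATH = re.compile(r"[A-Za-z]:\\[^\s\"']+")

def redact_local_paths(text: str) -> str:
    # Replace local absolute Windows paths before a page is published.
    return WINDOWS_PATH.sub("[local path redacted]", text)

raw = r"Build output: C:\Users\me\aiwikis\dist\site.zip"
public = redact_local_paths(raw)
source_hash = sha256_label(raw)
```

The key design point is that the hash is computed before redaction, so the public page and the source-side hash history can disagree about paths without breaking change detection.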
For UAIX AI Memory work, that source-file view should expose the current Package Wizard as a populated operating-profile generator: source-authority policy, memory-update timing, evidence ledger, conflict-resolution policy, testing, deployment, review, risk, rollback, receiver brief, startup packet, and optional LLM Wiki memory-plan guidance.
AIWikis is also the long-term memory archive for already-processed UAIX.org, LlmWikis.org, Protocol5.com, and AIWikis archive inputs when those files have been dispositioned or explicitly accepted for long memory. The source sites still process files normally and keep their public authority boundaries. After a file is dispositioned there, AIWikis can copy it into `raw/system-archives/`, leave the source archive in place unless cleanup is explicitly requested after verification, and compile safe summaries into the AIWikis wiki, public Markdown pages, lessons, and recommendation adjustments.
The Intake Outcome Ledger at `content/pages/016-intake-outcome-ledger.md` is the public-safe feedback page for that workflow. After deployment, `/intake-outcome-ledger/` should tell people what happened to recently digested files, which surfaces changed, what was deferred or blocked, and where the reviewed documentation can be viewed.
The Roadmap at `content/pages/017-roadmap.md` is the public planning surface. It should separate current support, next work, later research, and blocked claims so humans and future agents do not have to reconstruct priorities from scattered changelog entries.
The UAIX.org, LLMWikis.org, and Protocol5.com Source Memory Guides at `content/pages/018-uaix-source-memory-guide.md`, `content/pages/019-llmwikis-source-memory-guide.md`, and `content/pages/030-protocol5-source-memory-guide.md` are the human-scale atlas pages for the source systems. They explain what each site owns, how AIWikis preserves current and archived evidence, when memory should be refreshed, and which authority claims are forbidden.
The Cross-Site Memory Atlas and Source Memory Operations Playbook at `content/pages/027-cross-site-memory-atlas.md` and `content/pages/028-source-memory-operations-playbook.md` go one layer deeper. They define claim routing, artifact ownership, evidence flow, source sync, archive consolidation, handoff export, and package-readiness workflows across UAIX.org, LLMWikis.org, Protocol5.com, and AIWikis.org.
Source-Site Outcome Routing at `content/pages/031-source-site-outcome-routing.md` closes a dogfood gap: when a source repository processes intake and creates staged drafts, that does not automatically change AIWikis.org. AIWikis needs an explicit routing pass that says what changed, what stayed staged, what public/source-side surfaces were updated, and what still needs human review or deployment.
The 2026-05-01 source-site report preservation manifest at `raw/system-archives/2026-05-01-source-site-report-preservation.json` closes the next gap: summaries and staged artifacts do not prove that the raw original reports survived. When AIWikis is expected to retain final memory, the source archive files need recorded copy destinations, file counts, and checksums.
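A manifest of that shape can be sketched as follows; the field names and the name-to-path mapping are assumptions for illustration, not the real manifest schema.

```python
import hashlib
import tempfile
from pathlib import Path

def build_preservation_manifest(copied):
    # `copied` maps an archive-relative name to the local copy on disk.
    entries = []
    for name, path in sorted(copied.items()):
        data = path.read_bytes()
        entries.append({
            "file": name,
            "destination": str(path),
            "bytes": len(data),
            "sha256": hashlib.sha256(data).hexdigest(),
        })
    return {"file_count": len(entries), "files": entries}

# Demo with two throwaway files standing in for preserved source reports.
root = Path(tempfile.mkdtemp())
(root / "report-a.md").write_text("alpha")
(root / "state.uai").write_text("beta!")
manifest = build_preservation_manifest({
    "report-a.md": root / "report-a.md",
    "state.uai": root / "state.uai",
})
```

Because each entry carries its own checksum and destination, a later audit can re-hash the copies and prove the raw originals survived without trusting the summaries.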
The 2026-05-03 UAIX recent-work sweep at `raw/system-archives/2026-05-03-uaix-recent-work-sweep.json` closes the follow-up gap for the last couple of days of UAIX work. It preserves the current wizard setup URL, Agent File Handoff, optional LLM Wiki, roadmap, agentic-system, deploy-check, and processed-intake source state with 23 copied files and hashes. AIWikis can remember that work, but UAIX.org remains the current support authority.
The 2026-05-03 LlmWikis recent-work sweep at `raw/system-archives/2026-05-03-llmwikis-recent-work-sweep.json` closes the matching follow-up gap for the last couple of days of LlmWikis work. It preserves the processed roadmap input, agentic orchestration guide, task packet template, support escalation checklist, starter-bundle registry, discovery, tests, and handoff source state with 25 copied files and hashes. AIWikis can remember that work, but LLMWikis.org remains the current handbook authority.
The Memory Coverage Matrix and Claim Boundary Register at `content/pages/029-memory-coverage-matrix.md` and `content/pages/029-claim-boundary-register.md` make the next layer explicit: what AIWikis currently covers, what is only partially covered or source-only, what is blocked, and which phrases require source-site evidence before they can appear in public or package copy.
The Deep Cognitive Archive is the local name for this slower reasoning layer: public pages stay polished, `.uai` files stay compact enough for active handoff, and the source-side wiki/archive layer keeps why, rejected alternatives, contradictions, and lineage.
AIWikis now has a local Project Handoff export loop in `tools/generate-handoff-export.ps1`. It proves, during dogfood checks, that selected reviewed wiki nodes and current `.uai` files can produce a compact handoff packet. This is local release evidence only; it is not UAIX certification, a hosted generator, or public endorsement.
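The packet step of that export loop can be sketched as a small zip build; the real tool is a PowerShell script, and the archive paths and contents below are hypothetical, not the UAIX packet format.

```python
import io
import zipfile

def build_handoff_packet(selected):
    # `selected` maps an archive path to its text content; the names are illustrative.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, text in sorted(selected.items()):
            zf.writestr(name, text)
    return buf.getvalue()

packet = build_handoff_packet({
    "wiki/overview.md": "# Overview",
    "context/project.uai": "uai: 1",
})
```

Sorting the entries keeps the packet deterministic, so two dogfood runs over the same reviewed files produce byte-comparable evidence.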
Reviewed public pages under `content/pages/` are rendered directly by `aiwikis-core` in development and packaged into `aiwikis-core/data/pages` for deployment. The old WordPress Page seed/import strategy has been abandoned so document edits do not require reseeding. Raw archive reports are not rendered directly as public pages.
## What The AI Should Protect
- Source boundaries and citations.
- The distinction between source-site authority and AIWikis archive memory.
- Clear separation between current recommendations, AIWikis build observations, proposed changes, accepted changes, rejected changes, and open questions.
- Installable WordPress package roots, publish outputs, and source-package ownership labels.
- No secrets, credentials, local private paths, or raw private material in public packages.
- No unsupported claims of certification, endorsement, SDK support, CLI support, or production deployment.
- `needs-agent-review` intake files must be processed before unrelated broad work.
## What Humans Need To Review
- Public Markdown page wording, especially where raw archive material has only been summarized.
- Generated recovered-content summaries and indexes, especially where source reports were summarized instead of copied.
- Whether future source packages should be consumed exactly as published, mirrored for install convenience, or only referenced as upstream artifacts.
- Whether the staged LLMWikis integration drafts should be promoted into LLMWikis public handbook pages, summarized in AIWikis only, or held for vendor/source verification.
- WordPress admin setup, menu creation, homepage assignment, plugin activation, theme activation, and public download placement.
- Any policy, security, privacy, legal, production, architecture, or source-governance change.
- Whether a future archive pull should become a public summary page, a source-side wiki node only, or a deferred raw-memory record.
## How The AI Reads This Project
The AI should start with root `AGENTS.md`, read this file, refresh `agent-file-handoff/Content/` and `agent-file-handoff/Improvement/`, ignore `agent-file-handoff/Archive/` unless a human explicitly names an archived file, and load the `.uai` files listed in AGENTS.md before broad edits.
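The Archive rule in that read order can be sketched as a simple filter, assuming a flat list of handoff paths; the function and file names are illustrative, not a real loader in this project.

```python
from pathlib import PurePosixPath

def select_handoff_files(available, explicitly_named=()):
    # Keep Content/ and Improvement/ files; skip Archive/ unless a human named the file.
    keep = []
    for path in available:
        if "Archive" in PurePosixPath(path).parts and path not in explicitly_named:
            continue
        keep.append(path)
    return keep

files = [
    "agent-file-handoff/Content/notes.md",
    "agent-file-handoff/Improvement/ideas.md",
    "agent-file-handoff/Archive/old-plan.md",
]
default_load = select_handoff_files(files)
named_load = select_handoff_files(files, {"agent-file-handoff/Archive/old-plan.md"})
```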
## Build Outputs
The package scripts write ZIP artifacts and release metadata to:
`[local path redacted]`
Why This File Exists
This is a UAI AI Memory handoff file from aiwikis.org. It is shown here because AIWikis.org is demonstrating the real source files that make the UAIX / LLM Wiki memory system work, not only summarizing those systems after the fact.
Role
readme.human is the human-facing companion to the agent entry file. It gives maintainers a plain-language briefing while leaving the hard rules to AGENTS.md, the constraints, and the current human instructions.
Structure
The file is structured around these visible headings: AIWikis.org Human Briefing; What The AI Should Protect; What Humans Need To Review; How The AI Reads This Project; Build Outputs. Those headings are retrieval anchors: a crawler or LLM can decide whether the file is relevant before reading every line.
Prompt-Size And Retrieval Benefit
Keeping this material in a separate file reduces prompt pressure because an agent can load this exact unit only when its role, source site, category, or hash is relevant. The surrounding index pages point to it, while this page preserves the full content for audit and exact recall.
How To Use It
- Humans should read the metadata first, then inspect the raw content when they need exact wording or provenance.
- LLMs and agents should use the source site, category, hash, headings, and related files to decide whether this file belongs in the active prompt.
- Crawlers should treat the AIWikis page as transparent evidence and follow the source URL/source reference for authority boundaries.
- Future maintainers should regenerate this page whenever the source hash changes, then review the explanation if the role or structure changed.
Update Requirements
When this source file changes, update the raw source layer, normalized source layer, hash history, this rendered page, generated explanation, source-file inventory, changed-files report, and any source-section index that links to it.
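The hash-history part of that update loop can be sketched as an append-only JSONL check, assuming a record shape similar to the metadata fields on this page; the real `data/hashes/source-file-history.jsonl` schema may differ.

```python
import hashlib
import json
import tempfile
from pathlib import Path

def record_if_changed(history: Path, source_ref: str, content: bytes, fetched_at: str) -> bool:
    # Append a JSONL record only when the content hash differs from the last entry.
    new_hash = "sha256:" + hashlib.sha256(content).hexdigest()
    last = None
    if history.exists():
        lines = [line for line in history.read_text().splitlines() if line.strip()]
        for line in reversed(lines):
            rec = json.loads(line)
            if rec["source_reference"] == source_ref:
                last = rec["content_hash"]
                break
    if last == new_hash:
        return False  # unchanged: nothing to update downstream
    with history.open("a") as f:
        f.write(json.dumps({
            "source_reference": source_ref,
            "content_hash": new_hash,
            "last_fetched": fetched_at,
        }) + "\n")
    return True

hist = Path(tempfile.mkdtemp()) / "source-file-history.jsonl"
first = record_if_changed(hist, "readme.human", b"v1", "2026-05-03T00:00:00Z")
same = record_if_changed(hist, "readme.human", b"v1", "2026-05-03T01:00:00Z")
changed = record_if_changed(hist, "readme.human", b"v2", "2026-05-03T02:00:00Z")
```

The boolean return is the trigger: only a `True` result should cascade into regenerating the rendered page, explanation, inventory, and reports.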
Related Pages
Provenance And History
- Current observation: 2026-05-03T02:48:13.1276041Z
- Source origin: current-source-workspace
- Retrieval method: local-source-workspace
- Duplicate group: sfg-270 (primary)
- Historical hash records are stored in data/hashes/source-file-history.jsonl.
Machine-Readable Metadata
{
  "title": "AIWikis.org Human Briefing",
  "source_site": "aiwikis.org",
  "source_url": "https://aiwikis.org/",
  "canonical_url": "https://aiwikis.org/files/aiwikis/readme-human-9264b0cb/",
  "source_reference": "readme.human",
  "file_type": "human",
  "content_category": "uai-system",
  "content_hash": "sha256:9264b0cb6ffbfbc5362403bba51ccb345e9b9ad46605e3e7def734fa4d67d9c0",
  "last_fetched": "2026-05-03T02:48:13.1276041Z",
  "last_changed": "2026-05-03T02:37:02.4268898Z",
  "import_status": "changed",
  "duplicate_group_id": "sfg-270",
  "duplicate_role": "primary",
  "related_files": [],
  "generated_explanation": true,
  "explanation_last_generated": "2026-05-03T02:48:13.1276041Z"
}