aiWikis.org

LlmWikis AI Instructions

This is the canonical AI-agent entry file for the LlmWikis WordPress Studio workspace.

Metadata

  • Source site: llmwikis.org
  • Source URL: https://llmwikis.org/
  • Canonical AIWikis URL: https://aiwikis.org/llmwikis/uai-system/files/agents-md-53dfcd73/
  • Source reference: AGENTS.md
  • File type: md
  • Content category: uai-system
  • Last fetched: 2026-05-06T17:58:24.5168382Z
  • Last changed: 2026-05-06T17:46:47.2822315Z
  • Content hash: sha256:53dfcd734ecf496b84fbe8eaa2bb4320a005b3c2f78725dd936ed8cbfb619228
  • Import status: unchanged
  • Raw source layer: data/sources/llmwikis/agents-md-53dfcd734ecf.md
  • Normalized source layer: data/normalized/llmwikis/agents-md-53dfcd734ecf.txt

Current File Content

Structure Preview

  • LlmWikis AI Instructions
  • Status
  • Handoff Summary
  • Workspace Coordination
  • Loaded Context
  • Required First Response
  • File Intake
  • Current State
  • Current Focus
  • Do Not Change Without Explicit Human Approval
  • Deployment Rules
  • Do Not Load By Default
  • Agent History
  • Open Questions

Raw Version

This public page shows a bounded preview of a large source file. The complete source remains in the raw and normalized source layers named in metadata, with the SHA-256 hash above for verification.

  • Source characters: 19859
  • Preview characters: 11955

Local absolute paths are redacted in this public view. The source hash and source-side raw layer are based on the unredacted source file.
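The verification path described above can be sketched in a few lines; this is a minimal example of checking a raw source layer's bytes against a recorded `sha256:...` value, not part of any AIWikis tooling:

```python
import hashlib

def verify_source_hash(data: bytes, expected: str) -> bool:
    """Compare raw source-layer bytes against a recorded 'sha256:<hex>' hash."""
    algo, _, digest = expected.partition(":")
    if algo != "sha256":
        raise ValueError(f"unsupported hash algorithm: {algo}")
    return hashlib.sha256(data).hexdigest() == digest.lower()
```

Because the hash is taken over the unredacted source file, verification only succeeds against the source-side raw layer, not against this redacted public view.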

---
uaix: "1.0"
type: agents
project: "LlmWikis"
created: "2026-04-26"
updated: "2026-05-06"
status: active
version: 41
---

# LlmWikis AI Instructions

## Status

This is the canonical AI-agent entry file for the LlmWikis WordPress Studio workspace.

Use this file as the front door, then read `readme.human`, refresh active intake, and load the `.uai` records below before broad planning or editing.

## Handoff Summary

LlmWikis.org is a prelaunch WordPress handbook for building and maintaining personal and team LLM Wikis: raw source preservation, compiled markdown wiki pages, compact agent rules, deterministic indexes/logs, and ingest/query/lint operations.

LlmWikis is complementary to UAIX.org. UAIX remains canonical for UAI-1, AI Memory, Project Handoff, Agent File Handoff, schemas, registries, validator behavior, roadmap, governance, and public support boundaries. LlmWikis may explain those topics as non-normative handbook or case-study material.

Keep this hot handoff bundle small. Large research, older progress history, pre-slim handoff snapshots, and deep rationale belong in AIWikis long-term memory unless the current task needs original wording. The 2026-05-01 pre-slim LlmWikis snapshot is preserved at `[local path redacted]`.

## Workspace Coordination

This repo is part of a multisite Visual Studio Code workspace. Before broad work, read `E:/JustAnIota/workspace.uai` when the human names any known site/domain, when a request spans multiple sites, or when the current shell directory and requested target differ.

Resolve the target site first. If the target is LlmWikis.org, continue with this file and the LlmWikis `.uai` bundle below. If the target is UAIX.org, AIWikis.org, JustAnIota.com, the short domain, or another registered site, switch to that site's `AGENTS.md` and load only that site's hot memory unless the task explicitly asks for cross-site source routing.

## Loaded Context

Load these files before broad work:

@uai[.uai/context.uai]
@uai[.uai/stack.uai]
@uai[.uai/architecture.uai]
@uai[.uai/public-surface.uai]
@uai[.uai/constraints.uai]
@uai[.uai/decisions.uai]
@uai[.uai/progress.uai]
@uai[.uai/operations.uai]
@uai[.uai/test-plan.uai]
@uai[.uai/style.uai]
@uai[.uai/strategy.uai]
@uai[.uai/handoff-pattern.uai]
@uai[.uai/file-handoff.uai]
@uai[.uai/intake-index.uai]

If a required `.uai` file is missing, unreadable, circular, contradictory, or resolves outside this repo, stop and report that before editing.
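A preflight of this kind can be sketched as a small check that reports records which are missing or resolve outside the repo before any editing begins. The function name is illustrative, and the circular/contradictory checks are left out of this sketch:

```python
from pathlib import Path

# The required .uai bundle from the Loaded Context list above.
REQUIRED_UAI = [f".uai/{name}.uai" for name in (
    "context", "stack", "architecture", "public-surface", "constraints",
    "decisions", "progress", "operations", "test-plan", "style",
    "strategy", "handoff-pattern", "file-handoff", "intake-index")]

def check_uai_bundle(repo_root: Path, required=REQUIRED_UAI):
    """Return (record, problem) pairs for .uai files that should block editing."""
    problems = []
    root = repo_root.resolve()
    for rel in required:
        path = (root / rel).resolve()
        if root not in path.parents and path != root:
            problems.append((rel, "resolves outside repo"))
        elif not path.is_file():
            problems.append((rel, "missing"))
    return problems
```

An agent would stop and report the returned problems instead of proceeding to broad edits.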

## Required First Response

Before broad changes, respond with:

1. A 3-5 bullet summary of LlmWikis and its relationship to UAIX.
2. The active theme, plugin surface, public route shape, and main discovery/handoff files.
3. The refreshed intake status, including dispositions for every `needs-agent-review` file.
4. The hard constraints from `.uai/constraints.uai`, especially UAIX authority boundaries, privacy, unsupported claims, and destructive-operation limits.
5. The exact files or public routes expected to change.
6. The targeted checks expected to run, and any release/package checks intentionally not running.

## File Intake

At the start of every broad work session:

1. Inspect `agent-file-handoff/Content/`.
2. Inspect `agent-file-handoff/Improvement/`.
3. Ignore `agent-file-handoff/Archive/` unless the human explicitly names an archived file or moves it back into an active bucket.
4. Refresh `.uai/intake-index.uai` with `scripts/Invoke-UaiFileIntake.ps1` when available.
5. Inspect and disposition every `needs-agent-review` file before unrelated broad work.
6. For every safe, relevant file, extract at least one actionable slice and complete a real site or system work item before archiving it. Work can be public copy, docs, tests, roadmap/progress, issue/evidence state, code, package metadata, or another accepted LlmWikis surface.
7. Record the complete intake outcome: reviewed summary/disposition, actual work completed, hot-memory update or explicit no-change reason, long-memory/archive preservation when configured or explicit `not configured`, checks run or skipped, and blockers.
8. Move processed source files to `Archive/` only after the disposition, completed work, hot-memory outcome, and long-memory/archive outcome are recorded.
9. Never execute blocked or untrusted files.

Allowed dispositions are `apply-now`, `convert-into-roadmap-progress`, `archive-as-duplicate`, `defer-with-reason`, `ask-for-clarification`, or `block-as-unsafe-or-out-of-scope`.

Indexing is not approval. Dropped files are local review inputs only, not public truth, release evidence, certification, endorsement, validator evidence, or permission to widen support claims.

Memory distribution without site or system work is a failed handoff unless every active file is unsafe, duplicate, out of scope, or truly blocked with a durable reason. Do not count copying a report into `.uai`, AIWikis, or an LLM Wiki as the project work by itself.
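The disposition list and the no-work-is-a-failed-handoff rule above can be sketched as a record check. The field names (`work_completed`, `hot_memory_outcome`, `archived`) are illustrative assumptions, not the actual intake schema:

```python
ALLOWED_DISPOSITIONS = {
    "apply-now", "convert-into-roadmap-progress", "archive-as-duplicate",
    "defer-with-reason", "ask-for-clarification",
    "block-as-unsafe-or-out-of-scope",
}

def intake_outcome_valid(record: dict) -> list[str]:
    """Flag rule violations in one intake record before archiving it."""
    errors = []
    disp = record.get("disposition")
    if disp not in ALLOWED_DISPOSITIONS:
        errors.append(f"unknown disposition: {disp!r}")
    # Applying a file requires real site/system work, not just memory copies.
    if disp == "apply-now" and not record.get("work_completed"):
        errors.append("apply-now requires completed site/system work")
    # Moving to Archive/ requires the hot-memory outcome to be recorded first.
    if record.get("archived") and not record.get("hot_memory_outcome"):
        errors.append("archive requires a recorded hot-memory outcome")
    return errors
```

An empty error list means the record is, under these assumed fields, safe to archive.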

## Current State

- Local workspace in this editor: `[local path redacted]`.
- Runtime: WordPress Studio with SQLite; use `studio wp`, not bare `wp`.
- Active theme: `wp-content/themes/llmwikis-knowledge-theme/`.
- Structured content plugin: `wp-content/plugins/llmwikis-core/`.
- Public route shape: clean root paths such as `/start-here/`, `/architecture/`, `/operations/`, `/schema-engineering/`, `/tools/llm-wiki-setup-wizard/`, `/guides/`, `/guides/canonical-ai-memory/`, `/guides/knowledge-graphs-for-llm-wikis/`, `/guides/using-llm-wiki-with-codex/`, `/guides/llm-wiki-agentic-orchestration/`, `/guides/llm-wiki-ai-memory-project-handoff/`, `/governance/review-gated-publication-model/`, `/reports/`, `/llm-wiki-vs-ai-memory/`, and `/roadmap/`.
- Main discovery files: `llms.txt`, `robots.txt`, and `sitemap.xml`.
- Starter bundle: `/downloads/llm-wiki-starter-bundle-v2.8.0.zip`, generated from the LlmWikis starter registry and owned by LlmWikis.org for the shared release. It includes agent files such as `AGENT_INSTRUCTIONS.md`, `RETRIEVAL_GUIDE.md`, `ORCHESTRATION_RUNBOOK.md`, `TASK_PACKET_TEMPLATE.md`, `SUPPORT_ESCALATION_CHECKLIST.md`, `UPDATE_RULES.md`, `CITATION_RULES.md`, and `SAFETY_BOUNDARIES.md`.
- Setup wizard: `/tools/llm-wiki-setup-wizard/` is a shared human and visitor-AI planning route for LLM Wiki setup.
  - Coverage: new wiki, existing docs, existing-system additive updates, Agent File Handoff, Project Handoff, combined File Handoff plus LLM Wiki, skill/capability setup paths, Canonical AI Memory layer routing, single-site `.uai` handoff versus multisite `workspace.uai` routing, multi-repository Git preflight, mutable runtime artifact policy, context-budget and duplicate-file controls, root index topology for single-codebase versus multisite sub-wiki indexes, multisite LLM Wiki interaction strategy, and Knowledge Graph setup.
  - Questions: project stage, collaboration model, workspace coordinator path, target policy, site registry, per-repository sync/merge checks, tracked generated/runtime artifacts, large-file policy, duplicate-file policy, generated-history retention policy, raw and compiled paths, index/log evidence, root index shape, sub-wiki index paths, global-only root files, entity-page and episodic-log patterns, archive targets, transfer evidence logs, source collections, source-site/shared-archive strategy, update timing, trust labels, review gates, optional UAIX AI Memory Project Handoff, optional local Agent File Handoff buckets, support escalation, knowledge graph strategy/storage model/stable IDs/claim-source spans/review states/validation/export/retrieval policy, and skill/capability boundaries.
  - Outputs: a browser-only setup packet, structured setup model JSON, setup-readiness checklist, first-file guidance, first actions before ingest, direct fragment/query setup paths, local browser draft restore, and an embedded visitor-AI digest.
  - Layout: the homepage keeps the structured knowledge-flow graph in the top hero and places the human/AI wizard visual cards below that hero flow; the wizard page also exposes both visual entry cards.
  - Non-claims: it does not claim repository writes, automatic LLM Wiki sync, automatic publication, hosted graph database, public graph API, public SPARQL endpoint, hosted import validation, public MCP, public write API, open editing, official UAIX generator, SDK, CLI, certification, endorsement, or conformance support.
- Deployment versioning follows the affected-version rule: WordPress ZIP filenames, theme/plugin versions, package metadata, and footer labels should move to the current system-wide version only when LlmWikis or a specific LlmWikis package actually changed. Unchanged packages keep their prior names and versions.
- Local handoff files: root `AGENTS.md`, root `readme.human`, typed `.uai` records, active file intake, and targeted checks.
- Long-memory archive: AIWikis stores processed LlmWikis archive memory and pre-slim handoff snapshots under `raw/system-archives/llmwikis/`.
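The affected-version rule above can be sketched as a small naming helper; the function and parameter names are illustrative, not part of any LlmWikis release tooling:

```python
def release_artifact_name(base: str, package_changed: bool,
                          prior_version: str, system_version: str) -> str:
    """Apply the affected-version rule: a package moves to the current
    system-wide version only when it actually changed; otherwise it keeps
    its prior name and version."""
    version = system_version if package_changed else prior_version
    return f"{base}-v{version}.zip"
```

For example, an unchanged starter bundle keeps its `llm-wiki-starter-bundle-v2.8.0.zip` filename even after the system-wide version moves on.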

## Current Focus

- Keep LlmWikis handbook-first: practical, source-aware guidance for building LLM Wikis.
- Make first-run usefulness obvious before deep theory: fast-tip entry points, concrete examples, starter-bundle paths, and clearer task/persona routing should lead readers into the deep handbook.
- Teach and dogfood the hot/cold memory split: hot Project Handoff files stay concise; cold history and raw evidence live in AIWikis.
- Keep UAIX authority boundaries clear in every UAI-1, AI Memory, Project Handoff, and validator-related page.
- Keep public support claims limited to what exists: handbook routes, discovery files, starter bundle, local handoff pattern, and targeted checks.
- Keep Knowledge Graph guidance practical and deeply supportive while source-bounded: graph views derive from reviewed pages, stable IDs, source spans, claim states, contradiction records, and deterministic lint; graph retrieval must cite, surface conflicts, and abstain when evidence is missing or blocked.
- Treat context budget as part of LLM Wiki structure: large raw files, duplicate generated pages, generated history, stale generated output, package mirrors, and large JSON reports need explicit skip/retention rules before broad agent traversal.
- Keep skill-folder, modular-capability, capability-catalog, capability-lint, and portable skill-bundle work in gated roadmap state until source-reviewed pages, templates, tests, and support boundaries exist.
- Keep live benchmarks, automated ingestion, public MCP, public editing, memberships, grants, certification, and multilingual support as planned until implemented and reviewed.

## Do Not Change Without Explicit Human Approval

- Do not present LlmWikis as the canonical UAI-1 standards host.
- Do not claim official UAIX generation, validation, SDK, CLI, certification, endorsement, or conformance support from prototype or local files.
- Do not claim live benchmark integrations, automated literature ingestion, public MCP server support, open public editing, memberships, grants, or multilingual support until implemented and reviewed.
- Do not publish private repository `AGENTS.md`, `readme.human`, `.uai/` files, intake files, archives, package archives, or WordPress admin routes in public discovery output.
- Do not attribute the LLM Wiki concept by creator name in public website copy; neutral direct source links are allowed.
- Do not treat dropped intake files as public truth or production-ready content just because they were indexed.
- Do not edit WordPress core under `wp-admin/` or `wp-includes/`.
- Do not remove `wp-content/db.php` or the SQLite must-use plugin.
- Do not use destructive filesystem or git operations unless the human explicitly asks.

## Deployment Rules

- Studio guidance lives in `STUDIO.md`.
- Use `studio site status` for the local URL and credentials.
- Use `studio site start --skip-browser` before runtime checks.

Why This File Exists

This is a UAI AI Memory handoff file from llmwikis.org. It is shown here because AIWikis.org is demonstrating the real source files that make the UAIX / LLM Wiki memory system work, not only summarizing those systems after the fact.

Role

AGENTS.md is the entry contract for agents. It tells an agent which context files to load, what authority boundaries apply, how file intake works, and which operations are out of bounds before broad edits start.

Structure

The file is structured around these visible headings: LlmWikis AI Instructions; Status; Handoff Summary; Workspace Coordination; Loaded Context; Required First Response; File Intake; Current State. Those headings are retrieval anchors: a crawler or LLM can decide whether the file is relevant before reading every line.

Prompt-Size And Retrieval Benefit

Keeping this material in a separate file reduces prompt pressure because an agent can load this exact unit only when its role, source site, category, or hash is relevant. The surrounding index pages point to it, while this page preserves the full content for audit and exact recall.
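The load-only-when-relevant behavior described here can be sketched as a metadata gate that decides from the source site, category, and hash alone, before reading any content. The `task` fields are illustrative assumptions, not an actual AIWikis API:

```python
def should_load(file_meta: dict, task: dict) -> bool:
    """Decide whether a file unit belongs in the active prompt using only
    its metadata, never its full content."""
    if task.get("site") and file_meta.get("source_site") != task["site"]:
        return False
    if task.get("category") and file_meta.get("content_category") != task["category"]:
        return False
    # Skip re-loading when a copy with the same content hash is already in context.
    if file_meta.get("content_hash") in task.get("already_loaded_hashes", set()):
        return False
    return True
```

The same gate run over an index of many such pages is what keeps prompt pressure low: only matching units are pulled into context.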

How To Use It

  • Humans should read the metadata first, then inspect the raw content when they need exact wording or provenance.
  • LLMs and agents should use the source site, category, hash, headings, and related files to decide whether this file belongs in the active prompt.
  • Crawlers should treat the AIWikis page as transparent evidence and follow the source URL/source reference for authority boundaries.
  • Future maintainers should regenerate this page whenever the source hash changes, then review the explanation if the role or structure changed.

Update Requirements

When this source file changes, update the raw source layer, normalized source layer, hash history, this rendered page, generated explanation, source-file inventory, changed-files report, and any source-section index that links to it.

Provenance And History

  • Current observation: 2026-05-06T17:58:24.5168382Z
  • Source origin: current-source-workspace
  • Retrieval method: local-source-workspace
  • Duplicate group: sfg-229 (primary)
  • Historical hash records are stored in data/hashes/source-file-history.jsonl.
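Appending one observation to a history file like `data/hashes/source-file-history.jsonl` follows the usual JSON Lines pattern: one self-contained JSON object per line. The field names in this sketch are illustrative, not the actual schema:

```python
import json

def append_hash_record(stream, source_ref: str, content_hash: str,
                       observed: str) -> None:
    """Write one hash observation as a single JSON Lines record."""
    record = {
        "source_reference": source_ref,  # e.g. "AGENTS.md"
        "content_hash": content_hash,    # "sha256:<hex>"
        "observed": observed,            # ISO-8601 timestamp
    }
    stream.write(json.dumps(record) + "\n")
```

Because each line is independent, history can be appended without rewriting the file, and readers can stream it record by record.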

Machine-Readable Metadata

{
  "title": "LlmWikis AI Instructions",
  "source_site": "llmwikis.org",
  "source_url": "https://llmwikis.org/",
  "canonical_url": "https://aiwikis.org/llmwikis/uai-system/files/agents-md-53dfcd73/",
  "source_reference": "AGENTS.md",
  "file_type": "md",
  "content_category": "uai-system",
  "content_hash": "sha256:53dfcd734ecf496b84fbe8eaa2bb4320a005b3c2f78725dd936ed8cbfb619228",
  "last_fetched": "2026-05-06T17:58:24.5168382Z",
  "last_changed": "2026-05-06T17:46:47.2822315Z",
  "import_status": "unchanged",
  "duplicate_group_id": "sfg-229",
  "duplicate_role": "primary",
  "related_files": [],
  "generated_explanation": true,
  "explanation_last_generated": "2026-05-06T17:58:24.5168382Z"
}