In short: install the plugin → type /llm-wiki:setup in chat → capture in raw/ → curate wiki/ → /llm-wiki:build-og when you want a static viewer. From a shell, llm-wiki build-site and llm-wiki build-og are the same build (alias names).
Turn scattered source material into a maintained wiki your agent can keep improving over time — slash-command first (/llm-wiki:setup, /llm-wiki:ingest, /llm-wiki:build-og), not flag hell.
Every slash command is a curated prompt under commands/: setup, ingest, query, memory, MCP, and more. You stay in Claude Code; the repo still gives you raw/, wiki/, and schema-backed workflows when you need depth.
If I have seen further it is by standing on the shoulders of Giants.
— Isaac Newton, letter to Robert Hooke, 1675
(echoing Bernard of Chartres, nanos gigantum humeris insidentes)
Type / in chat — each commands/<name>.md file is /llm-wiki:<name>. No need to hunt for bin/llm-wiki for daily work: use /llm-wiki:ingest, /llm-wiki:query, /llm-wiki:memory, /llm-wiki:status, /llm-wiki:build-og, and /llm-wiki:mcp as often as /llm-wiki:setup or /llm-wiki:configure. The full catalog is under Skills & Commands.
First workflow (slash — easiest)
1. Set up the vault
/llm-wiki:setup
2. Ingest one source
/llm-wiki:ingest
3. Build the viewer
/llm-wiki:build-og
The prompts carry the flags and adapters for you. Prefer a terminal? Same steps map to llm-wiki setup, ingest …, and llm-wiki build-site (alias build-og, same build) — see CLI reference.
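The same three steps from a terminal, as a sketch (the ingest source here is hypothetical; run llm-wiki ingest --help for the exact arguments your adapters accept):
# First workflow from a shell (sketch)
llm-wiki setup --root .
llm-wiki ingest https://example.com/article   # hypothetical URL; files and HN threads enter the same way
llm-wiki build-site   # or: llm-wiki build-og (alias, same build)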
The vault keeps raw inputs separate from curated knowledge. That lets you preserve the source of truth, revise derived notes safely, and keep schema-driven rules nearby.
Files, URLs, and Hacker News threads can all enter the same workflow.
Optional connectors can extend capture without changing the vault model.
The review path helps keep prompt-injected or low-trust material from silently becoming wiki truth.
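As a sketch, the scaffolded layout looks roughly like this (directory names come from the setup templates listed in the CLI reference; treat the exact tree as illustrative):
llm-wiki/
├── raw/        # captured sources: the source of truth
├── wiki/       # curated notes, safe to revise
│   └── .og/    # static viewer output (build-site / build-og)
├── schema/     # schema-driven rules
└── CLAUDE.md   # agent instructions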
03
Static Viewer
After ingest and curation, run the build so wiki/.og/ gets wiki-data.json plus the static viewer (graph UI, search, page reader). In chat that is /llm-wiki:build-og. In the shell, llm-wiki build-site and llm-wiki build-og are the same subcommand — two names, one pipeline: build-site is the canonical spelling; build-og is an alias that matches the .og output folder and the slash command name.
[Viewer preview: the LLM Wiki graph UI with search (⌘K) and the Gennie panel; 12 nodes · 18 links; selected: concepts/graphs.md; linked pages: [[index]], [[raw/clips/example]]]
# Build (slash — runs the same pipeline as build-site / build-og)
/llm-wiki:build-og
# Either name emits wiki/.og/ + wiki-data.json — pick one:
llm-wiki build-site
# llm-wiki build-og
# Optional: build + local HTTP (port from viewer.port, default 8765)
llm-wiki build-og --serve-background
llm-wiki build-og --stop-serving
# Identical flags on build-site, e.g. llm-wiki build-site --serve-background
# Foreground (Ctrl+C stops): llm-wiki build-site --serve
Static files must be served over HTTP, not file://, so graph assets and wiki data load predictably. The slash command runs the build only. From a shell, --serve / --serve-background work on both build-site and build-og (same flags); --stop-serving stops the background server (PID in wiki/.og/.viewer-http.pid). Docs often show build-og next to --serve* because the name matches wiki/.og/. You can use ./scripts/serve-viewer.sh after a build if you prefer.
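If you'd rather serve the built files yourself, any static HTTP server pointed at wiki/.og/ works; for example, Python's stdlib server (8765 mirrors the viewer.port default, adjust as needed):
# Serve the built viewer without llm-wiki (Python 3.7+ for --directory)
python3 -m http.server 8765 --directory wiki/.og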
04
Skills & Commands
The main surface is slash commands in Claude Code — each one is a prompt backed by the same skills as the CLI. Use / for day-to-day vault work; reach for bin/llm-wiki when you want automation, scripts, or exact flags. For the full terminal reference, see CLI reference below.
Claude Code slash commands
This is the path we optimize for: prompt bodies live in commands/; invoke as /llm-wiki:<name> (hyphenated filenames stay hyphenated, e.g. graph-knowledge). After plugin changes, /reload-plugins in Claude Code. Each name below links to the markdown source on GitHub.
Vault setup & health
First run, re-running the wizard, or when you need to tune config or sanity-check the vault — a different rhythm from the capture-and-read loop below.
Install the plugin once, then stay in the slash menu for the loop.
Install: /plugin install llm-wiki@llm-wiki-local
Setup: /llm-wiki:setup
Ingest: /llm-wiki:ingest
Build: /llm-wiki:build-og
Core (slash first)
/llm-wiki:setup scaffolds the vault — same as llm-wiki setup.
/llm-wiki:ingest drives sources into raw/ — adapters and flags live in the prompt.
/llm-wiki:build-og emits the static reader into wiki/.og/ — same pipeline as llm-wiki build-site or llm-wiki build-og (alias).
Keep it healthy
/llm-wiki:configure or /llm-wiki:status for config and health checks.
/llm-wiki:raw-prepare and /llm-wiki:lint for raw cleanup and wiki quality.
Skills like wiki-ingest and wiki-query align with the slash flows — use CLI only when you need non-interactive scripts.
Also works with: bin/llm-wiki or python3 scripts/llm_wiki.py for CI and power users; Cursor and Codex via AGENTS.md; manual clone for plugin development; Obsidian as an optional reader for llm-wiki/wiki/.
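A minimal CI sketch built from flags documented in the CLI reference below (wire it into your own pipeline):
# CI sketch: fail on broken wikilinks, rebuild only when stale
llm-wiki validate --wikilinks && llm-wiki build-site --if-stale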
Three ideas share the word memory here: session memory (per-chat notes under raw/memory/ when enabled), search backends for MCP and the wiki (lexical, semantic, or hybrid with reciprocal-rank fusion), and benchmark docs for long-context retrieval evaluation (LME, LoCoMo, ConvoMem) — separate from vault content.
# After memory.enabled — CLI one-liners
llm-wiki memory save --current "Pinned fact for this chat"
llm-wiki memory list
# Same benchmarks from the terminal (plugin repo)
llm-wiki benchmark suites
Session memory: opt-in via memory.enabled; start from /llm-wiki:memory — CLI memory {save|log|list|show|recall|prune} matches the same feature; hooks can write .current-session for --current.
MCP search: mcp.search_backend — fts5 (BM25), grep, chromadb (embeddings), or hybrid (FTS5 + Chroma + RRF; mcp.hybrid_rrf_k defaults to 60); see the config sketch after this list.
“Memory” in docs/memory/benchmarks/ means long-context retrieval evaluation, not the session-memory feature under raw/memory/.
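A sketch of the config keys involved (names as this page uses them; verify against your config.json or llm-wiki configure -i):
# memory.enabled      opt-in session memory under raw/memory/
# mcp.search_backend  fts5 | grep | chromadb | hybrid
# mcp.hybrid_rrf_k    RRF constant k (default 60); hybrid sums 1/(k + rank) across backends
llm-wiki configure -i   # interactive way to set these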
CLI
Terminal & automation (CLI)
Slash commands above are the friendly path. When you need scripts, CI, or exact flags, run bin/llm-wiki or python3 scripts/llm_wiki.py from the repo root or vault. Use --help on any subcommand.
Vault & config
llm-wiki configure · -i interactive
Write or edit config.json (paths, viewer, integrations, persona).
llm-wiki setup --root .
Scaffold llm-wiki/ from templates (raw, wiki, schema, CLAUDE.md).
llm-wiki teardown
Remove generated artifacts; optional --purge to remove the vault tree.
llm-wiki validate · --wikilinks
Check vault layout; optionally fail if [[wikilinks]] point to missing pages.
Together these let you see which adapters are ready, validate config, run interactive setup, or store API keys for tools like Firecrawl.
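A typical first-run sequence composed from the commands above (a sketch, not a required order):
llm-wiki setup --root .        # scaffold llm-wiki/
llm-wiki configure -i          # write or edit config.json interactively
llm-wiki validate --wikilinks  # fail if [[wikilinks]] point to missing pages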
Build, graphs & context
llm-wiki build-site
Canonical subcommand: write wiki-data.json and the static viewer into wiki/.og/. Respects viewer.enabled.
llm-wiki build-og
Alias for build-site — identical behavior and flags (--serve, --serve-background, --stop-serving, --if-stale, --port). The name matches the .og folder and /llm-wiki:build-og.
llm-wiki graph · --mode links|knowledge
Generate a D3 graph in .tmp/llm-wiki-graph/ (link view or cluster view).
llm-wiki graph-knowledge
Shortcut for graph --mode knowledge (connected-component clusters).
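For example (both invocations land in .tmp/llm-wiki-graph/):
llm-wiki graph --mode links       # link view
llm-wiki graph --mode knowledge   # connected-component clusters
llm-wiki graph-knowledge          # shortcut for the previous line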