
wikimd-cli

v1.3.0


LLM Wiki

Building personal knowledge bases with LLMs. A pattern for incrementally building and maintaining a persistent wiki through AI agent collaboration.

Status: Actively used | Pattern: Proven | Last updated: 2026-04-08

What is LLM Wiki?

Instead of uploading documents once and asking questions (RAG), an LLM Wiki persistently builds and maintains a structured markdown wiki as you feed it sources. Knowledge compounds over time.

  • One-time synthesis — Each source is read once, integrated into the wiki
  • Persistent artifact — The wiki grows richer with every source and query
  • Agent-maintained — LLM handles all cross-references, updates, consistency
  • Your job — Curate sources, ask questions, guide the analysis

Think of it as: Obsidian (IDE) + LLM (programmer) + Wiki (codebase)

How It Works

Raw Sources (immutable)
         ↓ (ingest)
    LLM Wiki (persistent markdown files)
         ↓ (query + synthesis)
    Knowledge + Answers (feed back into wiki)

Three Layers

  1. Raw Sources — Articles, papers, reports. You curate; LLM reads.
  2. The Wiki — LLM-generated markdown. Summaries, entity pages, cross-references.
  3. The Schema — CLAUDE.md or AGENTS.md. Tells the LLM how to maintain the wiki.

Operations

Ingest

You add a source → LLM reads it → Updates wiki (10-15 pages touched) → Appends to log

# Step 1: Add source to raw/
cp article.pdf raw/sources/article.pdf

# Step 2: Tell your LLM agent to ingest
# Agent processes: reads, extracts, links, updates indices
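The README doesn't fix a format for the log append in the last step; one hypothetical log.md entry (all file names illustrative) might look like:

```markdown
## 2026-04-08 ingest: article.pdf
- Created: wiki/concepts/spaced-repetition.md
- Updated: wiki/entities/jane-doe.md, wiki/topics/learning.md, index.md
- Flagged: 1 contradiction (recorded in the contradiction matrix)
```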

Query

You ask questions → LLM searches wiki → Synthesizes answer → Files back into wiki

# Step 3: Ask a question
# "What are the main themes across all sources about X?"

# Step 4: Good answers become wiki pages
# Answer → wiki/themes/x-synthesis.md

Lint

Periodically audit the wiki for contradictions, orphans, gaps.

# Health checks
# - Orphan pages (no inbound links)
# - Contradictions between pages
# - Missing cross-references
# - Data gaps
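The orphan check can be approximated in plain shell, no CLI required. This sketch builds a throwaway wiki/ directory with made-up page names just to show the idea:

```shell
# Demo setup: three made-up pages; only one is linked from another page
mkdir -p wiki
printf '# Hub\nSee [[linked-page]].\n' > wiki/hub.md
printf '# Linked page\n' > wiki/linked-page.md
printf '# Orphan page\n' > wiki/orphan.md

# A page is an orphan if no other page contains a [[wikilink]] to it
for f in wiki/*.md; do
  name=$(basename "$f" .md)
  grep -rq "\[\[$name\]\]" wiki/ || echo "orphan: $name"
done
# prints:
# orphan: hub
# orphan: orphan
```

At real scale you would also want to ignore self-links and check index.md coverage; this is the minimal version.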

Wiki Structure

After setup, your wiki looks like this:

my-wiki/
├── raw/                    # Source documents (immutable)
│   ├── sources/           # Articles, PDFs, reports
│   └── assets/            # Downloaded images
├── wiki/                  # LLM-generated knowledge
│   ├── entities/          # People, places, organizations
│   ├── concepts/          # Ideas, frameworks, techniques
│   ├── topics/            # Domain-specific summaries
│   ├── comparisons/       # Analysis across sources
│   └── synthesis/         # Integrated views
├── index.md               # Catalog (all pages with summaries)
├── log.md                 # Chronological record (ingest, query, lint)
├── CLAUDE.md              # Agent instructions
├── CONVENTIONS.md         # Writing rules & frontmatter format
└── README.md              # Your wiki's purpose

3-Minute Quick Start

npm install -g wikimd-cli
wikimd init my-wiki && cd my-wiki
# Choose a schema: copy CLAUDE-{personal,reading,team,research,competitive}.md
wikimd ingest ~/Downloads/article.pdf
# Share the checklist with your LLM agent

Full tutorial: QUICKSTART.md (3 minutes)

Using CLI vs Manual Setup

Option 1: Using CLI (Recommended)

npm install -g wikimd-cli
wikimd init my-wiki
cd my-wiki
# Edit CLAUDE.md to choose a schema
# Start ingesting sources: wikimd ingest <file>

Option 2: Manual Setup

1. Create the Directory Structure

mkdir -p my-wiki/{raw/{sources,assets},wiki/{entities,concepts,topics,comparisons,synthesis}}
cd my-wiki
git init

2. Write Your Schema (CLAUDE.md)

This tells your LLM agent how to maintain the wiki. Example:

# My Wiki — LLM Agent Configuration

## Frontmatter Convention
All pages use YAML frontmatter:
- title, type, tags, created, updated, status
- source_count, key_claims, contradictions

## Page Types
- entity: Person, place, organization
- concept: Idea, technique, framework
- topic: Domain summary
- synthesis: Integrated analysis

## Ingest Workflow
1. Read source (extract key claims)
2. Create/update 5-10 wiki pages
3. Update index.md
4. Log entry to log.md

## Query Workflow
1. Search wiki for relevant pages
2. Synthesize answer
3. File answer back as wiki page if valuable
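Under the frontmatter convention above, a generated page might look like this (all values illustrative, not part of the spec):

```markdown
---
title: Spaced Repetition
type: concept
tags: [learning, memory]
created: 2026-04-01
updated: 2026-04-08
status: active
source_count: 3
key_claims: ["Review intervals should grow over time", "Active recall beats re-reading"]
contradictions: []
---

# Spaced Repetition

Summary of the concept, with [[wikilinks]] to related pages...
```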

3. Add Your First Source

# Place a document
cp ~/Downloads/article.pdf raw/sources/

# Ask your LLM to ingest it
# Agent reads → creates entity/concept/synthesis pages → updates indices

4. Query and Explore

# Ask questions that span multiple sources
# Agent synthesizes from wiki, not re-reading raw sources
# Good answers become new wiki pages

Example Use Cases

Personal Knowledge
Track goals, health insights, self-improvement. File journal entries, articles, podcast notes. Build a structured picture of yourself.

Research Deep-Dive
Read 50 papers on a topic over 3 months. Wiki accumulates the synthesis automatically. By month 3, every new paper's claims are integrated against existing knowledge.

Book Companion
As you read a novel, agent creates pages for characters, themes, plot threads, settings. By the end: a rich companion wiki.

Team Knowledge
Internal wiki fed by Slack threads, meeting transcripts, customer calls. Stays current because LLM does maintenance.

Competitive Analysis
Track competitors, market trends, strategic moves over time. Wiki is always current.

CLI Feature Status

The official CLI (wikimd-cli) has two tiers of features:

✅ Stable (Production Ready)

  • wikimd init — Initialize new wiki
  • wikimd ingest — Add sources and generate checklists

These are production-ready and covered by semantic versioning. Use them confidently.

⚠️ Experimental (Use at Your Own Risk)

  • wikimd search — Search pages (use grep as alternative)
  • wikimd lint — Check wiki health (use git as alternative)
  • wikimd sync — Rebuild indices (manual update is fine)

These are useful but subject to change. Your feedback helps us decide their future. See STABILITY.md for details.


Documentation

Start Here (First-Time Users)

| Document | Time | Topic |
|----------|------|-------|
| QUICKSTART.md | 3 min | Get your wiki running in 3 minutes |
| COMMAND-REFERENCE.md | 5 min | Command cheat sheet and options |

Core Documentation

| Document | Topic |
|----------|-------|
| docs/FAQ.md | 25+ common questions and answers |
| docs/API-REFERENCE.md | Complete schema customization guide |
| docs/BEST-PRACTICES.md | 7 proven patterns for wiki maintenance |
| docs/INTEGRATION-GUIDES.md | Integration with Obsidian, GitHub, Claude Code, Slack, etc. |

Reference

| Document | Topic |
|----------|-------|
| STABILITY.md | API stability and feature maturity |

Tips & Tricks

Indexing

Maintain two special files:

  • index.md — Content-oriented catalog (link + one-line summary per page)
  • log.md — Chronological record (append-only, one entry per ingest/query/lint)

At small scale, index.md is enough. As you grow, add a search tool:

  • qmd — Local search (BM25 + vector + LLM re-ranking)
  • Custom script — Simple markdown grep
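A few hypothetical index.md entries, in the link-plus-summary shape described above:

```markdown
## Concepts
- [[spaced-repetition]] - Review scheduling and why intervals should grow (3 sources)
- [[active-recall]] - Retrieval practice vs. passive review (2 sources)

## Synthesis
- [[themes]] - Cross-source themes, updated on every ingest
```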

Optional: CLI Tools

Build helpers as needed (LLM can help you vibe-code them):

# Example: simple markdown search (a function, since aliases can't take "$1" arguments)
wiki-search() { grep -rn "$1" wiki/ | head -20; }

# Example: rebuild indices (placeholder; trigger your LLM agent to do the rebuild)
wiki-sync() { echo "Rebuilt index and log"; }

Obsidian Workflow

  1. Open your wiki vault in Obsidian
  2. Start your LLM agent in wiki directory
  3. Agent edits markdown files
  4. You browse results in Obsidian in real time:
    • Follow [[wikilinks]]
    • Check graph view (which pages are hubs?)
    • Read updated synthesis pages

Web Clipper

Use the Obsidian Web Clipper browser extension to quickly clip articles into raw/sources/.

Git Versioning

Your wiki is a git repo of markdown files. You get version history for free.

git log --oneline wiki/   # See how pages evolved
git diff wiki/synthesis/  # What changed in synthesis?

Common Patterns

Pattern: Persistent Synthesis

Instead of asking the same question repeatedly:

❌ Every time: "What are the key themes?"
   LLM re-reads all sources, synthesizes from scratch

✅ Once: "What are the key themes?" → wiki/synthesis/themes.md
   Future queries: "Add this new source's themes to themes.md"
   LLM integrates new claims against existing synthesis

Pattern: Maintaining Consistency

As the wiki grows, contradictions emerge. The LLM flags them in a contradiction matrix:

# Contradictions

| Claim | Source A | Source B | Resolution |
|-------|----------|----------|-----------|
| X causes Y | Claims yes (p.5) | Claims no (p.12) | Need more sources |

Pattern: Evolution of Ideas

Track how your understanding changes:

v1 (Source 1): "X is caused by Y"
v2 (Source 2): "Actually, X is caused by Z"
v3 (Source 3): "X is caused by both Y and Z, under different conditions"
→ This evolution lives in git history and synthesis pages
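The v1-to-v2 revision above can be replayed in a throwaway repo to see how the evolution lands in git history (the page path is made up; assumes git is installed):

```shell
# Throwaway repo standing in for a wiki
git init -q demo
git -C demo config user.email demo@example.com
git -C demo config user.name demo
mkdir -p demo/wiki/synthesis

# v1: first source's claim
echo "X is caused by Y" > demo/wiki/synthesis/x.md
git -C demo add -A
git -C demo commit -qm "v1: source 1 claim"

# v2: second source revises it
echo "Actually, X is caused by Z" > demo/wiki/synthesis/x.md
git -C demo add -A
git -C demo commit -qm "v2: source 2 revises the claim"

# Both versions of the understanding, newest first
git -C demo log --oneline -- wiki/synthesis/x.md
```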

Why This Works

The tedious part of knowledge management is bookkeeping:

  • Updating cross-references (15+ pages affected by one change)
  • Keeping summaries current
  • Noting contradictions
  • Maintaining consistency across 50+ pages

Humans abandon this. Maintenance overhead grows faster than value.

LLMs don't get bored. They touch 15 files in one pass, remember every contradiction, and maintain perfect consistency.

Your job: curate sources, ask good questions, think about meaning.
LLM's job: everything else.

Philosophy

This pattern echoes Vannevar Bush's Memex (1945) — a personal, curated knowledge store with associative trails between documents. Bush imagined the power of interlinked knowledge but couldn't solve the maintenance burden.

LLMs handle that.

Contributing

Have a pattern that works? Wiki variant? Tooling? Share it.

See CONTRIBUTING.md.

License

MIT. Use and adapt freely.

Schema Examples

Pick a schema that matches your use case. Each includes a full CLAUDE.md template, workflows, best practices, and example pages.

Available Schemas

| Use Case | File | Best For |
|----------|------|----------|
| Research Deep-Dive | CLAUDE-research.md | Reading 50+ papers, competitive analysis, strategic research |
| Personal Knowledge | CLAUDE-personal.md | Goals, health, psychology, self-improvement, journaling |
| Reading Companion | CLAUDE-reading.md | Fiction novels, character tracking, theme analysis |
| Team Knowledge | CLAUDE-team.md | Team wiki from Slack, meetings, customer calls |
| Competitive Intelligence | CLAUDE-competitive.md | Tracking competitors, market trends, strategic moves |

How to Use an Example

  1. Choose a schema that matches your domain
  2. Copy the entire file into your wiki directory as CLAUDE.md
  3. Customize the directory structure and frontmatter for your needs
  4. Share the schema with your LLM agent (Claude Code, Cursor, etc.)
  5. Start ingesting sources and building your wiki
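Steps 1-2 boil down to a single copy. This sketch fakes the template file first, since the real path depends on where you cloned the examples:

```shell
# Stand-in for the real template under examples/ (contents abbreviated)
mkdir -p examples my-wiki
echo "# My Wiki - LLM Agent Configuration" > examples/CLAUDE-research.md

# Copy the chosen schema into your wiki as CLAUDE.md
cp examples/CLAUDE-research.md my-wiki/CLAUDE.md
```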

Further Reading

  • llm-wiki.md — Full pattern specification (copy this into your own LLM agent)
  • examples/ — Sample schemas and wiki structures for different domains
  • CONTRIBUTING.md — How to share your own patterns and schemas

Built to work with any LLM: Claude Code, Cursor, Copilot, Cline, ChatGPT, or any CLI-based agent.