# echidna-ai-doc

Instant, private, AI-authored documentation for every codebase. Runs entirely on your hardware: no API tokens, no network latency, no data leakage.
## Why this exists

Software moves faster than humans can document it. The gap between what runs and what is understood widens every sprint. `echidna-ai-doc` closes that gap: a single command spins up a local language model, crawls your repository, and emits clean, pragmatic Markdown that anyone can read, whether for onboarding, audits, or open-source releases.

We ship a 726 MB quantised model (`smollm2:360m`) that fits comfortably on a laptop yet punches well above its weight at code understanding. Run offline, commit the docs, ship with confidence.
## Key capabilities

| Feature | Dev mode (`-d`) | Prod mode (`-p`) |
|---------|-----------------|------------------|
| In-depth API & source commentary | ✔ | – |
| End-user / SDK documentation | – | ✔ |
| Custom prompt override (`--prompt-file`) | ✔ | ✔ |
| Single-file multi-model quality test (`--file --compare`) | ✔ | – |
| Generates one `docs.md` file for effortless publishing | ✔ | ✔ |
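
Because both modes collapse to a single `docs.md`, the CLI drops neatly into npm scripts. A sketch of a `package.json` fragment (script names are illustrative; the flags are the ones from the table above):

```json
{
  "scripts": {
    "docs:dev": "echidna-ai-doc -d --out docs-dev",
    "docs:prod": "echidna-ai-doc -p --out docs-prod"
  }
}
```

After installing (next section), `npm run docs:dev` regenerates developer docs on demand.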
## Installation

```shell
# 1. Local LLM runtime (Ollama)
curl -fsSL https://ollama.ai/install.sh | sh

# 2. Pull the default model (one-time, 726 MB download)
ollama pull smollm2:360m

# 3. Add echidna-ai-doc to your toolchain
npm i -D echidna-ai-doc
```

## One-minute tour
### Developer documentation

```shell
npx echidna-ai-doc -d --out docs-dev
open docs-dev/docs.md
```

### Production / user docs

```shell
npx echidna-ai-doc -p --out docs-prod
```

### Evaluate model quality on a single file
```shell
npx echidna-ai-doc \
  --file src/index.ts \
  --compare \
  --out compare-docs
```

`compare-docs/` will contain three versions, one each from the 135 M, 360 M, and 4 B parameter models, so you can eyeball the trade-offs.
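
To compare the variants quickly, a small helper like this can word-count each one (a sketch: `count_words` is defined here for illustration, and the exact filenames `--compare` emits may differ in your version):

```shell
# Word-count every Markdown file the compare run produced.
# count_words is an illustrative helper, not part of the CLI.
count_words() { wc -w < "$1"; }

for f in compare-docs/*.md; do
  if [ -e "$f" ]; then
    printf '%s: %s words\n' "$f" "$(count_words "$f")"
  fi
done
```

A much shorter output usually means the model summarised aggressively; skim the longest and shortest versions side by side before picking a default model.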
### Tailor the voice

```shell
cat > prompt.txt << 'TXT'
You are a senior developer advocate. Produce **concise**, task-oriented docs with code blocks.
TXT
npx echidna-ai-doc -p --prompt-file prompt.txt --out concise-docs
```

## CLI reference
```
Usage: echidna-ai-doc [options]

Options
  -d, --dev                Developer-oriented docs (default)
  -p, --prod               Product / user-facing docs
  --file <path>            Limit the run to a single file
  --compare                With --file, run all bundled models for side-by-side quality
  --prompt-file <path>     Custom prompt text/markdown
  -b, --batch <n>          Parallel files (default: 3)
  -o, --out <dir>          Output directory (default: docs)
  -t, --temperature <n>    LLM creativity (default: 0.1)
  -n, --maxTokens <n>      Generation cap (default: 512)
  --ollama-url <url>       Remote Ollama instance (default: localhost)
  -h, --help               Show help
```

## Roadmap
- Embedding cache for true incremental runs.
- OpenAPI emission—turn detected routes into machine-readable specs.
- VS Code extension for one-click per-file docs.
## License
MIT – do what you want, give back improvements.
Built with patience, caffeine, and a relentless belief that clear knowledge should be automatic.
