
echidna-ai-doc v1.0.3

Generate comprehensive documentation for codebases using a local LLM.

Downloads: 10

echidna-ai-doc

Instant, private, AI-authored documentation for every codebase.

Powered entirely on your hardware—no tokens, no latency, no data leakage.



Why this exists

Software changes faster than humans can document it. The gap between what runs and what is understood widens every sprint.
echidna-ai-doc closes that gap: a single command spins up a local language model, crawls your repository, and emits clean, pragmatic Markdown that anyone can read, whether for onboarding, audits, or open-source releases.

We ship a 726 MB quantised model (smollm2:360m) that fits comfortably on a laptop yet punches well above its weight in code understanding. Run offline, commit the docs, ship with confidence.


Key capabilities

| Feature | Dev mode (-d) | Prod mode (-p) |
|---------|-----------------|------------------|
| In-depth API & source commentary | ✔ | – |
| End-user / SDK documentation | – | ✔ |
| Custom prompt override (--prompt-file) | ✔ | ✔ |
| Single-file multi-model quality test (--file --compare) | ✔ | – |
| Generates one docs.md file for effortless publishing | ✔ | ✔ |


Installation

# 1. Local LLM runtime (Ollama)
curl -fsSL https://ollama.ai/install.sh | sh

# 2. Pull the default model (one-time, 726 MB)
ollama pull smollm2:360m

# 3. Add echidna-ai-doc to your toolchain
npm i -D echidna-ai-doc
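Before the first run, it can help to confirm the pieces are wired up. A minimal sanity check, using only the command names from the steps above:

```shell
# Report whether each required command is on PATH.
# Purely illustrative; it does not run the tools themselves.
for tool in ollama npx; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool found"
  else
    echo "missing: $tool"
  fi
done
```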

One-minute tour

Developer documentation

npx echidna-ai-doc -d --out docs-dev
open docs-dev/docs.md

Production / user docs

npx echidna-ai-doc -p --out docs-prod

Evaluate model quality on a single file

npx echidna-ai-doc \
  --file src/index.ts \
  --compare \
  --out compare-docs

compare-docs/ will contain three versions, one each from the 135 M, 360 M, and 4 B parameter models, so you can eyeball the trade-offs.
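One low-tech way to compare the variants is raw length. The helper below is a sketch and assumes the outputs land as Markdown files directly under the output directory; adjust the glob to match your actual layout:

```shell
# Print word counts for each generated variant, largest first.
# The compare-docs/ file layout is an assumption, not documented behavior.
word_counts() {
  for f in "$1"/*.md; do
    [ -e "$f" ] || continue
    printf '%s %s\n' "$(wc -w < "$f")" "$f"
  done | sort -rn
}

word_counts compare-docs
```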

Tailor the voice

cat > prompt.txt << 'TXT'
You are a senior developer advocate. Produce **concise**, task-oriented docs with code blocks.
TXT

npx echidna-ai-doc -p --prompt-file prompt.txt --out concise-docs

CLI reference

Usage: echidna-ai-doc [options]

Options
  -d, --dev                 Developer-oriented docs (default)
  -p, --prod                Product / user-facing docs
  --file <path>             Limit run to a single file
  --compare                 With --file, run all bundled models for side-by-side quality
  --prompt-file <path>      Custom prompt text/markdown
  -b, --batch <n>           Parallel files (default: 3)
  -o, --out <dir>           Output directory (default: docs)
  -t, --temperature <n>     LLM creativity (default: 0.1)
  -n, --maxTokens <n>       Generation cap (default: 512)
  --ollama-url <url>        Remote Ollama instance (default: localhost)
  -h, --help                Show help
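As a sketch of how the options above compose, this dry run prints a tuned invocation against a remote Ollama host. The host name and the numeric values are placeholders, not recommendations:

```shell
# Build up the argument list, then print the command instead of running it.
REMOTE_OLLAMA="${REMOTE_OLLAMA:-http://gpu-box:11434}"   # placeholder host
set -- -d \
  --batch 6 \
  --temperature 0.2 \
  --maxTokens 1024 \
  --ollama-url "$REMOTE_OLLAMA" \
  --out docs
echo "npx echidna-ai-doc $*"
```

Drop the echo to execute for real.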

Roadmap

  1. Embedding cache for true incremental runs.
  2. OpenAPI emission—turn detected routes into machine-readable specs.
  3. VS Code extension for one-click per-file docs.

License

MIT – do what you want, give back improvements.

Built with patience, caffeine, and a relentless belief that clear knowledge should be automatic.