# backmap

**v0.4.0**
Open-source cognitive infrastructure for AI-augmented development. backmap scans your backend source code using static analysis and builds a living, queryable map of your entire API surface — no annotations or OpenAPI specs required.
As AI coding assistants write more code, developers lose understanding of their own systems. backmap keeps you in control with real-time API maps, relationship graphs, usage tracking, and instant AI context via MCP.
→ backmap.dev — docs, dashboard, and community
## Install

```shell
npm install -g backmap
```

Or use directly with npx:

```shell
npx backmap scan ./my-backend
```

Requires Node.js 18+.
## Quick Start

```shell
# Scan a project (auto-detects framework)
backmap scan ./backend

# Export to OpenAPI + Markdown
backmap scan ./backend --format openapi,markdown -o docs/

# Save scan and browse locally
backmap scan ./backend --save
backmap serve

# Give your AI assistant instant API context
backmap mcp
```

`backmap scan` now writes only to the explicit output destination (`--output`, or `docs/` by default). No implicit duplicate artifacts are written unless you pass `--artifact-dir`.
## Features

backmap is fully open-source and free. Every feature is available to everyone:

- Scan all 13 frameworks
- Local explorer (`backmap serve`)
- Export (OpenAPI, Markdown, Postman)
- Diff & usage scanning
- MCP server for AI assistants
- Deep endpoint analysis
- Schema graph
- Scan history
## Commands

| Command | Description |
|---|---|
| `backmap scan <dir>` | Scan a project and build the API map |
| `backmap serve` | Start the local system map explorer UI |
| `backmap export` | Re-export from a saved scan |
| `backmap diff <a> <b>` | Compare two scan results |
| `backmap learn <dir>` | Discover custom HTTP call patterns via LLM |
| `backmap mcp` | Start the MCP server for AI coding assistants |
| `backmap analyze <endpoint>` | Deep LLM-powered analysis of a single endpoint |
| `backmap truth` | Compare scanner results against AI ground truth |
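To illustrate the kind of comparison `backmap diff` performs, the sketch below computes added and removed endpoints as a set difference. The JSON shape (an `endpoints` array of `method` + `path` objects) is an assumption for illustration, not backmap's actual scan format:

```python
# Hypothetical sketch: compare two scan results by (method, path) identity.
def diff_scans(old: dict, new: dict) -> dict:
    key = lambda e: (e["method"], e["path"])
    old_keys = {key(e) for e in old["endpoints"]}
    new_keys = {key(e) for e in new["endpoints"]}
    return {
        "added": sorted(new_keys - old_keys),      # present only in the new scan
        "removed": sorted(old_keys - new_keys),    # present only in the old scan
    }

old = {"endpoints": [{"method": "GET", "path": "/api/users"}]}
new = {"endpoints": [{"method": "GET", "path": "/api/users"},
                     {"method": "POST", "path": "/api/orders"}]}
print(diff_scans(old, new))
```

A real diff would also track changed endpoints (e.g. parameter or schema changes), not just additions and removals.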
## Supported Frameworks

backmap ships 13 framework parsers with automatic detection:

| Framework | Language | Detection |
|---|---|---|
| Express | TypeScript / JavaScript | `package.json` → `express` |
| NestJS | TypeScript | `package.json` → `@nestjs/common` |
| Next.js | TypeScript / JavaScript | `package.json` → `next` |
| Nitro / H3 | TypeScript | `package.json` → `nitro` or `h3` |
| SvelteKit | TypeScript / JavaScript | `package.json` → `@sveltejs/kit` — parses `+server.ts` REST endpoints and `+page.server.ts` form actions |
| Spring Boot | Kotlin / Java | `build.gradle` / `pom.xml` → `spring-boot` |
| FastAPI | Python | `requirements.txt` / `pyproject.toml` → `fastapi` |
| Django REST | Python | dependency files → `djangorestframework` |
| Gin | Go | `go.mod` → `gin-gonic/gin` |
| Rails | Ruby | `Gemfile` → `rails` |
| Laravel | PHP | `composer.json` → `laravel/framework` |
| ASP.NET Core | C# | `.csproj` → `Microsoft.AspNetCore` |
| Generic | Any | Fallback regex-based route detection |
## Key Flags

### backmap scan

| Flag | Description |
|---|---|
| `--framework <id>` | Force a specific framework parser |
| `--format <fmt>` | Output format: `openapi`, `markdown`, `postman` (comma-separated) |
| `-o, --output <path>` | Output file or directory |
| `--artifact-dir <path>` | Optional local artifact copy (`scan-result.json` + exports) |
| `--incremental` | Only re-scan changed files |
| `--save` | Save the scan result to `.backmap/scans/` |
| `--enrich` | Enrich with LLM-generated descriptions |
| `--enrich-provider` | LLM provider: `anthropic` or `mistral` |
| `--usage-scan <dir>` | Scan frontend code for endpoint usage |
| `--progress` | Emit JSON progress events to stderr |
### backmap serve

| Flag | Description |
|---|---|
| `--port <number>` | Server port (default: 4400) |
| `--dir <path>` | Project directory to serve |
### backmap mcp

| Flag | Description |
|---|---|
| `--dir <path>` | Project directory |
| `--transport <type>` | `stdio` (default) or `sse` |
| `--port <number>` | Port for SSE transport |
### backmap analyze

| Flag | Description |
|---|---|
| `--dir <path>` | Project directory |
| `--refresh` | Force re-analysis (ignore cache) |
## MCP Server
The MCP server gives AI coding assistants (Claude Code, Cursor, Windsurf, etc.) instant, structured access to your API surface. Instead of re-reading every controller file, your assistant gets accurate endpoint data in milliseconds.
### Setup

```jsonc
// Claude Code: ~/.claude.json or project .mcp.json
{
  "mcpServers": {
    "backmap": {
      "command": "npx",
      "args": ["backmap", "mcp", "--dir", "/path/to/project"]
    }
  }
}
```

### Available Tools
| Tool | Description |
|---|---|
| `list_endpoints` | List all endpoints with optional method/tag/auth/usage filters |
| `get_endpoint` | Full endpoint detail: params, schemas, examples, usages |
| `search_endpoints` | Full-text search across paths, summaries, descriptions |
| `list_schemas` | List all DTOs/schemas |
| `get_schema` | Full schema with properties and JSON Schema |
| `get_endpoint_usages` | Frontend references for an endpoint |
| `get_dead_endpoints` | Endpoints with zero frontend usages |
| `get_scan_diff` | Changes between scans |
| `get_project_summary` | Project stats overview |
| `get_unmatched_calls` | Frontend calls that couldn't be matched to any endpoint |
| `get_schema_graph` | Schema dependency graph with blast radius |
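As an illustration, an assistant might invoke `list_endpoints` with filters for method, tag, auth, and usage. The argument names below are assumptions based on the filter list above, not backmap's documented schema:

```json
{
  "tool": "list_endpoints",
  "arguments": {
    "method": "GET",
    "tag": "users",
    "auth": true,
    "usage": "unused"
  }
}
```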
## Deep Analysis

Perform LLM-powered analysis of individual endpoints. The first time you run this command, you'll be prompted to select a provider and enter your API key (saved locally).

```shell
# First time - you'll be prompted for provider and API key
backmap analyze "GET /api/users"

# Subsequent runs use your saved configuration
backmap analyze "POST /api/orders" --refresh
```

Results are cached in `.backmap/analyses/` and keyed by source file content hash — re-analysis is only triggered when source code changes or `--refresh` is passed.
## Frontend Usage Scanning

Find where each backend endpoint is called in your frontend code:

```shell
backmap scan ./backend --usage-scan ./frontend
```

Detects calls via `fetch()`, axios, `useFetch` / `$fetch` (Nuxt), React Query, and generic URL pattern matching. Each detected usage includes source location, call expression, HTTP client, and a confidence score.

Endpoints with zero usages are flagged as potentially dead — queryable via `get_dead_endpoints` in MCP.
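A heavily simplified version of this kind of pattern matching might look like the regex-based `fetch()` detector below. This is illustrative only; backmap's actual detection covers more clients and includes source locations and confidence scoring:

```python
import re

# Match fetch("...") / fetch('...') calls and capture the URL literal.
FETCH_RE = re.compile(r"""fetch\(\s*["']([^"']+)["']""")

def find_fetch_calls(source: str) -> list[str]:
    return FETCH_RE.findall(source)

code = """
const users = await fetch("/api/users");
await fetch('/api/orders', { method: "POST" });
"""
print(find_fetch_calls(code))  # ['/api/users', '/api/orders']
```

Dynamic URLs (template literals, variables) are why a confidence score matters: a literal string match is high-confidence, while pattern-matched dynamic calls are not.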
## LLM Enrichment

Add AI-generated descriptions and examples to your endpoints. The first time you run this command, you'll be prompted to select a provider and enter your API key:

```shell
# First time - you'll be prompted
backmap scan ./backend --enrich --enrich-provider anthropic

# Or specify the provider upfront (still prompts for the key if not set)
backmap scan ./backend --enrich --enrich-provider mistral
```

Enrichment supports the `anthropic` and `mistral` providers. Enriched fields are explicitly tagged so you can distinguish parsed (deterministic) data from AI-generated content.

Your API key is saved locally in `.env` and never sent to backmap servers - it's only used to call the LLM provider directly.
## API Key Management

When you run commands that require LLM providers (`learn`, `analyze`, `scan --enrich`), backmap automatically prompts you to set up your API key if it's not already configured.

Example first-time flow:

```
$ backmap learn ./frontend

┌──────────────────────────────────────────────────┐
│  🤖 LLM Provider Setup                           │
│                                                  │
│  This feature requires an LLM provider for       │
│  AI-powered analysis.                            │
│  Your API key will be saved locally in .env and  │
│  never sent to backmap servers.                  │
│  It's only used to call the LLM provider         │
│  directly.                                       │
└──────────────────────────────────────────────────┘

? Select LLM provider:
❯ Anthropic (Claude)
  Mistral AI
  OpenAI (GPT)
  Ollama (Local - No API Key)

? Enter your Anthropic (Claude) API key: **********************

✓ API key saved to .env (local, not committed to git)
✓ Provider set to Anthropic (Claude)
```

Your configuration is remembered: subsequent runs use your saved provider and key.

**Manual setup:** you can also add the key to `.env` yourself:

```shell
echo "BACKMAP_LLM_API_KEY=your-api-key-here" >> .env
```

See LLM_SETUP_GUIDE.md for detailed setup instructions, CI/CD integration, and troubleshooting.
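For CI/CD (covered in LLM_SETUP_GUIDE.md), the same `BACKMAP_LLM_API_KEY` variable can be injected from a secret store instead of `.env`. A hypothetical GitHub Actions step, for illustration:

```yaml
# Hypothetical CI step: supply the key via a repository secret
- name: Enrich API docs
  run: npx backmap scan ./backend --enrich --enrich-provider anthropic
  env:
    BACKMAP_LLM_API_KEY: ${{ secrets.BACKMAP_LLM_API_KEY }}
```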
## Config File

Create `.backmap.yaml` in your project root:

```yaml
project:
  framework: express # auto-detected if omitted

scan:
  include:
    - src/**
  exclude:
    - src/test/**
  incremental: true

output:
  dir: docs/api
  formats:
    - openapi
    - markdown
  save: true

enrichment:
  enabled: false
  provider: anthropic

usage:
  enabled: false
  frontendDir: ../frontend
  exclude:
    - node_modules/**

mcp:
  transport: stdio
```

`scan.include` and `scan.exclude` are applied during scanning and filter which discovered backend files are parsed.
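That include/exclude filtering behaves like ordinary glob matching: a file is parsed if it matches an include pattern and no exclude pattern. A sketch with Python's `fnmatch` (illustrative; backmap's actual glob semantics may differ):

```python
from fnmatch import fnmatch

def is_scanned(path: str, include: list[str], exclude: list[str]) -> bool:
    # A file is parsed if any include glob matches and no exclude glob does.
    return (any(fnmatch(path, p) for p in include)
            and not any(fnmatch(path, p) for p in exclude))

include = ["src/**"]
exclude = ["src/test/**"]
print(is_scanned("src/routes/users.ts", include, exclude))      # True
print(is_scanned("src/test/users.spec.ts", include, exclude))   # False
```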
## How It Works
- Static parsing — Tree-sitter AST analysis extracts routes, params, types, and auth decorators
- LLM enrichment (opt-in) — Generates descriptions and examples from parsed metadata
- Usage scanning (opt-in) — Finds where each endpoint is called in frontend code
- Export — Converts to OpenAPI 3.1, Markdown, or Postman Collection
- Serve — Local system map explorer with schema graphs and scan diffs
- MCP — Exposes your API map as tools for AI coding assistants
Source code never leaves your machine. The CLI is fully offline by default.
