# dotllm

v1.0.2

Universal `/init` for AI coding assistants.
dotllm analyzes your project and generates context files in the exact locations each AI tool expects. It detects your stack (Next.js, Tailwind, Prisma, etc.) and creates strict coding rules that stop hallucinations.
## 🌐 Supported AI Tools
| Tool | Config File | Convention |
|------|-------------|------------|
| Cursor | `.cursor/rules/rules.mdc` | New structured rules directory |
| Claude Code | `CLAUDE.md` | At project root |
| Antigravity (Gemini) | `GEMINI.md` | At project root |
| Codex (OpenAI) | `AGENTS.md` | Open standard, cascading per directory |
| GitHub Copilot | `.github/copilot-instructions.md` | In `.github` directory |
| Windsurf | `.windsurfrules` | At project root |
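For reference, the same mapping can be written as a small lookup. This sketch is illustrative only (it is not dotllm's internal code); the keys match the `--ide` values in Quick Start, and the paths are copied from the table above:

```typescript
// Illustrative: where each tool's config file lands, relative to the project
// root. Paths copied from the table above; not dotllm's internal data.
const CONFIG_PATHS: Record<string, string> = {
  cursor: ".cursor/rules/rules.mdc",
  "claude-code": "CLAUDE.md",
  antigravity: "GEMINI.md",
  codex: "AGENTS.md",
  "vscode-copilot": ".github/copilot-instructions.md",
  windsurf: ".windsurfrules",
};

console.log(CONFIG_PATHS["codex"]); // → AGENTS.md
```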
## ⚡️ Quick Start

```bash
# Universal init - generates ALL config files for all supported AI tools
npx dotllm

# Or generate for a specific tool
npx dotllm --ide cursor
npx dotllm --ide claude-code
npx dotllm --ide antigravity
npx dotllm --ide codex
npx dotllm --ide vscode-copilot
npx dotllm --ide windsurf

# Generate for multiple tools
npx dotllm --ide cursor,claude-code,codex
```

## ⚠️ The Problem
Every AI coding tool has its own convention for configuration files:
- Cursor wants `.cursor/rules/`
- Claude Code wants `CLAUDE.md`
- Antigravity wants `GEMINI.md`
- Codex uses the `AGENTS.md` standard
- And so on...
Without proper config, AI assistants:
- ❌ Suggest the `pages/` router when you use the `app/` router
- ❌ Use `yarn` when you use `pnpm`
- ❌ Write patterns that don't match your codebase
- ❌ Hallucinate libraries you don't have installed
## ✅ The Solution

dotllm is the universal `/init` command that:

1. **Scans your codebase**: detects 50+ technologies
2. **Generates rules**: creates strict, stack-aware coding guidelines
3. **Puts files in the right place**: each file goes exactly where the tool looks for it

No more guessing. No more manual setup. Just run `npx dotllm` and all your AI tools are configured.
## 📦 What It Detects

| Category | Supported Tech |
|----------|----------------|
| Languages | TypeScript, JavaScript, Python, Go, Rust, Java, Kotlin, Ruby, PHP, C#, Swift, Dart |
| Frameworks | React, Vue, Angular, Svelte, Next.js, Nuxt, SvelteKit, Remix, Astro, Express, FastAPI, Django, NestJS, Flask |
| Tooling | Vite, Webpack, Turbo, Nx, Docker, Kubernetes, Terraform |
| Testing | Jest, Vitest, Playwright, Cypress, Pytest |
| Databases | PostgreSQL, MySQL, MongoDB, Redis, Prisma, Drizzle, Supabase |
| Linting | ESLint, Prettier, Biome, Ruff |
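Detection of this kind typically comes down to matching known dependency names against manifest files such as `package.json`. The sketch below is illustrative only; the `KNOWN_DEPS` table and `detectTech` function are hypothetical, not dotllm's actual detector:

```typescript
// Hypothetical mapping from package.json dependency names to technology
// labels (a few entries for illustration; not dotllm's real detector).
const KNOWN_DEPS: Record<string, string> = {
  next: "Next.js",
  react: "React",
  "@prisma/client": "Prisma",
  vitest: "Vitest",
  tailwindcss: "Tailwind CSS",
};

// Given merged dependencies + devDependencies, return detected tech labels.
function detectTech(deps: Record<string, string>): string[] {
  return Object.keys(deps)
    .filter((name) => name in KNOWN_DEPS)
    .map((name) => KNOWN_DEPS[name]);
}

console.log(detectTech({ next: "^14.2.0", vitest: "^1.6.0", lodash: "^4.17.21" }));
// returns ["Next.js", "Vitest"]
```

Unknown dependencies (like `lodash` above) are simply ignored, so the detector degrades gracefully on unfamiliar stacks.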
## 🛠️ Advanced Usage

### Preview Detection

See what dotllm finds without writing any files:

```bash
npx dotllm --dry-run
```

### Force Update

Regenerate files and overwrite existing ones:

```bash
npx dotllm --force
```

### Verbose Mode

See detailed detection info:

```bash
npx dotllm --verbose
```

### Programmatic API

Use dotllm in your own scripts:

```typescript
import { analyzeCodebase, generateAllIDEOutputs } from 'dotllm';

const analysis = await analyzeCodebase('./my-project');
const outputs = generateAllIDEOutputs(analysis, ['cursor', 'claude-code', 'codex']);

for (const output of outputs) {
  console.log(`${output.filePath}: ${output.content}`);
}
```

## 🧠 Philosophy
1. **Put Files Where Tools Look.** Each AI tool has specific conventions. We follow them exactly. No arbitrary file names, no custom locations.
2. **LLM-First Design.** Files are optimized for AI consumption, with clear, prescriptive language to reduce hallucinations.
3. **Universal Init.** One command, all tools. Whether you use Cursor, Claude Code, Antigravity, or Codex, you're covered.
4. **Zero Cloud Dependencies.** Everything runs locally. No API calls, no data collection, no telemetry.
## Contributing

We love contributions! Please see `CONTRIBUTING.md` to get started.
## License

MIT © DotLLM
