@mg21st/dev-assist
v1.0.4
# DevAssist 🚀

All-in-one developer toolkit: Auto Test Generator, API Docs Generator & API Testing UI

## Features
- 🧪 Auto Test Generator - AST-based test stub generation or LLM-powered real test generation for JS/TS
- 🤖 LLM Support - Generate production-quality tests with OpenAI, Gemini, Groq, OpenRouter, Anthropic, or local Ollama
- 📚 API Docs Generator - Extract Express routes and generate OpenAPI specs + Markdown
- 🌐 Local UI Server - Beautiful React UI with dashboard, tests viewer, API docs browser
- ⚡ API Testing Tab - Lightweight Postman-like HTTP client built-in
- 🧙 Interactive CLI Wizard - Guided setup experience
## Installation

```bash
npm install -g @mg21st/dev-assist
```

## Usage

### Interactive wizard

```bash
dev-assist
```

### Generate tests

```bash
dev-assist generate --tests --source ./src --output ./__tests__ --framework jest
```

### Generate API docs

```bash
dev-assist generate --docs --source ./src --output ./docs
```

### Generate both

```bash
dev-assist generate --source ./src
```

### Start UI server

```bash
dev-assist serve --port 3000
```

## Configuration
Create dev-assist.config.js in your project root:
```js
module.exports = {
  testFramework: 'jest', // 'jest' | 'vitest'
  sourceDir: './src',
  testOutputDir: './__tests__',
  docsOutputDir: './docs',
  port: 3000,
  watchMode: false,
  baseUrl: 'http://localhost:3000',
  // Optional: configure an LLM to generate real, production-quality tests
  // llm: { provider: 'openai', model: 'gpt-4o', apiKey: process.env.OPENAI_API_KEY },
};
```

## LLM-Powered Test Generation
When an `llm` configuration is provided, DevAssist reads each source file in full and sends it to the LLM with a prompt that requests comprehensive test coverage: happy paths, edge cases, error handling, and boundary conditions. The result is a real test file, not just stubs.

If the LLM call fails for any file, DevAssist automatically falls back to AST-based stub generation for that file.
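The per-file fallback described above can be pictured with a small sketch. This is illustrative only, not DevAssist's actual source: `generateWithLlm` and `generateStubs` are hypothetical stand-ins for the two generation paths.

```javascript
// Hypothetical stand-ins for illustration; not DevAssist's real internals.
async function generateWithLlm(sourceFile, llm) {
  // Simulate a failure mode: hosted providers need an API key.
  if (!llm.apiKey && llm.provider !== 'ollama') {
    throw new Error(`missing API key for ${llm.provider}`);
  }
  return `// ${llm.provider}-generated tests for ${sourceFile}`;
}

function generateStubs(sourceFile) {
  return `// AST-based test stubs for ${sourceFile}`;
}

// Each file is tried against the LLM first; any failure for that file
// falls back to AST-based stub generation, so a run never hard-fails.
async function generateTestFile(sourceFile, llm) {
  try {
    return { mode: 'llm', code: await generateWithLlm(sourceFile, llm) };
  } catch {
    return { mode: 'stub', code: generateStubs(sourceFile) };
  }
}
```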
### Supported LLM Providers

| Provider | `provider` value | Requires API key |
|----------|------------------|------------------|
| Ollama (local) | `ollama` | No |
| OpenAI | `openai` | Yes |
| Google Gemini | `gemini` | Yes |
| Groq | `groq` | Yes |
| OpenRouter | `openrouter` | Yes |
| Anthropic Claude | `anthropic` | Yes |
### LLM Configuration Examples

```js
// Ollama (local, no key required)
llm: { provider: 'ollama', model: 'llama3' }

// OpenAI
llm: { provider: 'openai', model: 'gpt-4o', apiKey: process.env.OPENAI_API_KEY }

// Google Gemini
llm: { provider: 'gemini', model: 'gemini-1.5-flash', apiKey: process.env.GEMINI_API_KEY }

// Groq
llm: { provider: 'groq', model: 'llama-3.3-70b-versatile', apiKey: process.env.GROQ_API_KEY }

// OpenRouter
llm: { provider: 'openrouter', model: 'openai/gpt-4o', apiKey: process.env.OPENROUTER_API_KEY }

// Anthropic
llm: { provider: 'anthropic', model: 'claude-3-5-sonnet-20241022', apiKey: process.env.ANTHROPIC_API_KEY }
```

You can also set a custom `baseUrl` for any provider (useful for self-hosted models or proxies):

```js
llm: { provider: 'ollama', model: 'mistral', baseUrl: 'http://my-server:11434' }
```

### Via the Interactive Wizard
When running `dev-assist` interactively and choosing test generation, the wizard asks whether you want to use an LLM and guides you through provider and model selection.
### Via the UI Server API

When using `dev-assist serve`, you can pass an `llm` object in the body of `POST /api/generate/tests`:
```json
{
  "sourceDir": "./src",
  "outputDir": "./__tests__",
  "framework": "jest",
  "llm": {
    "provider": "openai",
    "model": "gpt-4o",
    "apiKey": "sk-..."
  }
}
```

## Development Setup
### Prerequisites
- Node.js >= 16
- npm >= 7
### Build

```bash
# Install root dependencies
npm install

# Install UI dependencies
cd ui && npm install && cd ..

# Build everything
npm run build
```

### Run in development

```bash
npm run dev
```

## Architecture
```
dev-assist/
  bin/          # CLI entry point
  src/
    cli/        # Commander + Inquirer CLI
    generators/ # Test & docs generators
    llm/        # LLM provider abstraction (OpenAI, Gemini, Groq, OpenRouter, Anthropic, Ollama)
    parser/     # @babel/parser AST parser
    server/     # Express backend API
    shared/     # TypeScript types
  ui/           # React + Vite + Tailwind frontend
```

## API Endpoints (Server)
| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/config` | Current configuration (LLM API key is redacted) |
| GET | `/api/summary` | Project statistics |
| GET | `/api/files` | Parsed file list |
| GET | `/api/tests` | Generated test files |
| GET | `/api/docs` | API documentation |
| POST | `/api/generate/tests` | Trigger test generation (accepts optional `llm` body field) |
| POST | `/api/generate/docs` | Trigger docs generation |
| POST | `/api/proxy` | Proxy HTTP requests (API testing) |
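As a client-side sketch (not part of the package), the generation endpoint above can be driven from any HTTP client. The snippet builds a request for `POST /api/generate/tests` using the defaults shown earlier; the base URL is an assumption (the default `serve` port), and the response shape is not specified here, so it is returned as-is. Global `fetch` requires Node 18+.

```javascript
// Illustrative client for a running `dev-assist serve` instance.
// The body mirrors the documented POST /api/generate/tests payload.
function buildTestGenRequest({ llm } = {}) {
  const body = {
    sourceDir: './src',
    outputDir: './__tests__',
    framework: 'jest',
  };
  if (llm) body.llm = llm; // optional: { provider, model, apiKey }
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  };
}

async function triggerTestGeneration(baseUrl = 'http://localhost:3000') {
  const res = await fetch(`${baseUrl}/api/generate/tests`, buildTestGenRequest());
  if (!res.ok) throw new Error(`test generation failed: ${res.status}`);
  return res.json(); // response shape is server-defined
}
```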
## License
MIT
