express-genix v4.5.2
Production-grade CLI to generate Express apps with JWT, RBAC, GraphQL, TypeScript, Prisma, MongoDB, PostgreSQL, file uploads, email, background jobs, and more
Express-Genix
A production-grade CLI tool that generates Express.js applications with best-in-class defaults. Scaffold a complete REST API in seconds — with TypeScript, authentication, Prisma/Mongoose/Sequelize, Docker, CI/CD, and more.
Features
Languages & ORMs
- JavaScript or TypeScript
- MongoDB (Mongoose), PostgreSQL (Sequelize), PostgreSQL (Prisma), or no database
- Sequelize CLI migrations for PostgreSQL, Prisma migrations ready out of the box
Security & Auth
- JWT access + refresh tokens with token blacklist logout
- Role-Based Access Control (RBAC) — admin, moderator, user roles with permission system
- Admin panel routes for user management (list, view, update roles, delete)
- Password reset flow (forgot-password / reset-password with crypto tokens)
- Optional Redis-backed blacklist for production multi-instance deployments
- Zod request validation with pre-built schemas (register, login, reset, etc.)
- bcrypt password hashing, input sanitization (`validator`)
- Helmet, CORS, environment validation on startup
- Auto-generated cryptographically secure JWT secrets
- Soft deletes with restore capability
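To illustrate the RBAC idea, here is a minimal sketch of a role-guard middleware in the style a generated project might use. The name `authorize` and the `req.user` shape are assumptions for illustration, not the generator's exact code:

```javascript
// Hypothetical RBAC guard sketch. Assumes a JWT auth middleware has
// already populated `req.user` with a `role` field.
function authorize(...allowedRoles) {
  return (req, res, next) => {
    if (!req.user || !allowedRoles.includes(req.user.role)) {
      // Reject with 403 when the user's role is not in the allow-list.
      return res.status(403).json({ success: false, error: 'Forbidden' });
    }
    next();
  };
}

module.exports = { authorize };
```

A route would then chain it after authentication, e.g. `router.delete('/users/:id', authenticate, authorize('admin'), handler)` (names illustrative).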
API & Documentation
- Swagger UI + swagger-jsdoc annotation-based docs with example request/response bodies
- GraphQL (Apollo Server) with type definitions and resolvers
- Consistent `{ success, data, meta }` response envelope
- API versioning (`/api/v1/` prefix)
- Paginated list endpoints
- Request ID / correlation tracking
- Response caching middleware (Redis-backed, configurable TTL)
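The response envelope and pagination meta can be sketched as small helpers. These are illustrative (`ok` and `paginationMeta` are assumed names, not necessarily the generator's helpers):

```javascript
// Illustrative sketch of the { success, data, meta } response envelope.
function ok(data, meta = undefined) {
  // Omit `meta` entirely for non-paginated responses.
  return meta === undefined
    ? { success: true, data }
    : { success: true, data, meta };
}

// Pagination meta for paginated list endpoints.
function paginationMeta(page, limit, total) {
  return { page, limit, total, totalPages: Math.ceil(total / limit) };
}
```

A handler might respond with `res.json(ok(users, paginationMeta(1, 20, 95)))`.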
Services & Infrastructure
- Email service (Nodemailer) with welcome and password reset emails
- File uploads (Multer) with type filtering and size limits
- Background jobs (BullMQ) with Redis-backed queues and workers
- Prometheus metrics (`/metrics` endpoint with prom-client)
- Audit logging middleware with request tracking and sensitive field redaction
AI & Agent Integration
- MCP Server — exposes your API as tools for AI agents (Claude Desktop, Cursor, Copilot)
- LangChain / LangGraph AI service with chat, streaming (SSE), prompt chains, and ReAct agent
- Provider-agnostic: OpenAI, Anthropic, Google Gemini, Ollama (local)
- AI-powered CLI: `express-genix ai "describe your app"` generates a project from natural language
- Coda VS Code extension — in-editor AI chat for Express Genix projects (auto-prompted on AI/MCP projects)
Developer Experience
- Interactive prompts — pick language, database, features via checkbox
- Winston or Pino logger (you choose)
- ESLint (Airbnb) + Prettier pre-configured
- Jest + Supertest with coverage
- Rate limiting with configurable window/max
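The configurable window/max idea can be shown with a minimal fixed-window limiter. The generated project most likely uses a library for this; the sketch below only illustrates the algorithm behind `RATE_LIMIT_WINDOW_MS` / `RATE_LIMIT_MAX`:

```javascript
// Fixed-window rate limiter sketch (illustrative, not the generated code).
function createLimiter({ windowMs, max }) {
  const hits = new Map(); // key -> { count, windowStart }
  return function allow(key, now = Date.now()) {
    const entry = hits.get(key);
    if (!entry || now - entry.windowStart >= windowMs) {
      // First request in a fresh window: reset the counter.
      hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= max; // false once the window's budget is spent
  };
}
```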
Production Ready
- Docker & Docker Compose (multi-stage, non-root user)
- GitHub Actions CI/CD (matrix testing, DB services)
- Node.js clustering with graceful shutdown
- Rollback on generation failure
- Git init + initial commit
Post-Init
`express-genix add <feature>` — bolt on auth, websocket, docker, cicd, or prisma to an existing project
Quick Start
Install globally:

```bash
npm install -g express-genix
```

Or use with npx:

```bash
npx express-genix init
```

Create a project:

```bash
express-genix init
```

You'll be prompted for:
- Project name
- Language — JavaScript or TypeScript
- Database — MongoDB, PostgreSQL (Sequelize), PostgreSQL (Prisma), or None
- Features — Auth, Rate Limiting, Swagger, Redis, Docker, CI/CD, WebSocket, Request ID, Email, File Uploads, Soft Deletes, Audit Logging, Prometheus Metrics, API Versioning, Background Jobs, GraphQL, MCP Server, AI/LLM Service
- Logger — Winston or Pino
- AI Provider (if AI selected) — OpenAI, Anthropic, Google Gemini, or Ollama (local)
The CLI generates your project, installs dependencies, formats code, and creates an initial git commit.
Run it
```bash
cd my-express-app
npm run dev
```

- API: http://localhost:3000/api
- Swagger Docs: http://localhost:3000/api-docs
- Health Check: http://localhost:3000/health
Add features to an existing project
```bash
cd my-express-app
express-genix add docker     # Adds Dockerfile, docker-compose.yml, .dockerignore
express-genix add cicd       # Adds GitHub Actions CI workflow
express-genix add auth       # Adds JWT auth, controllers, routes, middleware
express-genix add websocket  # Adds Socket.io setup
express-genix add prisma     # Adds Prisma schema, client config, migrations
```

Generate from natural language (AI)
Requires OPENAI_API_KEY or ANTHROPIC_API_KEY in your environment.
```bash
express-genix ai "a task manager API with auth, real-time updates, and email notifications"
```

The CLI interprets your description, maps it to features, shows the config for confirmation, then generates the project.
Generated Project Structure
```
my-express-app/
├── src/
│   ├── config/        # Database, Swagger, WebSocket, queue configuration
│   ├── controllers/   # Route handlers (auth, user, admin, example)
│   ├── graphql/       # GraphQL type definitions & resolvers (if selected)
│   ├── jobs/          # BullMQ background workers (if selected)
│   ├── mcp/           # MCP server for AI agent tools (if selected)
│   ├── agents/        # LangGraph ReAct agent (if selected)
│   ├── middleware/    # Auth, RBAC, validation, error handling, uploads, metrics, audit log
│   ├── models/        # Database models (Mongoose/Sequelize/Prisma)
│   ├── routes/        # API route definitions with Swagger annotations
│   ├── services/      # Business logic layer (auth, user, email)
│   ├── utils/         # Logger, errors, response helpers, validators
│   ├── app.js|ts      # Express app setup
│   └── server.js|ts   # Server + clustering + WebSocket + workers
├── tests/             # Jest + Supertest suites
├── prisma/            # Prisma schema (if selected)
├── migrations/        # Sequelize migrations (if PostgreSQL + Sequelize)
├── seeders/           # Sequelize seeders (if PostgreSQL + Sequelize)
├── uploads/           # File upload directory (if selected)
├── mcp.json           # MCP client config for Claude Desktop (if selected)
├── .github/workflows/ # CI/CD pipeline (if selected)
├── .env               # Auto-generated environment config
├── .env.example       # Template for team sharing
├── Dockerfile         # Multi-stage Docker build (if selected)
├── docker-compose.yml # Full stack compose (if selected)
└── package.json
```

API Endpoints (with database + auth)
When API versioning is enabled, all `/api/*` routes become `/api/v1/*`.
Authentication
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | /api/auth/register | Register a new user |
| POST | /api/auth/login | Login (returns access + refresh tokens) |
| POST | /api/auth/refresh | Refresh access token |
| POST | /api/auth/logout | Logout (blacklists token) |
| POST | /api/auth/forgot-password | Request password reset email |
| POST | /api/auth/reset-password | Reset password with token |
Users (protected)
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | /api/users/profile | Get current user |
| PUT | /api/users/profile | Update profile |
| DELETE | /api/users/profile | Delete account (soft delete if enabled) |
Admin (RBAC — admin role required)
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | /api/admin/users | List all users (paginated) |
| GET | /api/admin/users/:id | Get user by ID |
| PUT | /api/admin/users/:id/role | Update user role |
| DELETE | /api/admin/users/:id | Delete user |
| POST | /api/admin/users/:id/restore | Restore soft-deleted user |
File Uploads
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | /api/uploads/single | Upload a single file |
| POST | /api/uploads/multiple | Upload multiple files (max 10) |
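The "type filtering and size limits" behavior can be sketched as a Multer-style acceptance check. The allowed-type list and helper name below are assumptions for illustration; the generated filter may differ:

```javascript
// Illustrative upload filter in the spirit of UPLOAD_MAX_SIZE.
// `file` mimics Multer's file object shape: { mimetype, size }.
const ALLOWED_TYPES = ['image/png', 'image/jpeg', 'application/pdf'];
const MAX_SIZE = Number(process.env.UPLOAD_MAX_SIZE || 5 * 1024 * 1024); // 5 MB default

function isAcceptable(file) {
  return ALLOWED_TYPES.includes(file.mimetype) && file.size <= MAX_SIZE;
}
```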
Background Jobs (BullMQ)
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | /api/jobs | Dispatch a background job |
| GET | /api/jobs/:id | Get job status |
Observability & Health
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | /health | Health check (uptime, DB, Redis, memory) |
| GET | /metrics | Prometheus metrics (prom-client) |
| GET | /graphql | GraphQL Playground (Apollo Server) |
AI / LLM (LangChain)
When you select the AI / LLM Service feature, you'll be asked to pick a provider:
| Provider | Models | API Key Required |
|----------|--------|------------------|
| OpenAI | GPT-4o, GPT-4o-mini | OPENAI_API_KEY |
| Anthropic | Claude Sonnet, Claude Opus | ANTHROPIC_API_KEY |
| Google Gemini | Gemini 2.0 Flash | GOOGLE_API_KEY |
| Ollama | Llama 3, Mistral, etc. | None (runs locally) |
The CLI generates your .env with the correct provider, model, and API key variable — just fill in your key and run npm run dev.
For Ollama, install it from ollama.com and pull a model:

```bash
ollama pull llama3
```

Endpoints:
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | /api/ai/chat | Chat with AI (complete response) |
| POST | /api/ai/stream | Stream AI response (SSE) |
| POST | /api/ai/chain | Run a prompt template chain |
| POST | /api/ai/agent | Run LangGraph ReAct agent with tools |
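A client consuming `/api/ai/stream` would parse SSE frames as they arrive. The exact event payload is an assumption here; the sketch only shows the standard SSE framing (`data:` lines separated by blank lines):

```javascript
// Illustrative SSE chunk parser; the /api/ai/stream payload format is assumed.
function parseSseChunk(chunk) {
  return chunk
    .split('\n')
    .filter((line) => line.startsWith('data: ')) // keep only data frames
    .map((line) => line.slice('data: '.length)); // strip the field name
}
```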
Available Scripts
```bash
npm run dev              # Development server with auto-reload
npm start                # Production server with clustering
npm test                 # Run tests with coverage
npm run lint             # Check ESLint
npm run lint:fix         # Auto-fix ESLint issues
npm run format           # Format with Prettier
npm run format:check     # Verify formatting
npm run build            # Compile TypeScript (TS projects only)
npm run db:migrate       # Run Sequelize migrations (PostgreSQL)
npm run db:migrate:undo  # Undo last migration
npm run prisma:migrate   # Run Prisma migrations
npm run prisma:studio    # Open Prisma Studio
```

CLI Options
```bash
express-genix init --skip-install  # Skip npm install (useful for CI)
express-genix init --skip-cleanup  # Skip auto-formatting (for debugging)
express-genix add <feature>        # Add feature to existing project
express-genix ai "<description>"   # Generate project from natural language
```

Environment Variables
Generated .env includes auto-generated JWT secrets:
| Variable | Description | Default |
|----------|-------------|---------|
| NODE_ENV | Environment | development |
| PORT | Server port | 3000 |
| CORS_ORIGIN | Allowed origins | * |
| JWT_SECRET | Access token secret (auto-generated) | — |
| JWT_REFRESH_SECRET | Refresh token secret (auto-generated) | — |
| MONGO_URI / DATABASE_URL | Database connection string | — |
| REDIS_URL | Redis URL (token blacklist, caching, BullMQ) | redis://localhost:6379 |
| RATE_LIMIT_WINDOW_MS | Rate limit window (ms) | 900000 |
| RATE_LIMIT_MAX | Max requests per window | 100 |
| LOG_LEVEL | Logging level | info |
| SMTP_HOST | SMTP server host (email service) | — |
| SMTP_PORT | SMTP server port | 587 |
| SMTP_USER | SMTP username | — |
| SMTP_PASS | SMTP password | — |
| EMAIL_FROM | Default sender address | — |
| UPLOAD_MAX_SIZE | Max file upload size in bytes | 5242880 (5 MB) |
| UPLOAD_DIR | Upload destination directory | uploads |
| MCP_SERVER_NAME | MCP server display name | project name |
| AI_PROVIDER | LLM provider (set by CLI) | openai |
| AI_MODEL | Model name (set by CLI based on provider) | varies |
| AI_TEMPERATURE | Sampling temperature | 0.7 |
| AI_MAX_TOKENS | Max response tokens | 2048 |
| OPENAI_API_KEY | OpenAI API key (if using OpenAI) | — |
| ANTHROPIC_API_KEY | Anthropic API key (if using Anthropic) | — |
| OLLAMA_BASE_URL | Ollama server URL (if using Ollama) | http://localhost:11434 |
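The "environment validation on startup" feature can be sketched as a fail-fast check against this table. The function name and the exact required set are assumptions; the generated validator may check more variables:

```javascript
// Illustrative startup env validation: throw before the server boots if
// required variables are missing. Variable names follow the table above.
function validateEnv(env, required = ['JWT_SECRET', 'JWT_REFRESH_SECRET']) {
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
}
```

Called as `validateEnv(process.env)` before the app starts listening.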
Docker
```bash
docker-compose up --build
```

The generated docker-compose.yml includes services for your app and its dependencies (MongoDB/PostgreSQL, Redis) with health checks, volumes, and a shared network. The Dockerfile uses multi-stage builds, runs as a non-root user, and only copies production dependencies.
Coda VS Code Extension
When you select AI or MCP features, the CLI prompts you to install the Coda VS Code extension — an in-editor AI assistant that talks to your running project's AI endpoints.
What it does
- Sidebar chat — a chat panel in the VS Code activity bar that streams responses from your project's `/ai/chat` and `/ai/stream` endpoints
- `@coda` in Copilot Chat — type `@coda` in the native Copilot Chat to interact with your project's AI agent
- Agent mode — toggle between Chat (streaming) and Agent (LangGraph tool-calling) modes
- Auto-detection — activates automatically when it detects an Express Genix project
- Code actions — copy or insert code blocks from AI responses directly into your editor
Install manually
```bash
# From VS Code Marketplace
code --install-extension express-genix.coda-ai
```

Or search "Coda AI" in the VS Code Extensions panel.
Configuration
In VS Code Settings (search "Coda"):
| Setting | Default | Description |
|---------|---------|-------------|
| coda.provider | openai | AI provider (openai, anthropic, gemini, ollama) |
| coda.model | (provider default) | Model override |
| coda.systemPrompt | Coding assistant prompt | Default system prompt |
Contributing
Contributions are welcome! Please read our Contributing Guide for details.
License
MIT © Joshua Maeba Nyamasege
Changelog
See CHANGELOG.md for version history.
