sql-chatbot-agent
v1.1.3
AI-powered database chatbot that runs as a standalone CLI or Express middleware. Point it at any PostgreSQL database and get an AI assistant that can query your data, explain your code, and help users navigate your app.
Zero configuration required -- it auto-discovers your database schema, indexes your codebase, and serves a chat widget.
Quick Start (CLI)
The fastest way to get started is the standalone CLI. No existing Express app needed.
Install globally:

```bash
npm install -g sql-chatbot-agent
```

Initialize a config file:

```bash
sql-chatbot-agent init
```

This creates a chatbot.config.json in the current directory and adds it to .gitignore.

Edit the file with your database URL, then start the chatbot:

```bash
sql-chatbot-agent
```

The chatbot will be running at http://localhost:3456 with a chat widget ready to use.
By default, it uses OpenRouter (free models, requires a free API key from openrouter.ai/keys). See Providers for other options.
One-liner with npx

```bash
# OpenRouter (free models -- get key at https://openrouter.ai/keys)
npx sql-chatbot-agent --db postgresql://localhost/mydb --key sk-or-v1-xxx --code ./app

# With Groq (cloud, faster)
npx sql-chatbot-agent --db postgresql://localhost/mydb --provider groq --key gsk_xxx --code ./app

# Or use environment variables
export OPENROUTER_API_KEY=sk-or-v1-xxx
npx sql-chatbot-agent --db postgresql://localhost/mydb --code ./app
```

Configuration
chatbot.config.json
Created by sql-chatbot-agent init:
```json
{
  "databaseUrl": "postgresql://user:password@localhost:5432/your_database",
  "provider": "openrouter",
  "llmApiKey": "",
  "codePaths": ["./src"],
  "port": 3456,
  "secret": ""
}
```

| Field | Type | Description |
|-------|------|-------------|
| databaseUrl | string | PostgreSQL connection URL (required) |
| provider | string | LLM provider: openrouter, groq, ollama, or openai (default: auto-detect) |
| llmApiKey | string | API key for the LLM provider (not needed for Ollama) |
| llmModel | string | Model name override (default: provider-specific) |
| llmBaseUrl | string | API base URL override (default: provider-specific) |
| codePaths | string[] | Directories to scan for route detection, enum discovery, and business logic context (default: ["./src"]) |
| port | number | Port for the standalone server (default: 3456) |
| secret | string | Secret token for authentication (optional, recommended for production) |
Backward compatibility: groqApiKey is still accepted and mapped to llmApiKey automatically.
Code Indexing
The codePaths option (or --code CLI flag) controls which directories the chatbot scans for source code. This powers three features:
- Route detection — navigation and guidance answers (Express, Rails, Django, Next.js, etc.)
- Enum & constant discovery — model-level enums (Rails enum, Django choices, TypeORM decorators, etc.) are surfaced as context for accurate SQL generation
- Business logic context — validation rules, calculations, and domain logic help the LLM generate better queries
Framework-Specific Paths
| Framework | Recommended codePaths |
|-----------|------------------------|
| Express / React / Next.js / Hono | ["./src"] |
| Rails | ["./app", "./config"] |
| Django | ["./myapp"] (your app directories) |
| Laravel | ["./app", "./routes"] |
| Flask / FastAPI | ["./app"] or ["."] |
| Spring Boot (Java/Kotlin) | ["./src/main/java"] or ["./src/main/kotlin"] |
| Go (Gin / Echo / Fiber) | ["./cmd", "./internal"] |
| Phoenix / Elixir | ["./lib"] |
| SvelteKit / Nuxt | ["./src"] |
| ASP.NET | ["./Controllers"] |
| Rust (Actix / Axum) | ["./src"] |
| Sinatra | ["."] |
Tip: When in doubt, point to your project root. The indexer automatically skips node_modules, .git, dist, build, vendor, target, __pycache__, etc.
Supported File Types
.js, .ts, .jsx, .tsx, .rb, .py, .erb, .vue, .php, .java, .go, .cs, .ex, .exs, .svelte, .kt, .rs, .dart, .scala
Limits
- Maximum 2000 files indexed (configurable)
- Files are scanned on first request (lazy initialization)
- Use /chatbot/api/refresh to re-index after code changes
CLI Flags
| Flag | Short | Description |
|------|-------|-------------|
| --db | | PostgreSQL connection URL |
| --provider | | LLM provider: openrouter, groq, ollama, or openai |
| --key | | API key for the LLM provider |
| --model | | Model name override |
| --base-url | | API base URL override |
| --code | | Directory to index (repeatable: --code ./app --code ./config) |
| --port | -p | Port for the standalone server |
| --secret | | Secret token for authentication |
Environment Variables
| Variable | Description |
|----------|-------------|
| DATABASE_URL | PostgreSQL connection URL |
| LLM_PROVIDER | LLM provider: openrouter, groq, ollama, or openai |
| LLM_API_KEY | API key for the LLM provider |
| OPENROUTER_API_KEY | OpenRouter API key (auto-detects provider) |
| LLM_MODEL | Model name override |
| LLM_BASE_URL | API base URL override |
| GROQ_API_KEY | Groq API key (backward compat, same as LLM_API_KEY) |
| CHATBOT_SECRET | Secret token for authentication |
| PORT | Port for the standalone server |
Priority Order
Configuration is resolved in this order (highest priority first):
1. CLI flags (--db, --key, etc.)
2. Environment variables (DATABASE_URL, GROQ_API_KEY, etc.)
3. Config file (chatbot.config.json)
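That resolution order can be sketched as follows (resolveConfig is a hypothetical illustration of the documented precedence, not the package's API):

```javascript
// Highest priority first: CLI flag, then environment variable, then config file.
function resolveConfig(cliFlags, env, configFile) {
  return {
    databaseUrl: cliFlags.db ?? env.DATABASE_URL ?? configFile.databaseUrl,
    llmApiKey: cliFlags.key ?? env.LLM_API_KEY ?? env.GROQ_API_KEY ?? configFile.llmApiKey,
    port: cliFlags.port ?? (env.PORT && Number(env.PORT)) ?? configFile.port ?? 3456,
  };
}
```

For example, with --db on the command line and DATABASE_URL also set, the CLI flag wins; a key present only in chatbot.config.json is still picked up.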
Providers
sql-chatbot-agent works with any OpenAI-compatible LLM API. Four providers are preconfigured:
| Provider | Type | API Key | Default Model | Rate Limits |
|----------|------|---------|---------------|-------------|
| OpenRouter | Cloud | Required (openrouter.ai/keys) | openrouter/free | 50 req/day free, 1000/day with $10 credit |
| Groq | Cloud | Required (console.groq.com) | llama-3.3-70b-versatile | Free tier: 100K tokens/day |
| Ollama | Local | Not needed | llama3.1:8b | None (runs on your machine) |
| OpenAI | Cloud | Required (platform.openai.com) | gpt-4o-mini | Pay-per-use |
Auto-detection
If you don't specify --provider, the chatbot auto-detects:
- OPENROUTER_API_KEY set → uses openrouter
- Other API key provided → uses groq
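As a sketch of those rules (detectProvider is a hypothetical helper, and the fallback to openrouter when no key is set is an assumption based on OpenRouter being the default provider):

```javascript
function detectProvider(env, explicitProvider) {
  if (explicitProvider) return explicitProvider;          // --provider always wins
  if (env.OPENROUTER_API_KEY) return 'openrouter';        // OpenRouter key present
  if (env.LLM_API_KEY || env.GROQ_API_KEY) return 'groq'; // any other key -> groq
  return 'openrouter';                                    // assumed default
}
```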
OpenRouter (Default)
OpenRouter provides free access to 29+ models. Get a free API key at openrouter.ai/keys.
```bash
# Set your key via env var or --key flag
export OPENROUTER_API_KEY=sk-or-v1-xxx
npx sql-chatbot-agent --db postgresql://localhost/mydb --code ./src
```

Rate limits: 50 free model requests/day without credits. Add $10 credit (not consumed by free models) to unlock 1000/day. Resets daily at midnight UTC.

To use a specific free model from OpenRouter's free collection:

```bash
npx sql-chatbot-agent --db postgresql://localhost/mydb --key sk-or-v1-xxx --model google/gemma-3-1b-it:free
```

Groq (Cloud, Free Tier)

Faster than OpenRouter, but requires a free API key.

```bash
npx sql-chatbot-agent --db postgresql://localhost/mydb --provider groq --key gsk_xxx
```

Ollama (Local, Offline)

For fully offline use. Requires installing Ollama and pulling a model.

```bash
ollama pull llama3.1:8b && ollama serve
npx sql-chatbot-agent --db postgresql://localhost/mydb --provider ollama
```

OpenAI

```bash
npx sql-chatbot-agent --db postgresql://localhost/mydb --provider openai --key sk-xxx
```

Any OpenAI-Compatible API

Use --base-url and --model to connect to any OpenAI-compatible API:

```bash
npx sql-chatbot-agent --db postgresql://localhost/mydb \
  --base-url http://my-llm-server:8080/v1 \
  --key my-api-key \
  --model my-model-name
```

Authentication
For production use, set a secret token to protect the chatbot API.
Setup
Provide a secret via any configuration method:
```bash
# CLI flag
sql-chatbot-agent --secret my-secret-token

# Environment variable
CHATBOT_SECRET=my-secret-token sql-chatbot-agent
```

Or in chatbot.config.json:

```json
{ "secret": "my-secret-token" }
```

How It Works
When a secret is configured:
- Widget script (/chatbot/widget.js) -- sets a chatbot_token cookie on load
- API endpoints (/chatbot/api/ask, /chatbot/api/refresh) -- validate the cookie or a Bearer token in the Authorization header
- Health endpoint (/chatbot/api/health) -- remains open (no sensitive data)
Requests without a valid token receive a 401 Unauthorized response.
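Conceptually, the token check amounts to something like this sketch (isAuthorized is illustrative, not the package's exported code; real cookie parsing and token validation may differ):

```javascript
// Accept either an Authorization: Bearer header or the chatbot_token cookie.
function isAuthorized(req, secret) {
  const auth = req.headers['authorization'] || '';
  if (auth === `Bearer ${secret}`) return true;
  const cookies = req.headers['cookie'] || '';
  return cookies.split(';').some((c) => c.trim() === `chatbot_token=${secret}`);
}
```

The Bearer path is what you would use from a script or API client; the cookie path is what the widget relies on after loading /chatbot/widget.js.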
Without a Secret
If no secret is configured, the chatbot API is open to anyone. The CLI prints a warning on startup:
```
Warning: No secret configured. The chatbot API is open to anyone.
```

This is fine for local development but should not be used in production.
.gitignore
Add chatbot.config.json to your .gitignore to avoid committing secrets. The init command does this automatically.
Express Middleware (Advanced)
If you already have an Express app and want to embed the chatbot into it, use the middleware directly:
1. Install

```bash
npm install sql-chatbot-agent
```

2. Add to your Express app

```js
const express = require('express');
const { sqlChatbot } = require('sql-chatbot-agent');

const app = express();

app.use('/chatbot', sqlChatbot({
  databaseUrl: process.env.DATABASE_URL,
  provider: 'openrouter', // or 'groq', 'ollama', 'openai'
  llmApiKey: process.env.LLM_API_KEY, // not needed for ollama
  codePaths: ['./src'],
  secret: process.env.CHATBOT_SECRET,
}));

app.listen(3000);
```

3. Add the widget to your HTML

```html
<script src="/chatbot/widget.js"></script>
```

A chat bubble appears in the bottom-right corner of your page.
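The widget handles streaming for you. If you call POST /chatbot/api/ask directly instead, the response arrives as Server-Sent Events, one JSON payload per data: line. A minimal sketch of extracting those events from a received chunk (parseSseChunk is a hypothetical helper; the event shapes are documented under API Endpoints):

```javascript
// Parse the `data: {...}` lines of an SSE chunk into event objects.
function parseSseChunk(chunk) {
  return chunk
    .split('\n')
    .filter((line) => line.startsWith('data: '))
    .map((line) => JSON.parse(line.slice('data: '.length)));
}
```

Note that a real SSE client must also handle events split across network chunks; libraries such as the browser's EventSource (for GET endpoints) or a streaming fetch reader handle that buffering for you.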
How It Works
When a user asks a question, the chatbot:
1. Classifies the question (data query, code question, navigation, etc.)
2. Routes to the right handler:
   - Data questions -- generates SQL, executes it read-only, explains the results
   - Code questions -- searches your indexed codebase, explains the relevant code
   - Navigation -- uses detected routes and page context to guide the user
3. Streams the answer back via Server-Sent Events (SSE)
```
User: "How many active users signed up this month?"
-> Classifies as "data" question
-> Generates: SELECT COUNT(*) FROM users WHERE active = true AND created_at >= '2026-02-01'
-> Executes read-only
-> Streams: "There are 142 active users who signed up this month."
```

API Endpoints
When mounted at /chatbot (or running standalone), the following endpoints are available:
| Endpoint | Method | Description |
|----------|--------|-------------|
| /chatbot/widget.js | GET | Serves the chat widget bundle |
| /chatbot/api/ask | POST | Main chat endpoint (SSE streaming) |
| /chatbot/api/health | GET | Health check with table/file counts |
| /chatbot/api/refresh | POST | Re-discover schema and re-index code |
POST /api/ask
Request body:

```json
{
  "question": "How many users are there?",
  "pageContext": "{\"url\": \"...\", \"title\": \"...\"}",
  "history": [
    { "role": "user", "content": "previous question" },
    { "role": "assistant", "content": "previous answer" }
  ]
}
```

Response: SSE stream with events:

```
data: {"type":"classifying"}
data: {"type":"classified","questionType":"data","confidence":0.9}
data: {"type":"sql","sql":"SELECT COUNT(*) FROM users"}
data: {"type":"executing"}
data: {"type":"token","content":"There"}
data: {"type":"token","content":" are"}
data: {"type":"token","content":" 1,234"}
data: {"type":"token","content":" users."}
data: {"type":"done"}
```

GET /api/health

```json
{
  "status": "ok",
  "tables": 25,
  "codeFiles": 150
}
```

Features
- CLI standalone mode -- run as a standalone server with zero boilerplate
- Built-in authentication -- secret token authentication with cookie and Bearer token support
- Auto-discovers your database schema -- tables, columns, types, foreign keys, indexes
- Indexes your codebase -- scans JS/TS/Ruby/Python files, detects Express/React/Next.js/Rails routes
- SQL safety -- validates queries against blocklists, executes in READ ONLY transactions, blocks destructive operations
- Sensitive data filtering -- automatically hides columns matching patterns like password, secret, api_key, ssn
- Chat widget with Shadow DOM -- no CSS conflicts with your app
- Conversation history -- the widget sends message history for contextual follow-ups
- Multi-provider -- OpenRouter (free models), Ollama (local, free), Groq (cloud, free tier), OpenAI, or any OpenAI-compatible API
- Lazy initialization -- schema discovery and code indexing happen on first request, not at startup
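The sensitive-data filtering listed above can be pictured as a simple pattern filter over column names (the exact patterns and the helper name here are assumptions for illustration):

```javascript
// Columns matching any sensitive pattern are dropped from the schema
// summary before it is sent to the LLM.
const SENSITIVE = [/password/i, /secret/i, /api_key/i, /\bssn\b/i];

function visibleColumns(columns) {
  return columns.filter((col) => !SENSITIVE.some((re) => re.test(col)));
}
```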
Supported Frameworks
The code indexer detects routes and patterns from 17+ frameworks:
JavaScript/TypeScript: Express.js, Fastify, Hono, Koa, React Router, Next.js (pages + app router), NestJS, SvelteKit, Nuxt Ruby: Rails, Sinatra Python: Django, FastAPI, Flask Java/Kotlin: Spring Boot Go: Gin, Echo, Fiber C#: ASP.NET (minimal APIs + attribute routing) Elixir: Phoenix
Security
- All SQL queries run inside SET TRANSACTION READ ONLY -- no writes possible
- Dangerous SQL keywords (DROP, DELETE, TRUNCATE, ALTER, etc.) are blocked before execution
- System catalogs (pg_catalog, information_schema) are blocked from queries
- Sensitive columns are automatically filtered from the schema summary sent to the LLM
- Built-in secret token authentication for production deployments
- chatbot.config.json is automatically added to .gitignore by the init command
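Taken together, the pre-execution checks behave roughly like this sketch (the keyword list, the SELECT-only rule, and the function name are assumptions, not the package's actual validator; the READ ONLY transaction remains a separate, second line of defense even if a query slips past):

```javascript
// Reject non-SELECT statements, destructive keywords, and system catalogs.
const BLOCKED = /\b(DROP|DELETE|TRUNCATE|ALTER|INSERT|UPDATE|GRANT)\b|pg_catalog|information_schema/i;

function isQueryAllowed(sql) {
  return sql.trim().toUpperCase().startsWith('SELECT') && !BLOCKED.test(sql);
}
```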
Requirements
- Node.js >= 18
- PostgreSQL database
- An LLM provider: OpenRouter (free key), Groq (cloud, free tier), Ollama (local, free), OpenAI, or any OpenAI-compatible API
License
MIT
