postgres-mcp-readonly
v1.0.0
A secure, read-only PostgreSQL Model Context Protocol (MCP) server for safe database introspection and querying
PostgreSQL MCP Server
A secure, read-only PostgreSQL Model Context Protocol (MCP) server that provides safe database introspection and querying capabilities. Built with TypeScript for enhanced type safety and reliability.
Overview
This MCP server enables AI assistants and other MCP clients to safely interact with PostgreSQL databases through a read-only interface. It provides schema inspection, parameterized queries, table previews, change tracking, and row counting while preventing any data modifications.
Features
🔒 Security First
- Read-only enforcement - Blocks all write operations (INSERT, UPDATE, DELETE, etc.)
- SQL injection protection - Validates identifiers and sanitizes queries
- Automatic LIMIT enforcement - Prevents unbounded result sets
- Query timeouts - Prevents long-running queries from blocking resources
- Error sanitization - Prevents leakage of sensitive connection details
- Transaction isolation - All queries run in READ ONLY transactions
🛠️ Tools Provided
- db.schema - Inspect database structure
- db.query - Execute parameterized SELECT queries
- db.preview - Quick table preview
- db.watch - Poll for incremental changes
- db.count - Get exact row counts
📊 Resources
- schema-summary (pg://schema/summary) - Table list with approximate row counts
- schema-full (pg://schema/full) - Complete schema with columns, keys, and relationships
Installation
Prerequisites
- Node.js 18+
- PostgreSQL database (accessible via network)
Setup
1. Clone or download this project
cd pg-mcp-server
2. Install dependencies
npm install
3. Build the TypeScript code
npm run build
4. Configure environment variables - create a .env file:
DATABASE_URL=postgres://username:password@localhost:5432/database_name
STATEMENT_TIMEOUT_MS=5000
MAX_ROWS=500
5. Test the connection
npm start
Configuration
Environment Variables
| Variable | Required | Default | Description |
| ---------------------- | -------- | ------- | ----------------------------- |
| DATABASE_URL | ✓ | - | PostgreSQL connection string |
| STATEMENT_TIMEOUT_MS | ✗ | 5000 | Query timeout in milliseconds |
| MAX_ROWS | ✗ | 500 | Default maximum rows returned |
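As a sketch, these variables might be read with the defaults above applied (a hypothetical loadConfig helper; the package's actual configuration code may differ):

```typescript
// Hypothetical config loader illustrating the defaults in the table above.
function loadConfig(env: Record<string, string | undefined>) {
  if (!env.DATABASE_URL) {
    throw new Error("Missing DATABASE_URL"); // the only required variable
  }
  return {
    databaseUrl: env.DATABASE_URL,
    statementTimeoutMs: Number(env.STATEMENT_TIMEOUT_MS ?? 5000),
    maxRows: Number(env.MAX_ROWS ?? 500),
  };
}
```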
Connection String Format
postgres://username:password@host:5432/database_name
postgresql://username:password@host:5432/database_name
Tools Documentation
1. db.schema
Inspect database schema information.
Parameters:
- mode (optional): "summary" or "full" (default: "summary")
- filter (optional): Filter tables by name or schema (case-insensitive)
Examples:
// Get table list with row counts
{
"mode": "summary"
}
// Get full schema with columns and keys
{
"mode": "full"
}
// Filter specific tables
{
"mode": "full",
"filter": "users"
}
Response (summary):
{
"mode": "summary",
"tables": [
{
"schema": "public",
"table": "users",
"approxRows": 1250
}
]
}
Response (full):
{
"mode": "full",
"schemas": {
"public": {
"users": {
"columns": [
{
"name": "id",
"dataType": "integer",
"udtName": "int4",
"nullable": false,
"default": "nextval('users_id_seq'::regclass)",
"position": 1
}
],
"primaryKey": ["id"],
"foreignKeys": []
}
}
}
}
2. db.query
Execute a read-only SELECT query with optional parameters.
Parameters:
- sql (required): SELECT query (with or without LIMIT)
- params (optional): Array of parameter values for $1, $2, etc.
- maxRows (optional): Maximum rows to return (1-5000, default: 500)
Examples:
// Simple query
{
"sql": "SELECT * FROM users WHERE active = true"
}
// Parameterized query
{
"sql": "SELECT id, name, email FROM users WHERE country = $1 AND age > $2",
"params": ["USA", 25],
"maxRows": 100
}
// Query with existing LIMIT (will be honored if <= maxRows)
{
"sql": "SELECT * FROM orders ORDER BY created_at DESC LIMIT 10"
}
Response:
{
"rowCount": 10,
"fields": ["id", "name", "email"],
"rows": [{ "id": 1, "name": "John Doe", "email": "[email protected]" }]
}
Security Notes:
- Only SELECT and WITH (CTE) queries allowed
- Single statement only (no semicolons)
- Automatic LIMIT enforcement if not specified
- Query timeout: 5 seconds (default)
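The LIMIT behavior described above could be sketched roughly like this (a hypothetical enforceLimit helper, not the package's actual source):

```typescript
// Hypothetical sketch of automatic LIMIT enforcement: honor an existing
// LIMIT when it is within maxRows, otherwise clamp or append one.
function enforceLimit(sql: string, maxRows: number): string {
  const match = sql.match(/\blimit\s+(\d+)\b/i);
  if (match && Number(match[1]) <= maxRows) {
    return sql; // existing LIMIT is small enough - honor it
  }
  if (match) {
    return sql.replace(/\blimit\s+\d+\b/i, `LIMIT ${maxRows}`); // clamp it
  }
  return `${sql.trim()} LIMIT ${maxRows}`; // no LIMIT present - append one
}
```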
3. db.preview
Quick preview of table rows.
Parameters:
- table (required): Table name (use schema.table or just table)
- limit (optional): Number of rows (1-500, default: 50)
Examples:
// Preview public.users table
{
"table": "users",
"limit": 20
}
// Preview from specific schema
{
"table": "analytics.events"
}
Response:
{
"table": "public.users",
"rowCount": 20,
"rows": [{ "id": 1, "name": "Alice", "created_at": "2024-01-15T10:30:00Z" }]
}
4. db.watch
Poll for incremental changes using cursor-based pagination.
Parameters:
- table (required): Table name
- cursorColumn (optional): Column to track (default: "updated_at")
- lastCursor (optional): Last cursor value from previous call
- batchSize (optional): Rows per batch (1-1000, default: 200)
Examples:
// Initial fetch (gets oldest records first)
{
"table": "orders",
"cursorColumn": "created_at"
}
// Subsequent fetch (pass lastCursor from previous response)
{
"table": "orders",
"cursorColumn": "created_at",
"lastCursor": "2024-01-15T14:23:45.123Z",
"batchSize": 100
}
// Track by numeric ID
{
"table": "logs",
"cursorColumn": "id",
"lastCursor": 5042
}
Response:
{
"table": "public.orders",
"cursorColumn": "created_at",
"cursorType": "timestamp with time zone",
"lastCursor": "2024-01-15T15:30:00Z",
"rows": [...]
}
Use Cases:
- Real-time monitoring
- ETL/sync processes
- Audit log tracking
- Event streaming
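The cursor query that db.watch issues could look roughly like the following sketch (a hypothetical buildWatchQuery helper, assuming table and cursorColumn have already been validated as safe identifiers):

```typescript
// Hypothetical sketch of cursor-based pagination for db.watch.
function buildWatchQuery(
  table: string,
  cursorColumn: string,
  lastCursor: string | number | null,
  batchSize: number,
): { sql: string; params: unknown[] } {
  // Oldest rows first, so the caller can advance lastCursor monotonically
  const where = lastCursor === null ? "" : ` WHERE "${cursorColumn}" > $1`;
  const sql =
    `SELECT * FROM ${table}${where} ` +
    `ORDER BY "${cursorColumn}" ASC LIMIT ${batchSize}`;
  return { sql, params: lastCursor === null ? [] : [lastCursor] };
}
```

On the first call lastCursor is absent, so the query returns the oldest batch; each response's lastCursor is then fed back in to fetch only newer rows.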
5. db.count
Get exact row count for a table.
Parameters:
- table (required): Table name (use schema.table or just table)
Examples:
// Count rows in public.users
{
"table": "users"
}
// Count in specific schema
{
"table": "analytics.pageviews"
}
Response:
{
"table": "public.users",
"count": 15247
}
Usage Examples
With Claude Desktop (MCP Client)
Add to your claude_desktop_config.json:
{
"mcpServers": {
"postgres": {
"command": "node",
"args": ["d:/code/node/pg-mcp-server/dist/server.js"],
"env": {
"DATABASE_URL": "postgres://user:pass@localhost:5432/mydb"
}
}
}
}
Or, if installed globally via npm:
{
"mcpServers": {
"postgres": {
"command": "postgres-mcp-readonly",
"env": {
"DATABASE_URL": "postgres://user:pass@localhost:5432/mydb"
}
}
}
}
Example Conversation Flow
User: "Show me the database schema"
AI uses: db.schema with mode: "summary"
User: "How many users do we have?"
AI uses: db.count with table: "users"
User: "Show me the 10 most recent orders"
AI uses: db.query with SQL:
SELECT * FROM orders ORDER BY created_at DESC LIMIT 10
User: "Watch for new signups"
AI uses: db.watch with table: "users", cursorColumn: "created_at"
Security Features
Query Validation
The server performs multiple security checks:
Keyword Blocklist - Prevents: INSERT, UPDATE, DELETE, DROP, ALTER, CREATE, TRUNCATE, GRANT, REVOKE, VACUUM, ANALYZE, REINDEX, COPY, CALL, DO, EXECUTE
Comment Stripping - Removes SQL comments to prevent obfuscation
Single Statement - Only one query per request (no semicolons)
SELECT-only - Must start with SELECT or WITH
Identifier Validation - Table/column names must match [a-zA-Z_][a-zA-Z0-9_]*
Parameterization - Supports bind parameters ($1, $2, etc.) to prevent injection
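Taken together, the checks above amount to a pipeline along these lines (a hypothetical validateReadOnlyQuery sketch, not the package's actual source):

```typescript
// Hypothetical sketch of the validation pipeline: strip comments, enforce a
// single statement, require SELECT/WITH, then apply the keyword blocklist.
function validateReadOnlyQuery(sql: string): string | null {
  // 1. Strip SQL comments so keywords cannot be hidden inside them
  const stripped = sql
    .replace(/--.*$/gm, "")
    .replace(/\/\*[\s\S]*?\*\//g, "")
    .trim();

  // 2. Single statement only: reject any embedded semicolon
  if (stripped.replace(/;\s*$/, "").includes(";")) {
    return "Only a single statement is allowed";
  }

  // 3. Must start with SELECT or WITH
  if (!/^(select|with)\b/i.test(stripped)) {
    return "Only SELECT queries are allowed";
  }

  // 4. Keyword blocklist for write/DDL/maintenance commands
  const blocked =
    /\b(insert|update|delete|drop|alter|create|truncate|grant|revoke|vacuum|analyze|reindex|copy|call|do|execute)\b/i;
  const match = stripped.match(blocked);
  if (match) {
    return `Blocked keyword detected: ${match[0].toLowerCase()}`;
  }

  return null; // passes all checks
}
```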
Error Sanitization
Database errors are sanitized to prevent leaking:
- Connection strings and passwords
- Server hostnames
- File system paths
- Overly verbose stack traces
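A minimal sketch of such sanitization, assuming simple regex-based redaction (a hypothetical sanitizeErrorMessage helper, not the package's actual source):

```typescript
// Hypothetical sketch of error-message redaction before errors are returned.
function sanitizeErrorMessage(message: string): string {
  return message
    // Redact connection strings (postgres://user:pass@host/db)
    .replace(/postgres(ql)?:\/\/\S+/gi, "[connection string redacted]")
    // Redact password key/value pairs
    .replace(/password=\S+/gi, "password=[redacted]")
    // Redact absolute file system paths
    .replace(/(?:[A-Za-z]:)?[\\/](?:[\w.-]+[\\/])+[\w.-]+/g, "[path redacted]");
}
```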
Connection Safety
- Connection pooling with max 10 connections
- Statement timeout (5s default) prevents runaway queries
- Lock timeout (1s) prevents queries from waiting indefinitely on locks
- Idle transaction timeout (5s) frees stuck connections
- Graceful shutdown on SIGINT/SIGTERM
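The settings above might map onto node-postgres options and per-session SET commands roughly as follows (assumed values and shape; the package's actual pool setup may differ):

```typescript
// Hypothetical pool options mirroring the connection-safety settings above.
const poolConfig = {
  connectionString: process.env.DATABASE_URL,
  max: 10,                   // cap concurrent connections
  statement_timeout: 5_000,  // abort queries after 5 s
  idleTimeoutMillis: 30_000, // recycle idle clients (assumed value)
};

// Session settings that could be applied on each client checkout,
// e.g. via pool.on("connect", ...):
const sessionSetup = [
  "SET statement_timeout = 5000",
  "SET lock_timeout = 1000",
  "SET idle_in_transaction_session_timeout = 5000",
  "SET default_transaction_read_only = on",
];
```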
Best Practices
For AI Assistants
- Always check schema first - Use db.schema before querying unknown tables
- Use parameterization - Never concatenate user input into SQL strings
- Start with small limits - Use low maxRows for exploratory queries
- Use db.count for totals - Don't run SELECT COUNT(*) manually
- Handle errors gracefully - Sanitized errors are safe to show users
For Database Admins
- Use read-only database user - Grant only SELECT permissions
- Monitor connection usage - Set appropriate pool size
- Adjust timeouts - Based on your query complexity
- Enable query logging - In PostgreSQL for audit trail
- Use SSL connections - Add ?sslmode=require to the DATABASE_URL
Performance Tips
- Ensure indexed columns - Especially for db.watch cursor columns
- Use filters in db.schema - Don't fetch the full schema repeatedly
- Keep maxRows reasonable - Large result sets slow serialization
- Add indexes on sort columns - For ORDER BY performance
Troubleshooting
Connection Issues
Problem: Missing DATABASE_URL error
Solution: Create .env file with valid connection string
Problem: ECONNREFUSED or connection timeout
Solution:
- Verify PostgreSQL is running
- Check host/port in DATABASE_URL
- Ensure firewall allows connections
- Test with the psql command line first
Problem: password authentication failed
Solution: Verify username/password in DATABASE_URL
Query Errors
Problem: Blocked keyword detected: insert
Solution: This is intentional - only SELECT queries are allowed
Problem: Only SELECT queries are allowed
Solution: Ensure query starts with SELECT or WITH, not EXPLAIN, SHOW, etc.
Problem: statement timeout
Solution:
- Increase STATEMENT_TIMEOUT_MS
- Optimize query with indexes
- Reduce dataset with WHERE clause
Schema Issues
Problem: relation "table_name" does not exist
Solution:
- Check table name spelling
- Use schema.table if the table is not in the public schema
- Run db.schema to see available tables
Development
Running Locally
# Set environment variables
export DATABASE_URL="postgres://localhost:5432/testdb"
# Build and run server
npm run build
npm start
# Or use dev mode (builds and runs)
npm run dev
Testing with MCP Inspector
# Build first
npm run build
# Then inspect
npx @modelcontextprotocol/inspector node dist/server.js
License
MIT
Contributing
Contributions welcome! Please ensure:
- Security best practices maintained
- All tools remain read-only
- Tests pass (if added)
- Documentation updated
Support
For issues or questions:
- Check this README first
- Review PostgreSQL connection docs
- Test with psql to isolate database issues
- Open an issue with sanitized error messages
Remember: This server is read-only by design. For database modifications, use traditional database tools or separate admin interfaces.
