postgresql-mcp
v1.0.1
MCP server for PostgreSQL database interactions
PostgreSQL Model Context Protocol (MCP) Server
A comprehensive Model Context Protocol server that enables AI assistants to interact with PostgreSQL databases through a secure, feature-rich interface.
Repository: https://github.com/bleeding-gums/postgres-mcp
Features
Core Database Operations
- Connection Management: Connect to and manage multiple PostgreSQL databases simultaneously
- Query Execution: Execute SQL queries with parameter binding and result processing
- Transaction Support: Full ACID transaction management with commit/rollback capabilities
- Prepared Statements: Create, execute, and manage prepared statements for better performance
Advanced Query Features
- Query Timeout: Configurable timeouts to prevent long-running queries
- Query Validation: Built-in validation to prevent dangerous operations (DROP, DELETE without WHERE, etc.)
- Pagination Support: Efficient handling of large result sets with configurable page sizes
- Async Query Jobs: Execute long-running queries asynchronously with job tracking
Data Management
- Bulk Operations: Bulk insert, update, and delete operations for efficient data manipulation
- JSON Data Support: Specialized operations for JSON columns including search, update, and extraction
- Schema Synchronization: Bidirectional data sync between databases with conflict resolution
- Data Comparison: Compare data between sources with detailed difference reporting
Connection Features
- Health Monitoring: Real-time connection health checks and status reporting
- Connection Statistics: Track query counts, execution times, and connection metadata
- Connection Pooling: Efficient connection reuse and resource management
- SSL/TLS Support: Secure database connections with configurable SSL options
Data Import/Export
- Multiple Formats: Support for CSV, JSON, and other common data formats
- Schema-aware Import: Automatic type mapping and validation during data import
- Export Flexibility: Export data with custom queries and formatting options
Installation
```shell
npm install postgresql-mcp
```

Installing in AI Applications
Follow these steps to install and configure the PostgreSQL MCP server in various AI applications:
VS Code
- Install the MCP Extension or enable experimental MCP support
- Add the server configuration to your VS Code settings (`settings.json`):

```json
{
  "mcp.servers": {
    "postgres-mcp": {
      "command": "npx",
      "args": ["postgresql-mcp"]
    }
  }
}
```

- Restart VS Code to activate the server
- The server will appear in the MCP connections panel
Cursor
- Open Cursor settings (`Ctrl+,` or `Cmd+,`)
- Navigate to Extensions or MCP Settings
- Add the server configuration:

```json
{
  "mcp": {
    "servers": {
      "postgres-mcp": {
        "command": "npx",
        "args": ["postgresql-mcp"]
      }
    }
  }
}
```

- Reload the window to initialize the server
Claude Desktop
- Open Claude Desktop preferences
- Go to "Model Context Protocol" or "Integrations" section
- Click "Add Server" and enter:
```json
{
  "name": "PostgreSQL MCP",
  "command": "npx",
  "args": ["postgresql-mcp"]
}
```

- Save and restart Claude Desktop
ChatGPT (with MCP support)
- If using ChatGPT with MCP capabilities, add to your configuration:
```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["postgresql-mcp"]
    }
  }
}
```

- The server will be available when chatting with the model
Google Gemini (with MCP support)
- Configure MCP in your Gemini application settings:
```json
{
  "integrations": {
    "mcp": {
      "servers": {
        "postgres-mcp": {
          "command": "npx",
          "args": ["postgresql-mcp"]
        }
      }
    }
  }
}
```

- Restart the Gemini application to load the server
Antigravity (AI Editor)
- Open Antigravity settings/preferences
- Navigate to the "Extensions" or "MCP Servers" section
- Add a new server with the following configuration:
```json
{
  "mcp": {
    "servers": {
      "postgres-mcp": {
        "command": "npx",
        "args": ["postgresql-mcp"],
        "env": {
          "NODE_ENV": "production"
        }
      }
    }
  }
}
```

- Apply settings and restart Antigravity
General Configuration Notes
- Make sure Node.js and npm are installed on your system
- The server can be started with `npx postgresql-mcp` directly for testing
- Environment variables for database connections can be set in the server configuration:

```json
{
  "mcp": {
    "servers": {
      "postgres-mcp": {
        "command": "npx",
        "args": ["postgresql-mcp"],
        "env": {
          "PGHOST": "localhost",
          "PGPORT": "5432",
          "PGDATABASE": "your_database",
          "PGUSER": "your_username",
          "PGPASSWORD": "your_password"
        }
      }
    }
  }
}
```

Usage
The server implements the Model Context Protocol and supports the following methods:
postgres/connect
Establish a connection to a PostgreSQL database.
Parameters:
- `host` (optional, default: `localhost`): Database host
- `port` (optional, default: `5432`): Database port
- `database`: Database name
- `username`: Database username
- `password`: Database password
- `ssl` (optional, default: `false`): Enable SSL connection
- `connectionTimeoutMillis` (optional, default: `30000`): Connection timeout in milliseconds
postgres/query
Execute a SQL query against a specific connection.
Parameters:
- `connectionId`: ID of the connection to use
- `query`: SQL query to execute
- `params` (optional): Query parameters
postgres/query-with-options
Execute a SQL query with additional options.
Parameters:
- `connectionId`: ID of the connection to use
- `query`: SQL query to execute
- `params` (optional): Query parameters
- `options` (optional):
  - `timeout`: Query timeout in milliseconds
  - `validate`: Whether to validate the query for dangerous keywords
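None of the usage examples below exercise this method, so here is a hypothetical request for illustration. The connection ID and query are placeholders; the shape follows the parameter list above.

```javascript
// Hypothetical request for postgres/query-with-options; values are
// placeholders and the shape follows the parameter list above
const request = {
  method: 'postgres/query-with-options',
  params: {
    connectionId: 'conn-1',
    query: 'SELECT * FROM orders WHERE status = $1',
    params: { 1: 'pending' },
    options: {
      timeout: 5000,   // cancel the query after 5 seconds
      validate: true,  // check the query for dangerous keywords first
    },
  },
};
```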
postgres/query-paginated
Execute a paginated SQL query.
Parameters:
- `connectionId`: ID of the connection to use
- `query`: SQL query to execute (without LIMIT/OFFSET)
- `params` (optional): Query parameters
- `page` (optional, default: `1`): Page number (1-indexed)
- `pageSize` (optional, default: `10`): Number of records per page (max 1000)
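Pagination like this typically maps `page` and `pageSize` onto `LIMIT`/`OFFSET`. A minimal sketch of that mapping (illustrative only, not the package's internals):

```javascript
// Illustrative sketch: how page/pageSize could translate to LIMIT/OFFSET.
// Not the package's actual implementation.
function paginate(query, page = 1, pageSize = 10) {
  const size = Math.min(pageSize, 1000); // mirror the documented max of 1000
  const offset = (page - 1) * size;
  return `${query} LIMIT ${size} OFFSET ${offset}`;
}

paginate('SELECT * FROM products ORDER BY id', 3, 50);
// → 'SELECT * FROM products ORDER BY id LIMIT 50 OFFSET 100'
```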
postgres/disconnect
Disconnect from a PostgreSQL database.
Parameters:
- `connectionId`: ID of the connection to close
postgres/connection-info
Get information about a specific connection.
Parameters:
- `connectionId`: ID of the connection
postgres/list-connections
Get a list of all active connections.
Parameters: None
postgres/check-health
Check the health of a specific connection.
Parameters:
- `connectionId`: ID of the connection to check
Usage Examples
Basic Database Connection and Query
```javascript
import { createServer } from 'postgresql-mcp';

const server = await createServer();

// Connect to database
const connectionResult = await server.request({
  method: 'postgres/connect',
  params: {
    host: 'localhost',
    port: 5432,
    database: 'myapp',
    username: 'user',
    password: 'password',
  },
});

// Execute a simple query
const queryResult = await server.request({
  method: 'postgres/query',
  params: {
    connectionId: connectionResult.result.connectionId,
    query: 'SELECT * FROM users WHERE active = $1',
    params: { 1: true },
  },
});
```

Paginated Queries for Large Result Sets
```javascript
// Fetch products with pagination
const paginatedResult = await server.request({
  method: 'postgres/query-paginated',
  params: {
    connectionId: 'conn-1',
    query: 'SELECT * FROM products ORDER BY created_at DESC',
    page: 1,
    pageSize: 50,
  },
});

console.log(
  `Page ${paginatedResult.result.page} of ${paginatedResult.result.totalPages}`,
);
console.log(`Found ${paginatedResult.result.totalRecords} total records`);
```

Bulk Data Operations
```javascript
// Bulk insert multiple records
const users = [
  { name: 'Alice', email: '[email protected]', department: 'Engineering' },
  { name: 'Bob', email: '[email protected]', department: 'Sales' },
  { name: 'Charlie', email: '[email protected]', department: 'Marketing' },
];

const insertResult = await server.request({
  method: 'postgres/bulk-insert',
  params: {
    connectionId: 'conn-1',
    table: 'users',
    data: users,
  },
});

console.log(`Inserted ${insertResult.result.insertedCount} records`);
```

Working with JSON Data
```javascript
// Search within JSON columns
const jsonSearch = await server.request({
  method: 'postgres/search-json',
  params: {
    connectionId: 'conn-1',
    table: 'products',
    jsonColumn: 'metadata',
    searchValue: 'premium',
  },
});

// Update JSON data
const jsonUpdate = await server.request({
  method: 'postgres/update-json',
  params: {
    connectionId: 'conn-1',
    table: 'products',
    jsonColumn: 'metadata',
    jsonData: { tier: 'premium', features: ['fast-delivery', 'support'] },
    condition: 'id = $1',
    params: { 1: 123 },
  },
});
```

Database Synchronization
```javascript
// Sync data between databases
const syncResult = await server.request({
  method: 'postgres/sync-data',
  params: {
    sourceConnectionId: 'prod-db',
    targetConnectionId: 'staging-db',
    syncConfig: {
      tables: ['users', 'products', 'orders'],
      syncDirection: 'source_to_target',
      conflictResolution: 'source_wins',
      batchSize: 1000,
    },
  },
});

console.log(`Synced ${syncResult.result.synchronizedTables.length} tables`);
console.log(
  `Records synced: ${JSON.stringify(syncResult.result.recordsSynced)}`,
);
```

Connection Management and Monitoring
```javascript
// List all active connections
const connections = await server.request({
  method: 'postgres/list-connections',
});

// Check connection health
const healthCheck = await server.request({
  method: 'postgres/check-health',
  params: {
    connectionId: 'conn-1',
  },
});

// Get connection statistics
const connInfo = await server.request({
  method: 'postgres/connection-info',
  params: {
    connectionId: 'conn-1',
  },
});

console.log(`Connection uptime: ${connInfo.result.connectionInfo.uptime}ms`);
console.log(`Queries executed: ${connInfo.result.connectionInfo.queryCount}`);
```

Asynchronous Query Processing
```javascript
// Start a long-running query asynchronously
const jobResult = await server.request({
  method: 'postgres/query-async',
  params: {
    connectionId: 'conn-1',
    query: 'SELECT * FROM large_table WHERE complex_condition = $1',
    params: { 1: 'value' },
  },
});

// Check job status
const jobStatus = await server.request({
  method: 'postgres/get-job-status',
  params: {
    connectionId: 'conn-1',
    jobId: jobResult.result.jobId,
  },
});

// Get results when complete
if (jobStatus.result.status === 'completed') {
  const results = await server.request({
    method: 'postgres/get-job-results',
    params: {
      connectionId: 'conn-1',
      jobId: jobResult.result.jobId,
    },
  });
}
```

Security
The server includes query validation to prevent potentially dangerous operations such as DROP, DELETE, ALTER, and CREATE statements when the `validate` option is enabled.
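A keyword-based validator of this kind can be sketched as follows. This is illustrative only; the package's actual validation logic may differ.

```javascript
// Illustrative sketch of keyword-based query validation; not the
// package's actual implementation.
const DANGEROUS = /\b(DROP|ALTER|CREATE|TRUNCATE)\b/i;

function validateQuery(query) {
  if (DANGEROUS.test(query)) {
    return { valid: false, reason: 'Query contains a dangerous keyword' };
  }
  // Flag DELETE statements that lack a WHERE clause
  if (/\bDELETE\b/i.test(query) && !/\bWHERE\b/i.test(query)) {
    return { valid: false, reason: 'DELETE without WHERE clause' };
  }
  return { valid: true };
}
```

For example, `validateQuery('DROP TABLE users')` is rejected, while a plain `SELECT` passes.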
Configuration
Environment Variables
The server supports configuration through environment variables. You can set these in your shell or use a .env file in the project root:
```shell
export PGHOST=localhost
export PGPORT=5432
export PGDATABASE=myapp
export PGUSER=username
export PGPASSWORD=password
export PGSSLMODE=require
```

Alternatively, create a .env file in your project root with the following format:
```shell
PGHOST=localhost
PGPORT=5432
PGDATABASE=myapp
PGUSER=username
PGPASSWORD=password
PGSSLMODE=require
```

For a complete example, see the .env.example file in the project root.
Client Configuration
Add to your MCP client configuration:
```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["postgresql-mcp"],
      "env": {
        "NODE_ENV": "production"
      }
    }
  }
}
```

Connection Options
- SSL Configuration: Support for `rejectUnauthorized`, `cert`, `key`, and `ca` options
- Timeout Settings: Configurable connection and query timeouts
- Pooling: Connection pooling with configurable limits
- Retry Logic: Automatic retry with exponential backoff
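The retry behaviour can be pictured as a capped exponential backoff schedule. A minimal sketch follows; the base delay and cap here are assumptions for illustration, not the package's actual values.

```javascript
// Illustrative sketch of capped exponential backoff; the base delay and
// cap are assumptions, not the package's actual values.
function backoffDelays(maxAttempts, baseMs = 100, capMs = 5000) {
  const delays = [];
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    delays.push(Math.min(capMs, baseMs * 2 ** attempt));
  }
  return delays;
}

backoffDelays(6); // → [100, 200, 400, 800, 1600, 3200]
```

Each retry waits the next delay in the schedule, and the cap keeps waits bounded once the doubling grows large.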
Development
Setup
```shell
# Clone the repository
git clone https://github.com/bleeding-gums/postgres-mcp.git
cd postgres-mcp

# Install dependencies
npm install

# Set up environment
cp .env.example .env
# Edit .env with your database configuration
```

Running in Development
```shell
# Development with hot reload
npm run dev

# Development with debug logging
DEBUG=mcp:* npm run dev

# Test against a specific database
PGDATABASE=test_db npm run dev
```

Testing
```shell
# Run all tests
npm test

# Run tests with coverage
npm run test:coverage

# Run integration tests only
npm run test:integration

# Run unit tests only
npm run test:unit

# Run with a specific database
PGHOST=localhost PGDATABASE=test npm test
```

Building
```shell
# Build for production
npm run build

# Build with type checking
npm run build:typecheck

# Build and run linting
npm run build:lint
```

Debugging
```shell
# Enable verbose logging
DEBUG=mcp:* npm run dev

# Run with the Node.js inspector
node --inspect dist/index.js

# Test specific MCP methods
echo '{"jsonrpc":"2.0","id":1,"method":"postgres/list-connections"}' | node dist/index.js
```

License
MIT
