# MCP Toolbox

MCP (Model Context Protocol) server toolbox for [Squadbase](https://www.squadbase.dev).

Each connector is published as a separate package, so you only install what you need.
## Packages

| Package | Description |
|---------|-------------|
| `@squadbase/mcp` | Launcher CLI |
| `@squadbase/mcp-databricks` | Databricks connector |
| `@squadbase/mcp-redshift` | Amazon Redshift connector |
| `@squadbase/mcp-athena` | Amazon Athena connector |
| `@squadbase/mcp-airtable` | Airtable connector |
## Usage

### Direct Execution

```sh
npx -y @squadbase/mcp-databricks
npx -y @squadbase/mcp-redshift
npx -y @squadbase/mcp-athena
npx -y @squadbase/mcp-airtable
```

### Via Launcher
```sh
npx -y @squadbase/mcp databricks
npx -y @squadbase/mcp redshift
npx -y @squadbase/mcp athena
npx -y @squadbase/mcp airtable
```

### MCP Client Configuration

For MCP clients such as Claude Desktop, add the following to your configuration:
```json
{
  "mcpServers": {
    "databricks": {
      "command": "npx",
      "args": ["-y", "@squadbase/mcp", "databricks"],
      "env": {
        "DATABRICKS_HOST": "your-workspace.cloud.databricks.com",
        "DATABRICKS_TOKEN": "your-token",
        "DATABRICKS_HTTP_PATH": "/sql/1.0/warehouses/abc123"
      }
    },
    "redshift": {
      "command": "npx",
      "args": ["-y", "@squadbase/mcp", "redshift"],
      "env": {
        "REDSHIFT_AWS_ACCESS_KEY_ID": "your-access-key-id",
        "REDSHIFT_AWS_SECRET_ACCESS_KEY": "your-secret-access-key",
        "REDSHIFT_AWS_REGION": "ap-northeast-1",
        "REDSHIFT_DATABASE": "dev",
        "REDSHIFT_WORKGROUP_NAME": "your-workgroup-name"
      }
    },
    "athena": {
      "command": "npx",
      "args": ["-y", "@squadbase/mcp", "athena"],
      "env": {
        "ATHENA_AWS_ACCESS_KEY_ID": "your-access-key-id",
        "ATHENA_AWS_SECRET_ACCESS_KEY": "your-secret-access-key",
        "ATHENA_AWS_REGION": "ap-northeast-1",
        "ATHENA_WORKGROUP": "primary"
      }
    },
    "airtable": {
      "command": "npx",
      "args": ["-y", "@squadbase/mcp-airtable"],
      "env": {
        "AIRTABLE_BASE_ID": "your-base-id",
        "AIRTABLE_API_KEY": "your-api-key"
      }
    }
  }
}
```

## Environment Variables
### Databricks

| Variable | Description | Required |
|----------|-------------|----------|
| `DATABRICKS_HOST` | Workspace hostname | Yes |
| `DATABRICKS_TOKEN` | Personal access token | Yes |
| `DATABRICKS_HTTP_PATH` | SQL warehouse HTTP path | For SQL queries |
### Redshift (Data API)

The Redshift connector uses the AWS Redshift Data API and supports both provisioned clusters and serverless workgroups.
#### Common (Required)

| Variable | Description |
|----------|-------------|
| `REDSHIFT_AWS_ACCESS_KEY_ID` | AWS access key ID |
| `REDSHIFT_AWS_SECRET_ACCESS_KEY` | AWS secret access key |
| `REDSHIFT_AWS_REGION` | AWS region (e.g., `ap-northeast-1`) |
| `REDSHIFT_DATABASE` | Database name |
#### Provisioned Cluster

| Variable | Description | Required |
|----------|-------------|----------|
| `REDSHIFT_CLUSTER_IDENTIFIER` | Cluster identifier | Yes |
| `REDSHIFT_SECRET_ARN` | Secrets Manager ARN for authentication | If `REDSHIFT_DB_USER` is not set |
| `REDSHIFT_DB_USER` | Database user name | If `REDSHIFT_SECRET_ARN` is not set |
#### Serverless

| Variable | Description | Required |
|----------|-------------|----------|
| `REDSHIFT_WORKGROUP_NAME` | Workgroup name | Yes |
| `REDSHIFT_SECRET_ARN` | Secrets Manager ARN for authentication | No |
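To make the provisioned-vs-serverless variable combinations concrete, here is a minimal sketch of how a connector might pick a target from the environment. This is an illustration, not the package's actual code; the type and function names are assumptions.

```typescript
// Decide between the provisioned-cluster and serverless flavors of the
// Redshift Data API based on which environment variables are present.
type RedshiftTarget =
  | { kind: "provisioned"; clusterIdentifier: string; secretArn?: string; dbUser?: string }
  | { kind: "serverless"; workgroupName: string; secretArn?: string };

function resolveRedshiftTarget(env: Record<string, string | undefined>): RedshiftTarget {
  if (env.REDSHIFT_CLUSTER_IDENTIFIER) {
    // A provisioned cluster needs either a Secrets Manager ARN or a DB user.
    if (!env.REDSHIFT_SECRET_ARN && !env.REDSHIFT_DB_USER) {
      throw new Error("Set REDSHIFT_SECRET_ARN or REDSHIFT_DB_USER for provisioned clusters");
    }
    return {
      kind: "provisioned",
      clusterIdentifier: env.REDSHIFT_CLUSTER_IDENTIFIER,
      secretArn: env.REDSHIFT_SECRET_ARN,
      dbUser: env.REDSHIFT_DB_USER,
    };
  }
  if (env.REDSHIFT_WORKGROUP_NAME) {
    return {
      kind: "serverless",
      workgroupName: env.REDSHIFT_WORKGROUP_NAME,
      secretArn: env.REDSHIFT_SECRET_ARN,
    };
  }
  throw new Error("Set REDSHIFT_CLUSTER_IDENTIFIER or REDSHIFT_WORKGROUP_NAME");
}
```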
### Athena

The Athena connector uses the AWS Athena API to run serverless SQL queries against data in S3.
#### Common (Required)

| Variable | Description |
|----------|-------------|
| `ATHENA_AWS_ACCESS_KEY_ID` | AWS access key ID |
| `ATHENA_AWS_SECRET_ACCESS_KEY` | AWS secret access key |
| `ATHENA_AWS_REGION` | AWS region (e.g., `ap-northeast-1`) |
#### Query Execution (One Required)

| Variable | Description |
|----------|-------------|
| `ATHENA_WORKGROUP` | Athena workgroup name (recommended) |
| `ATHENA_OUTPUT_LOCATION` | S3 location for query results (e.g., `s3://bucket/path/`) |
#### Optional

| Variable | Default | Description |
|----------|---------|-------------|
| `ATHENA_CATALOG` | `AwsDataCatalog` | Data catalog name |
| `ATHENA_DATABASE` | - | Default database name |
| `ATHENA_POLL_INTERVAL_MS` | `1000` | Polling interval in milliseconds |
| `ATHENA_TIMEOUT_MS` | `60000` | Query timeout in milliseconds |
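The poll-interval and timeout settings imply a loop that repeatedly checks query state until it finishes or the deadline passes. A generic sketch of that loop, with `getState` standing in for the Athena `GetQueryExecution` call (the actual connector code may differ):

```typescript
// Poll a query's state at a fixed interval until it reaches a terminal
// state or the overall timeout elapses.
async function pollUntilDone(
  getState: () => Promise<"QUEUED" | "RUNNING" | "SUCCEEDED" | "FAILED">,
  pollIntervalMs = 1000, // cf. ATHENA_POLL_INTERVAL_MS
  timeoutMs = 60000,     // cf. ATHENA_TIMEOUT_MS
): Promise<string> {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const state = await getState();
    if (state === "SUCCEEDED" || state === "FAILED") return state;
    if (Date.now() >= deadline) throw new Error("Query timed out");
    await new Promise((resolve) => setTimeout(resolve, pollIntervalMs));
  }
}
```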
### Airtable

| Variable | Description | Required |
|----------|-------------|----------|
| `AIRTABLE_BASE_ID` | Airtable base ID | Yes |
| `AIRTABLE_API_KEY` | Airtable personal access token | Yes |
## Available Tools

### Databricks

- `ping` - Health check
- `databricks_info` - Connection info
- `execute_sql` - Execute SQL query
- `list_catalogs` - List Unity Catalog catalogs
- `list_schemas` - List schemas in a catalog
- `list_tables` - List tables in a schema
- `describe_table` - Get table details
- `sample_table` - Get sample rows
### Redshift

- `ping` - Health check
- `redshift_info` - Connection info
- `execute_sql` - Execute SQL query
- `list_databases` - List databases
- `list_schemas` - List schemas
- `list_tables` - List tables
- `describe_table` - Get table details
- `sample_table` - Get sample rows
- `get_table_stats` - Get table statistics
### Athena

- `ping` - Health check
- `athena_info` - Connection info
- `execute_sql` - Execute SQL query
- `list_catalogs` - List data catalogs
- `list_databases` - List databases in a catalog
- `list_tables` - List tables in a database
- `describe_table` - Get table details including partition keys
- `sample_table` - Get sample rows
- `get_table_stats` - Get table statistics
### Airtable

- `ping` - Health check
- `airtable_info` - Connection info
- `list_tables` - List all tables with schema information
- `describe_table` - Get detailed table schema
- `get_records` - Get records with filtering and pagination
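The `get_records` tool pages through results; Airtable's REST API does this with an `offset` token that each page may return and the next request passes back. A hedged sketch of that pattern, with `fetchPage` standing in for the HTTP call (names are illustrative, not the connector's actual code):

```typescript
// Offset-token pagination: keep requesting pages, threading the offset
// returned by each page into the next request, until no offset is returned.
interface Page<T> {
  records: T[];
  offset?: string; // absent on the last page
}

async function getAllRecords<T>(
  fetchPage: (offset?: string) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let offset: string | undefined;
  do {
    const page = await fetchPage(offset);
    all.push(...page.records);
    offset = page.offset;
  } while (offset !== undefined);
  return all;
}
```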
## Security Considerations

### Arbitrary SQL Execution

The `execute_sql` tool runs arbitrary SQL queries against your database. Keep the following in mind:

- Grant the database credentials only the permissions they need (principle of least privilege)
- Consider a read-only database user for exploratory use cases
- These servers are designed for trusted environments where the caller (the LLM) is controlled
### Identifier Escaping

All schema, table, and catalog names are properly escaped to prevent SQL injection attacks. The tools use:

- PostgreSQL-style double-quote escaping for Redshift and Athena
- Backtick escaping for Databricks
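Both styles follow the same rule: double any occurrence of the quote character inside the identifier, then wrap it. A minimal sketch (function names are illustrative):

```typescript
// PostgreSQL-style double quotes (Redshift, Athena): " becomes ""
function quoteAnsiIdentifier(name: string): string {
  return '"' + name.replace(/"/g, '""') + '"';
}

// Backticks (Databricks): ` becomes ``
function quoteBacktickIdentifier(name: string): string {
  return "`" + name.replace(/`/g, "``") + "`";
}
```

For example, `quoteAnsiIdentifier('my"table')` yields `"my""table"`, so an injected quote can never terminate the identifier early.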
Development
# Install dependencies
npm install
# Build all packages
npm run build
# Test a server locally
node packages/mcp-databricks/dist/cli.js
node packages/mcp-redshift/dist/cli.js
node packages/mcp-athena/dist/cli.js
node packages/mcp-airtable/dist/cli.jsAdding a New Connector
- Create a new package in `packages/mcp-{connector-name}/`
- Implement the MCP server with `@modelcontextprotocol/sdk`
- Add the connector to the launcher in `packages/mcp/src/index.ts`
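The launcher's job is essentially to map a subcommand (`databricks`, `redshift`, ...) to the matching connector package. A hedged sketch of what that dispatch might look like; the real code in `packages/mcp/src/index.ts` may differ:

```typescript
// Map a launcher subcommand to its connector package name,
// rejecting unknown connectors with a helpful message.
const CONNECTORS = ["databricks", "redshift", "athena", "airtable"] as const;

function resolvePackage(name: string): string {
  if (!(CONNECTORS as readonly string[]).includes(name)) {
    throw new Error(`Unknown connector: ${name}. Expected one of: ${CONNECTORS.join(", ")}`);
  }
  return `@squadbase/mcp-${name}`;
}
```

With this shape, registering a new connector is a one-line addition to `CONNECTORS`.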
## Publishing

This project uses Changesets for version management and publishing.

### Understanding Changesets

Changesets tracks changes across packages in a monorepo. The workflow is:

- Create changeset - Record what changed and how it affects versions
- Version - Apply changesets to update `package.json` versions and `CHANGELOG.md`
- Publish - Build and publish packages to the registry
### Authentication (Required Before Publishing)

This project publishes to npmjs.com. Before running `npm run release`, you must authenticate:

```sh
npm login
```

If you have 2FA enabled with a security key only, use:

```sh
npm publish -w @squadbase/mcp-{name} --auth-type=web
```

### Adding a New Package
When adding a new package to the monorepo:

1. Create the package directory structure under `packages/mcp-{name}/`
2. Set the initial version to `0.0.0` in `package.json`:

   ```json
   {
     "name": "@squadbase/mcp-{name}",
     "version": "0.0.0"
   }
   ```

3. Create a changeset for the initial release:

   ```sh
   npm run changeset
   ```

   - Select the new package
   - Choose `minor` for the initial release (0.0.0 -> 0.1.0) or `patch` for 0.0.1
   - Write a description: "Initial release of @squadbase/mcp-{name}"

4. Apply the version:

   ```sh
   npm run version
   ```

   This updates the `package.json` version and creates/updates `CHANGELOG.md`.

5. Commit the changes and publish:

   ```sh
   git add .
   git commit -m "chore: release @squadbase/mcp-{name}"
   npm run release
   ```
### Updating an Existing Package

When making changes to existing packages:

1. Make your code changes
2. Create a changeset describing the changes:

   ```sh
   npm run changeset
   ```

   - Select the affected package(s)
   - Choose a version bump type:
     - `patch` (0.1.0 -> 0.1.1): Bug fixes, documentation
     - `minor` (0.1.0 -> 0.2.0): New features, non-breaking changes
     - `major` (0.1.0 -> 1.0.0): Breaking changes
   - Write a clear description of the changes
3. Commit your changes with the changeset file:

   ```sh
   git add .
   git commit -m "feat: add new feature to mcp-{name}"
   ```

4. When ready to release, apply versions and publish:

   ```sh
   npm run version   # Updates versions and CHANGELOGs
   git add .
   git commit -m "chore: version packages"
   npm run release   # Builds and publishes
   ```
### Version Bump Guidelines

| Change Type | Bump | Example |
|-------------|------|---------|
| Bug fix | `patch` | Fix SQL escaping issue |
| New tool/feature | `minor` | Add `describe_column` tool |
| New environment variable | `minor` | Add timeout configuration |
| Breaking API change | `major` | Rename tool or change parameters |
| Initial release | `minor` | New package at 0.1.0 |
### Troubleshooting

**CHANGELOG.md out of sync with package.json:**

- The `package.json` version is the source of truth
- Run `npm run version` to regenerate the CHANGELOG from pending changesets

**Forgot to create a changeset:**

- You can create changesets retroactively before running `npm run version`
- The changeset just needs to exist before versioning

**Multiple changes before release:**

- Create multiple changeset files (one per logical change)
- They will be combined when running `npm run version`
## License

MIT
