@streamkap/tools
v0.4.7
Streamkap CLI & MCP server - manage CDC pipelines, sources, destinations, and transforms
Streamkap CLI & MCP Server
CLI and MCP server for Streamkap -- manage real-time data pipelines from the command line or through AI agents.
Streamkap captures changes from databases (MySQL, PostgreSQL, MongoDB, SQL Server, DynamoDB, and more) and streams them to data warehouses, lakes, and other destinations in real time.
Requires Node.js 20+. For full documentation, visit docs.streamkap.com.
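You can verify the Node.js requirement up front; this one-liner only assumes `node` is on your PATH:

```shell
# Exit 0 when the installed Node.js major version is at least 20
node -e 'process.exit(parseInt(process.versions.node, 10) >= 20 ? 0 : 1)' \
  && echo "Node.js is new enough" \
  || echo "Node.js 20+ required -- found $(node --version)"
```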
Quick Start
- Log into your Streamkap dashboard
- Go to Settings > API Keys and create a new API key
- Choose how you want to use it:
CLI:
npm install -g @streamkap/tools
streamkap auth login --client-id your-client-id --client-secret your-client-secret
streamkap doctor # Verify everything works

MCP -- Claude Code (one command):
claude mcp add --scope user \
--header "X-Streamkap-Client-ID: your-client-id" \
--header "X-Streamkap-Client-Secret: your-client-secret" \
--transport http \
streamkap https://mcp.streamkap.com/mcp

MCP -- Cursor, Windsurf, VS Code Copilot (JSON config):
{
"mcpServers": {
"streamkap": {
"type": "http",
"url": "https://mcp.streamkap.com/mcp",
"headers": {
"X-Streamkap-Client-ID": "your-client-id",
"X-Streamkap-Client-Secret": "your-client-secret"
}
}
}
}

Claude Desktop users: Claude Desktop needs a different setup that uses the absolute path to your Node.js binary. See the Claude Desktop guide on docs.streamkap.com for the full instructions.
What's Included
| Credentials | What you get |
|-------------|--------------|
| API key only | All REST tools -- pipelines, sources, destinations, transforms, topics, schemas, alerts, logs |
| + Kafka user | REST tools + direct Kafka produce / consume / subscribe + Schema Registry encode and decode |
| Project Key | All of the above, bundled into a single base64 value with tool scoping |
Most users only need an API key. For Kafka access plus pre-scoped tool access in a single credential, use a Project Key.
Project Keys
A Project Key is one file that bundles API + Kafka + Schema Registry credentials and MCP tool scoping. Use it instead of setting each env var or header individually.
Create one at Settings > Project Keys in the Streamkap dashboard, then encode it:
streamkap auth encode-key ~/Downloads/my-key-credentials.json

CLI / stdio MCP: set STREAMKAP_PROJECT_KEY=<base64>.
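Since a Project Key is described above as "a single base64 value", the encoding step can be pictured as base64-encoding the downloaded credentials file. This is only a sketch of the idea -- the JSON field names below are invented for illustration, and the real file schema (and the actual behaviour of `streamkap auth encode-key`) is documented at docs.streamkap.com/project-keys:

```shell
# Hypothetical credentials file -- field names are illustrative, not the real schema
cat > /tmp/my-key-credentials.json <<'EOF'
{"client_id": "your-client-id", "client_secret": "your-client-secret"}
EOF

# Produce a single base64 value from the file, as encode-key is described to do
KEY=$(base64 < /tmp/my-key-credentials.json | tr -d '\n')
echo "$KEY"

# Round-trip: decoding recovers the original JSON
echo "$KEY" | base64 -d
```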
Remote MCP:
{
"mcpServers": {
"streamkap": {
"type": "http",
"url": "https://mcp.streamkap.com/mcp",
"headers": { "X-Streamkap-Project-Key": "<base64>" }
}
}
}

Full reference -- JSON shape, tool-scoping profiles, rotation, edit flow: docs.streamkap.com/project-keys.
CLI
Install
npm install -g @streamkap/tools

Authenticate
# Environment variables (recommended for CI and scripts)
export STREAMKAP_CLIENT_ID="your-client-id"
export STREAMKAP_CLIENT_SECRET="your-client-secret"
# Or save credentials to a config file
streamkap auth login --client-id your-client-id --client-secret your-client-secret
# Or pass per command
streamkap pipelines list --client-id your-client-id --client-secret your-client-secret --json

Common commands
streamkap --help # List all commands
streamkap doctor # Validate API, Kafka, Schema Registry
streamkap pipelines list --json # List pipelines
streamkap sources metrics <id> --json # Source metrics
streamkap dashboard stats --json # Organisation overview

Direct Kafka commands (require Kafka credentials -- see Adding Kafka Access):
streamkap kafka produce <topic> --value '{"key":"val"}' # Produce a single message
streamkap kafka consume <topic> --max-messages 10 # Consume a batch
streamkap kafka subscribe <topic> --timeout 30000 # Real-time subscribe

Output format: JSON when piped, text when interactive. Override with --json or --format text.
Destructive commands (delete, stop, reset) require --yes in interactive terminals. Preview with --dry-run. When piped (scripts and agents), they run without confirmation.
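The piped-versus-interactive behaviour described above is the standard "is stdout a terminal" check; a minimal sketch of that logic (not Streamkap's actual source):

```shell
# Pick a default output format the way the CLI is described to:
# text when stdout is an interactive terminal, JSON when piped
if [ -t 1 ]; then
  FORMAT=text
else
  FORMAT=json
fi
echo "default format: $FORMAT"
```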
Shell completions
streamkap completions bash >> ~/.bashrc
streamkap completions zsh >> ~/.zshrc
streamkap completions fish > ~/.config/fish/completions/streamkap.fish

For the full CLI reference see docs.streamkap.com/cli.
MCP Server
The MCP server lets AI agents manage your Streamkap infrastructure through natural language. Detailed setup for every supported client is at docs.streamkap.com/mcp-server.
Claude Code
claude mcp add --scope user \
--header "X-Streamkap-Client-ID: your-client-id" \
--header "X-Streamkap-Client-Secret: your-client-secret" \
--transport http \
streamkap https://mcp.streamkap.com/mcp

Cursor, Windsurf
Add to .cursor/mcp.json (Cursor) or ~/.codeium/windsurf/mcp_config.json (Windsurf):
{
"mcpServers": {
"streamkap": {
"type": "http",
"url": "https://mcp.streamkap.com/mcp",
"headers": {
"X-Streamkap-Client-ID": "your-client-id",
"X-Streamkap-Client-Secret": "your-client-secret"
}
}
}
}

VS Code Copilot
Add to .vscode/mcp.json (note the different schema -- servers instead of mcpServers):
{
"servers": {
"streamkap": {
"type": "http",
"url": "https://mcp.streamkap.com/mcp",
"headers": {
"X-Streamkap-Client-ID": "your-client-id",
"X-Streamkap-Client-Secret": "your-client-secret"
}
}
}
}

Claude Desktop
Claude Desktop requires the absolute path to your Node.js binary because it does not source your shell environment. See the Claude Desktop setup guide on docs.streamkap.com for the full configuration.
Example prompts
Once connected, ask your AI agent things like:
- "Give me an overview of my infrastructure"
- "Are any of my pipelines broken? Show me the details"
- "Check the logs for any errors in the last hour"
- "Produce a test message to my-topic"
- "Find all DLQ topics and check for errors"
Capabilities
- Pipelines -- create, update, delete, monitor metrics and logs, bulk operations
- Sources -- manage CDC connectors (MySQL, PostgreSQL, MongoDB, etc.), deploy, pause, resume, restart, stop, snapshots
- Destinations -- manage sinks (Snowflake, BigQuery, ClickHouse, etc.), deploy, pause, resume, restart, stop, monitor lag
- Transforms -- manage stream processors, deploy to preview or production, run unit tests, clone
- Topics -- list, inspect, create Kafka topics, read sample messages
- Tags -- organise and search resources by tag
- Schema Registry -- browse subjects and schemas
- Consumer Groups -- inspect lag, identify stuck consumers, reset offsets
- Dashboard & Logs -- organisation statistics, data lineage, search and filter logs
- Alerts -- manage notification subscribers and preferences
- Usage -- query and export usage metrics
- Kafka Access -- manage direct-Kafka users
- Cluster Scaling -- inspect cluster status, scale up or down
- Direct Kafka -- produce and consume messages with optional Schema Registry encoding (Avro, JSON Schema, Protobuf)
Adding Kafka Access
Alternatively, use a Project Key -- one value instead of the individual env vars below.
To enable the direct Kafka tools and Schema Registry encoding, create a Kafka user from the Kafka Access page in your Streamkap dashboard. The dashboard gives you the bootstrap servers, username, and password. All three are required together — Streamkap's Kafka proxy always requires SASL/SSL, so partial credentials will fail at startup with a clear error.
Add them to your existing config:
{
"env": {
"STREAMKAP_CLIENT_ID": "your-client-id",
"STREAMKAP_CLIENT_SECRET": "your-client-secret",
"KAFKA_BOOTSTRAP_SERVERS": "your-kafka-proxy:9092",
"KAFKA_API_KEY": "your-kafka-username",
"KAFKA_API_SECRET": "your-kafka-password",
"SCHEMA_REGISTRY_URL": "https://your-schema-registry:8081"
}
}

For the CLI, export the same values as environment variables in your shell.
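The all-or-nothing rule above -- direct Kafka tools only activate when bootstrap servers, username, and password are all present -- can be sketched as follows (illustrative logic, not the actual startup check):

```shell
# Returns success only when all three Kafka values are non-empty
kafka_creds_complete() {
  [ -n "$1" ] && [ -n "$2" ] && [ -n "$3" ]
}

if kafka_creds_complete "$KAFKA_BOOTSTRAP_SERVERS" "$KAFKA_API_KEY" "$KAFKA_API_SECRET"; then
  echo "direct Kafka tools enabled"
else
  echo "direct Kafka tools disabled -- set all three variables together"
fi
```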
Environment Variables
Core
Set a Project Key or a Client ID + Client Secret pair (individual vars win on conflict).
| Variable | Description |
|----------|-------------|
| STREAMKAP_PROJECT_KEY | Base64-encoded Project Key (see Project Keys) |
| STREAMKAP_CLIENT_ID | Streamkap API client ID |
| STREAMKAP_CLIENT_SECRET | Streamkap API client secret |
| STREAMKAP_API_URL | Override the API base URL (default https://api.streamkap.com) |
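The precedence note above ("individual vars win on conflict") can be sketched as a small resolver -- illustrative only, not the actual implementation:

```shell
# Decide which credential mode applies, mirroring the documented precedence
auth_mode() { # args: client_id client_secret project_key
  if [ -n "$1" ] && [ -n "$2" ]; then
    echo client-credentials   # individual vars win on conflict
  elif [ -n "$3" ]; then
    echo project-key
  else
    echo none
  fi
}

auth_mode "my-id" "my-secret" "some-base64-key"   # prints client-credentials
```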
Kafka (optional)
The three Kafka variables below must be set together — bootstrap servers alone is not enough. Create a Kafka user from the Kafka Access page in your Streamkap dashboard to get all three at once.
| Variable | Description |
|----------|-------------|
| KAFKA_BOOTSTRAP_SERVERS | Kafka broker addresses (required when using direct Kafka tools) |
| KAFKA_API_KEY | Kafka SASL username (required when using direct Kafka tools) |
| KAFKA_API_SECRET | Kafka SASL password (required when using direct Kafka tools) |
| SCHEMA_REGISTRY_URL | Schema Registry URL -- enables Avro / JSON Schema / Protobuf encode and decode |
| SCHEMA_REGISTRY_USERNAME | Schema Registry basic auth username |
| SCHEMA_REGISTRY_PASSWORD | Schema Registry basic auth password |
Tool filtering (optional)
Shrink the catalog an agent sees — useful for context-constrained clients (Flink Agents, small-context models) and for scoping what a key can do.
| Variable | Description |
|----------|-------------|
| MCP_TOOL_PROFILE | full (default), read-only, agent-operator, or infra-admin |
| MCP_ALLOW_TOOLS | Comma-separated whitelist — when set, only these tools appear in tools/list |
| MCP_BLOCK_TOOLS | Comma-separated blacklist — removed from the catalog |
Blocked tools are hidden from tools/list, not just rejected at call time. Project Keys embed these fields directly.
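A sketch of the catalog-filtering semantics described above -- the allow list is applied first, then the block list (assumptions: block wins over allow, and the tool names here are invented for illustration):

```shell
ALL_TOOLS="pipelines_list pipelines_delete kafka_produce"   # illustrative names
MCP_ALLOW_TOOLS="pipelines_list,kafka_produce"
MCP_BLOCK_TOOLS="kafka_produce"

VISIBLE=""
for t in $ALL_TOOLS; do
  # Keep the tool only if it is on the allow list (when one is set)...
  if [ -n "$MCP_ALLOW_TOOLS" ]; then
    case ",$MCP_ALLOW_TOOLS," in *",$t,"*) ;; *) continue ;; esac
  fi
  # ...and not on the block list
  case ",$MCP_BLOCK_TOOLS," in *",$t,"*) continue ;; esac
  VISIBLE="$VISIBLE $t"
done
echo "tools/list would contain:$VISIBLE"
```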
License
Elastic License 2.0. See LICENSE.
