
@streamkap/tools

v0.4.7


Streamkap CLI & MCP server - manage CDC pipelines, sources, destinations, and transforms


Streamkap CLI & MCP Server

CLI and MCP server for Streamkap -- manage real-time data pipelines from the command line or through AI agents.

Streamkap captures changes from databases (MySQL, PostgreSQL, MongoDB, SQL Server, DynamoDB, and more) and streams them to data warehouses, lakes, and other destinations in real time.

Requires Node.js 20+. For full documentation, visit docs.streamkap.com.


Quick Start

  1. Log into your Streamkap dashboard
  2. Go to Settings > API Keys and create a new API key
  3. Choose how you want to use it:

CLI:

npm install -g @streamkap/tools
streamkap auth login --client-id your-client-id --client-secret your-client-secret
streamkap doctor   # Verify everything works

MCP -- Claude Code (one command):

claude mcp add --scope user \
  --header "X-Streamkap-Client-ID: your-client-id" \
  --header "X-Streamkap-Client-Secret: your-client-secret" \
  --transport http \
  streamkap https://mcp.streamkap.com/mcp

MCP -- Cursor, Windsurf, VS Code Copilot (JSON config):

{
  "mcpServers": {
    "streamkap": {
      "type": "http",
      "url": "https://mcp.streamkap.com/mcp",
      "headers": {
        "X-Streamkap-Client-ID": "your-client-id",
        "X-Streamkap-Client-Secret": "your-client-secret"
      }
    }
  }
}

Claude Desktop users: Claude Desktop requires a different setup using the absolute path to your Node.js binary. See the Claude Desktop guide on docs.streamkap.com for the full instructions.


What's Included

| Credentials | What you get |
|-------------|--------------|
| API key only | All REST tools -- pipelines, sources, destinations, transforms, topics, schemas, alerts, logs |
| + Kafka user | REST tools + direct Kafka produce / consume / subscribe + Schema Registry encode and decode |
| Project Key | All of the above, bundled into a single base64 value with tool scoping |

Most users only need an API key. For Kafka access plus pre-scoped tool access in a single credential, use a Project Key.


Project Keys

A Project Key is one file that bundles API + Kafka + Schema Registry credentials and MCP tool scoping. Use it instead of setting each env var or header individually.

Create one at Settings > Project Keys in the Streamkap dashboard, then encode it:

streamkap auth encode-key ~/Downloads/my-key-credentials.json

For the CLI or a stdio MCP server, set STREAMKAP_PROJECT_KEY=<base64>. For the remote MCP server, pass it as a header:

{
  "mcpServers": {
    "streamkap": {
      "type": "http",
      "url": "https://mcp.streamkap.com/mcp",
      "headers": { "X-Streamkap-Project-Key": "<base64>" }
    }
  }
}
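For CLI use, the export can be sketched in one step (this assumes `streamkap auth encode-key` prints the base64 value to stdout; check `streamkap auth encode-key --help` for the exact behavior):

```shell
# Encode the downloaded credentials file and export the result.
# (Assumes encode-key prints the base64 value to stdout.)
export STREAMKAP_PROJECT_KEY="$(streamkap auth encode-key ~/Downloads/my-key-credentials.json)"

# Subsequent CLI commands pick the key up from the environment.
streamkap doctor
```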

Full reference -- JSON shape, tool-scoping profiles, rotation, edit flow: docs.streamkap.com/project-keys.


CLI

Install

npm install -g @streamkap/tools

Authenticate

# Environment variables (recommended for CI and scripts)
export STREAMKAP_CLIENT_ID="your-client-id"
export STREAMKAP_CLIENT_SECRET="your-client-secret"

# Or save credentials to a config file
streamkap auth login --client-id your-client-id --client-secret your-client-secret

# Or pass per command
streamkap pipelines list --client-id your-client-id --client-secret your-client-secret --json

Common commands

streamkap --help                          # List all commands
streamkap doctor                          # Validate API, Kafka, Schema Registry
streamkap pipelines list --json           # List pipelines
streamkap sources metrics <id> --json     # Source metrics
streamkap dashboard stats --json          # Organisation overview

Direct Kafka commands (require Kafka credentials -- see Adding Kafka Access):

streamkap kafka produce <topic> --value '{"key":"val"}'    # Produce a single message
streamkap kafka consume <topic> --max-messages 10          # Consume a batch
streamkap kafka subscribe <topic> --timeout 30000          # Real-time subscribe

Output format: JSON when piped, text when interactive. Override with --json or --format text.
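Because --json emits machine-readable output, results pipe cleanly into tools like jq. A small sketch, assuming the list commands emit a JSON array (verify the exact output shape against your own account):

```shell
# Count pipelines (assumes `pipelines list --json` emits a JSON array).
streamkap pipelines list --json | jq length

# Pretty-print metrics for a single source for inspection.
streamkap sources metrics <id> --json | jq .
```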

Destructive commands (delete, stop, reset) require --yes in interactive terminals. Preview with --dry-run. When piped (scripts and agents), they run without confirmation.
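A destructive operation might look like the following (the exact subcommand form is illustrative; see `streamkap --help` for the real commands):

```shell
# Preview what would change without touching anything.
streamkap pipelines delete <id> --dry-run

# In an interactive terminal, explicit confirmation is required.
streamkap pipelines delete <id> --yes
```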

Shell completions

streamkap completions bash >> ~/.bashrc
streamkap completions zsh >> ~/.zshrc
streamkap completions fish > ~/.config/fish/completions/streamkap.fish

For the full CLI reference see docs.streamkap.com/cli.


MCP Server

The MCP server lets AI agents manage your Streamkap infrastructure through natural language. Detailed setup for every supported client is at docs.streamkap.com/mcp-server.

Claude Code

claude mcp add --scope user \
  --header "X-Streamkap-Client-ID: your-client-id" \
  --header "X-Streamkap-Client-Secret: your-client-secret" \
  --transport http \
  streamkap https://mcp.streamkap.com/mcp

Cursor, Windsurf

Add to .cursor/mcp.json (Cursor) or ~/.codeium/windsurf/mcp_config.json (Windsurf):

{
  "mcpServers": {
    "streamkap": {
      "type": "http",
      "url": "https://mcp.streamkap.com/mcp",
      "headers": {
        "X-Streamkap-Client-ID": "your-client-id",
        "X-Streamkap-Client-Secret": "your-client-secret"
      }
    }
  }
}

VS Code Copilot

Add to .vscode/mcp.json (note the different schema -- servers instead of mcpServers):

{
  "servers": {
    "streamkap": {
      "type": "http",
      "url": "https://mcp.streamkap.com/mcp",
      "headers": {
        "X-Streamkap-Client-ID": "your-client-id",
        "X-Streamkap-Client-Secret": "your-client-secret"
      }
    }
  }
}

Claude Desktop

Claude Desktop requires the absolute path to your Node.js binary because it does not source your shell environment. See the Claude Desktop setup guide on docs.streamkap.com for the full configuration.

Example prompts

Once connected, ask your AI agent things like:

  • "Give me an overview of my infrastructure"
  • "Are any of my pipelines broken? Show me the details"
  • "Check the logs for any errors in the last hour"
  • "Produce a test message to my-topic"
  • "Find all DLQ topics and check for errors"

Capabilities

  • Pipelines -- create, update, delete, monitor metrics and logs, bulk operations
  • Sources -- manage CDC connectors (MySQL, PostgreSQL, MongoDB, etc.), deploy, pause, resume, restart, stop, snapshots
  • Destinations -- manage sinks (Snowflake, BigQuery, ClickHouse, etc.), deploy, pause, resume, restart, stop, monitor lag
  • Transforms -- manage stream processors, deploy to preview or production, run unit tests, clone
  • Topics -- list, inspect, create Kafka topics, read sample messages
  • Tags -- organise and search resources by tag
  • Schema Registry -- browse subjects and schemas
  • Consumer Groups -- inspect lag, identify stuck consumers, reset offsets
  • Dashboard & Logs -- organisation statistics, data lineage, search and filter logs
  • Alerts -- manage notification subscribers and preferences
  • Usage -- query and export usage metrics
  • Kafka Access -- manage direct-Kafka users
  • Cluster Scaling -- inspect cluster status, scale up or down
  • Direct Kafka -- produce and consume messages with optional Schema Registry encoding (Avro, JSON Schema, Protobuf)

Adding Kafka Access

Shortcut: a Project Key replaces all of the individual env vars below with a single value (see Project Keys above).

To enable the direct Kafka tools and Schema Registry encoding, create a Kafka user from the Kafka Access page in your Streamkap dashboard. The dashboard gives you the bootstrap servers, username, and password. All three are required together -- Streamkap's Kafka proxy always requires SASL/SSL, so partial credentials will fail at startup with a clear error.

Add them to your existing config:

{
  "env": {
    "STREAMKAP_CLIENT_ID": "your-client-id",
    "STREAMKAP_CLIENT_SECRET": "your-client-secret",
    "KAFKA_BOOTSTRAP_SERVERS": "your-kafka-proxy:9092",
    "KAFKA_API_KEY": "your-kafka-username",
    "KAFKA_API_SECRET": "your-kafka-password",
    "SCHEMA_REGISTRY_URL": "https://your-schema-registry:8081"
  }
}

For the CLI, export the same values as environment variables in your shell.
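For example, exporting the same values from the MCP env block above in your shell (all values are placeholders from your dashboard):

```shell
# Same values as the MCP "env" block, exported for CLI use.
export STREAMKAP_CLIENT_ID="your-client-id"
export STREAMKAP_CLIENT_SECRET="your-client-secret"
export KAFKA_BOOTSTRAP_SERVERS="your-kafka-proxy:9092"
export KAFKA_API_KEY="your-kafka-username"
export KAFKA_API_SECRET="your-kafka-password"
export SCHEMA_REGISTRY_URL="https://your-schema-registry:8081"
```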


Environment Variables

Core

Set a Project Key or a Client ID + Client Secret pair (individual vars win on conflict).

| Variable | Description |
|----------|-------------|
| STREAMKAP_PROJECT_KEY | Base64-encoded Project Key (see Project Keys) |
| STREAMKAP_CLIENT_ID | Streamkap API client ID |
| STREAMKAP_CLIENT_SECRET | Streamkap API client secret |
| STREAMKAP_API_URL | Override the API base URL (default https://api.streamkap.com) |

Kafka (optional)

The three Kafka variables below must be set together -- bootstrap servers alone is not enough. Create a Kafka user from the Kafka Access page in your Streamkap dashboard to get all three at once.

| Variable | Description |
|----------|-------------|
| KAFKA_BOOTSTRAP_SERVERS | Kafka broker addresses (required when using direct Kafka tools) |
| KAFKA_API_KEY | Kafka SASL username (required when using direct Kafka tools) |
| KAFKA_API_SECRET | Kafka SASL password (required when using direct Kafka tools) |
| SCHEMA_REGISTRY_URL | Schema Registry URL -- enables Avro / JSON Schema / Protobuf encode and decode |
| SCHEMA_REGISTRY_USERNAME | Schema Registry basic auth username |
| SCHEMA_REGISTRY_PASSWORD | Schema Registry basic auth password |

Tool filtering (optional)

Shrink the catalog an agent sees -- useful for context-constrained clients (Flink Agents, small-context models) and for scoping what a key can do.

| Variable | Description |
|----------|-------------|
| MCP_TOOL_PROFILE | full (default), read-only, agent-operator, or infra-admin |
| MCP_ALLOW_TOOLS | Comma-separated whitelist -- when set, only these tools appear in tools/list |
| MCP_BLOCK_TOOLS | Comma-separated blacklist -- removed from the catalog |

Blocked tools are hidden from tools/list, not just rejected at call time. Project Keys embed these fields directly.
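For example, to narrow the catalog for a small-context agent (the tool names in MCP_BLOCK_TOOLS are illustrative; use the names from your own tools/list):

```shell
# Restrict the catalog to the read-only profile.
export MCP_TOOL_PROFILE=read-only

# Or hide specific tools entirely (names here are illustrative).
export MCP_BLOCK_TOOLS="kafka_produce,cluster_scale"
```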


License

Elastic License 2.0. See LICENSE.