
@yglabs/databricks-mcp

v0.1.0


Wrap a Databricks HTTP MCP server as a local stdio MCP server.

Readme

databricks-mcp

Wrap a Databricks HTTP MCP server as a local stdio MCP server.

Quick Start

Run directly with npx:

npx -y @yglabs/databricks-mcp@latest --url <databricks-mcp-url>

Global install:

npm install -g @yglabs/databricks-mcp
databricks-mcp --url <databricks-mcp-url>

Prebuilt npm binaries currently ship for:

  • win-x64
  • win-arm64
  • linux-x64
  • linux-arm64
  • osx-x64
  • osx-arm64

Usage

databricks-mcp --url <databricks-mcp-url> [--account <upn>] [--client-id <id>]
databricks-mcp mcp --url <databricks-mcp-url> [--account <upn>] [--client-id <id>]
databricks-mcp version

Options:

  • --url: required upstream HTTP MCP server URL
  • --account: optional login hint
  • --client-id: optional OAuth client identifier, defaults to databricks-cli

Example:

databricks-mcp --url https://adb-3675736172811358.18.azuredatabricks.net/api/2.0/mcp/genie/01f13d4d9c151fba809b20ec1957d6ea

Behavior

  • Local transport is stdio MCP.
  • Upstream transport is HTTP MCP.
  • Requests and notifications are proxied to the upstream server.
  • JSON-RPC request ids are preserved when requests are forwarded upstream.
  • initialize is forwarded upstream, but the returned serverInfo is rewritten to identify this local stdio server. Other initialize result fields, such as capabilities and instructions, are preserved from upstream.
  • Databricks Genie MCP endpoints expose tools through tools/list; they do not necessarily expose MCP resources or resource templates.
  • Each local process proxies one upstream --url.
  • To use multiple upstream servers at the same time, run multiple local processes with different --url values.
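Given the serverInfo rewrite described above, a local client's initialize response would look roughly like the following sketch. All field values here are illustrative, not captured output; capabilities and instructions pass through from upstream unchanged:

```json
{
  "jsonrpc": "2.0",
  "id": 0,
  "result": {
    "protocolVersion": "2024-11-05",
    "capabilities": { "tools": {} },
    "serverInfo": { "name": "databricks-mcp", "version": "0.1.0" },
    "instructions": "..."
  }
}
```

Only serverInfo is rewritten to identify the local stdio proxy; the JSON-RPC id matches whatever id the client sent.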

MCP Client Configuration

[mcp_servers.databricks_mcp]
enabled = true
command = "npx"
args = ["-y", "@yglabs/databricks-mcp@latest", "--url", "https://adb-3675736172811358.18.azuredatabricks.net/api/2.0/mcp/genie/01f13d4d9c151fba809b20ec1957d6ea"]
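Since each local process proxies exactly one upstream --url (see Behavior), pointing a client at a second upstream means a second config entry. A sketch with a hypothetical second Genie endpoint (the space id placeholder is not a real endpoint):

```toml
[mcp_servers.databricks_mcp_sales]
enabled = true
command = "npx"
args = ["-y", "@yglabs/databricks-mcp@latest", "--url", "https://adb-3675736172811358.18.azuredatabricks.net/api/2.0/mcp/genie/<second-genie-space-id>"]
```

The client then launches one databricks-mcp process per entry, each with its own stdio transport.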

Authentication

Authentication uses a generic OIDC authorization-code flow with PKCE. The on-disk token cache is protected with the same cross-platform secure-storage library family that @microsoft/workiq uses.

  • OAuth authority and scopes are discovered from the upstream HTTP MCP server challenge and protected-resource metadata.
  • Scope selection uses the first advertised non-identity API scope, and also requests offline_access when the upstream metadata advertises it.
  • Interactive sign-in uses the system browser with a loopback redirect URI, http://localhost:8020 by default; override the port with DATABRICKS_MCP_REDIRECT_PORT if needed.
  • Persisted default account hints use defaultAccount:<hash> keys in ~/.databricks-mcp/.databricks-mcp.json, scoped by authority + clientId + scopes.
  • Token cache data is stored under ~/.databricks-mcp/oidc_token_cache.dat using OS-protected persistence through Microsoft.Identity.Client.Extensions.Msal.
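If port 8020 is already taken, the redirect port can be moved before launching the proxy. The port value below is an arbitrary example; the env var name comes from the list above:

```shell
# Default OAuth loopback redirect port is 8020; pick a free one instead.
export DATABRICKS_MCP_REDIRECT_PORT=9455
echo "OAuth redirect URI: http://localhost:${DATABRICKS_MCP_REDIRECT_PORT}"
# databricks-mcp --url <databricks-mcp-url>
```

The variable must be set in the environment of whatever process launches databricks-mcp, typically the MCP client itself.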

Development

Build the project:

dotnet build .\src\DatabricksMcp\DatabricksMcp.csproj

Trace logging:

  • Set DATABRICKS_MCP_TRACE=1 to enable trace logging.
  • By default, trace output is written to ~/.databricks-mcp/trace.log.
  • Set DATABRICKS_MCP_TRACE_FILE to override the trace file path.
  • Set DATABRICKS_MCP_TRACE_STDERR=1 to additionally mirror trace lines to stderr.
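Putting the three trace variables together: the log path below is an arbitrary example (the default is ~/.databricks-mcp/trace.log); the variable names come from the list above:

```shell
# Enable tracing, write to a custom file, and mirror lines to stderr.
export DATABRICKS_MCP_TRACE=1
export DATABRICKS_MCP_TRACE_FILE="$HOME/databricks-mcp-debug.log"
export DATABRICKS_MCP_TRACE_STDERR=1
echo "tracing to ${DATABRICKS_MCP_TRACE_FILE}"
# databricks-mcp --url <databricks-mcp-url>
```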

Publish native binaries into the npm package layout:

npm run publish:npm

The default publish script builds all supported RIDs listed above.

Limit local packaging to a single RID:

npm run publish:npm -- --rids win-x64

Verify the npm tarball layout and executable shim:

npm run release:verify

Create a local tarball:

npm run pack:npm

Publish to npm:

npm run publish:registry

Override the staged package name or version during packing or publish:

npm run pack:npm -- --package-name @yglabs/databricks-mcp --package-version 0.1.0