# databricks-mcp

Wrap a Databricks HTTP MCP server as a local stdio MCP server.
## Quick Start

Run directly with npx:

```sh
npx -y @yglabs/databricks-mcp@latest --url <databricks-mcp-url>
```

Global install:

```sh
npm install -g @yglabs/databricks-mcp
databricks-mcp --url <databricks-mcp-url>
```

Prebuilt npm binaries currently ship for:

- win-x64
- win-arm64
- linux-x64
- linux-arm64
- osx-x64
- osx-arm64
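Once running, the wrapper behaves like any MCP stdio server: it speaks newline-delimited JSON-RPC over stdin/stdout. As a hypothetical illustration (not part of this package), a client could spawn the wrapper and send an `initialize` request like this; the spawn is guarded by an environment variable so the sketch is safe to run standalone, and `DATABRICKS_MCP_URL` is a placeholder:

```typescript
import { spawn } from "node:child_process";

// Build a JSON-RPC 2.0 initialize request. MCP stdio messages are
// newline-delimited JSON, and the server echoes the id back in its reply.
function initializeRequest(id: number): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05",
      capabilities: {},
      clientInfo: { name: "example-client", version: "0.0.0" },
    },
  });
}

// Guarded so this sketch does not actually spawn a process unless asked to.
if (process.env.RUN_EXAMPLE) {
  const child = spawn(
    "npx",
    ["-y", "@yglabs/databricks-mcp@latest", "--url", process.env.DATABRICKS_MCP_URL ?? ""],
    { stdio: ["pipe", "pipe", "inherit"] }
  );
  child.stdin.write(initializeRequest(1) + "\n");
  child.stdout.on("data", (chunk) => process.stdout.write(String(chunk)));
}
```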
## Usage

```sh
databricks-mcp --url <databricks-mcp-url> [--account <upn>] [--client-id <id>]
databricks-mcp mcp --url <databricks-mcp-url> [--account <upn>] [--client-id <id>]
databricks-mcp version
```

Options:

- `--url`: required; the upstream HTTP MCP server URL
- `--account`: optional login hint
- `--client-id`: optional OAuth client identifier; defaults to `databricks-cli`

Example:

```sh
databricks-mcp --url https://adb-3675736172811358.18.azuredatabricks.net/api/2.0/mcp/genie/01f13d4d9c151fba809b20ec1957d6ea
```

## Behavior
- Local transport is stdio MCP.
- Upstream transport is HTTP MCP.
- Requests and notifications are proxied to the upstream server.
- JSON-RPC request ids are preserved when requests are forwarded upstream.
- `initialize` is forwarded upstream, but the returned `serverInfo` is rewritten to identify this local stdio server. Other initialize result fields, such as `capabilities` and `instructions`, are preserved from upstream.
- Databricks Genie MCP endpoints expose tools through `tools/list`; they do not necessarily expose MCP resources or resource templates.
- Each local process proxies one upstream `--url`.
- To use multiple upstream servers at the same time, run multiple local processes with different `--url` values.
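The `serverInfo` rewrite described above can be thought of as a pure function over the upstream `initialize` result: everything is passed through except the server identity. This is a minimal sketch with hypothetical names (the actual implementation is .NET, not TypeScript):

```typescript
interface InitializeResult {
  protocolVersion: string;
  capabilities: Record<string, unknown>;
  serverInfo: { name: string; version: string };
  instructions?: string;
}

// Keep everything from the upstream initialize result, but present this
// local stdio process as the server the client is talking to.
function rewriteInitializeResult(upstream: InitializeResult): InitializeResult {
  return {
    ...upstream,
    serverInfo: { name: "databricks-mcp", version: "0.1.0" },
  };
}
```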
## MCP Client Configuration

```toml
[mcp_servers.databricks_mcp]
enabled = true
command = "npx"
args = ["-y", "@yglabs/databricks-mcp@latest", "--url", "https://adb-3675736172811358.18.azuredatabricks.net/api/2.0/mcp/genie/01f13d4d9c151fba809b20ec1957d6ea"]
```

## Authentication
Authentication uses a generic OIDC authorization-code flow with PKCE. The on-disk token cache is protected with the same cross-platform secure storage library family used by @microsoft/workiq.
- OAuth authority and scopes are discovered from the upstream HTTP MCP server challenge and protected-resource metadata.
- Scope selection uses the first advertised non-identity API scope, and also requests `offline_access` when the upstream metadata advertises it.
- Interactive sign-in always uses the system browser and the fixed loopback redirect URI `http://localhost:8020` by default; override the port with `DATABRICKS_MCP_REDIRECT_PORT` if needed.
- Persisted default account hints use `defaultAccount:<hash>` keys in `~/.databricks-mcp/.databricks-mcp.json`, scoped by `authority + clientId + scopes`.
- Token cache data is stored under `~/.databricks-mcp/oidc_token_cache.dat` using OS-protected persistence through `Microsoft.Identity.Client.Extensions.Msal`.
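The PKCE portion of the flow above follows RFC 7636: the client generates a random code verifier and sends the base64url-encoded SHA-256 of it as the S256 code challenge. A minimal sketch using Node's `crypto` module (the package itself implements this in .NET; names here are illustrative):

```typescript
import { createHash, randomBytes } from "node:crypto";

// base64url encoding per RFC 7636: standard base64 with URL-safe
// characters and padding stripped.
function base64url(buf: Buffer): string {
  return buf
    .toString("base64")
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=+$/, "");
}

// Generate a verifier/challenge pair for the S256 code challenge method.
function makePkcePair(): { verifier: string; challenge: string } {
  const verifier = base64url(randomBytes(32)); // 43 chars, within RFC 7636 limits
  const challenge = base64url(createHash("sha256").update(verifier).digest());
  return { verifier, challenge };
}
```

The verifier is kept locally and sent only in the token request, so the authorization code is useless to anyone who intercepts it without the verifier.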
## Development

Build the project:

```sh
dotnet build .\src\DatabricksMcp\DatabricksMcp.csproj
```

Trace logging:
- Set `DATABRICKS_MCP_TRACE=1` to enable trace logging.
- By default, trace output is written to `~/.databricks-mcp/trace.log`.
- Set `DATABRICKS_MCP_TRACE_FILE` to override the trace file path.
- Set `DATABRICKS_MCP_TRACE_STDERR=1` only when you also want trace lines mirrored to `stderr`.
Publish native binaries into the npm package layout:

```sh
npm run publish:npm
```

The default publish script builds all supported RIDs listed above.

Limit local packaging to a single RID:

```sh
npm run publish:npm -- --rids win-x64
```

Verify the npm tarball layout and executable shim:

```sh
npm run release:verify
```

Create a local tarball:

```sh
npm run pack:npm
```

Publish to npm:

```sh
npm run publish:registry
```

Override the staged package name or version during packing or publish:

```sh
npm run pack:npm -- --package-name @yglabs/databricks-mcp --package-version 0.1.0
```