
@kyuda/n8n-nodes-databricks-simplified

v1.0.0

Comprehensive Databricks integration for n8n — SQL, Unity Catalog, Jobs, Clusters, Files, Genie, Vector Search, and Model Serving

@kyuda/n8n-nodes-databricks

Comprehensive Databricks integration for n8n. Provides nodes for SQL, Unity Catalog, Jobs, Clusters, Files, Genie AI, Vector Search, Model Serving, and LangChain AI Agent / Chat / Embedding / Vector Store nodes.

Features

  • Genie AI Assistant -- Start conversations, send messages, and execute SQL queries through Databricks' AI assistant
  • Databricks SQL -- Execute SQL queries with automatic chunked result retrieval
  • Unity Catalog -- Manage catalogs, schemas, tables, volumes, and functions
  • Model Serving -- Query deployed ML/LLM endpoints with auto-detected input format
  • Files -- Upload, download, list, and manage files in Unity Catalog volumes (up to 5 GiB)
  • Vector Search -- Semantic similarity search with reranking support
  • Jobs -- Run, create, list, cancel, and monitor Databricks job runs
  • Clusters -- List, create, start, stop, and delete Databricks compute clusters
  • LmChatDatabricks -- LangChain Chat Model node for Databricks Foundation Model endpoints
  • EmbeddingsDatabricks -- LangChain Embeddings node for Databricks embedding endpoints
  • VectorStoreDatabricks -- LangChain Vector Store node backed by Databricks Vector Search
  • DatabricksAiAgent -- Full AI Agent node with optional MLflow tracing
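The "automatic chunked result retrieval" mentioned for the SQL node can be sketched roughly as follows. This is an illustrative model, not the package's actual internals: the `ResultChunk` shape, `fetchChunk` callback, and `collectAllChunks` name are all hypothetical stand-ins for how a chunked SQL result API is typically consumed.

```typescript
// Hypothetical sketch: large SQL result sets arrive in chunks, each pointing
// at the index of the next chunk. Names and shapes here are illustrative.
interface ResultChunk {
  rows: unknown[][];
  nextChunkIndex: number | null; // null when this is the final chunk
}

type ChunkFetcher = (chunkIndex: number) => Promise<ResultChunk>;

// Follow next-chunk links until the result set is exhausted, accumulating rows.
async function collectAllChunks(fetchChunk: ChunkFetcher): Promise<unknown[][]> {
  const rows: unknown[][] = [];
  let index: number | null = 0;
  while (index !== null) {
    const chunk: ResultChunk = await fetchChunk(index);
    rows.push(...chunk.rows);
    index = chunk.nextChunkIndex;
  }
  return rows;
}
```

The loop terminates when a chunk reports no successor, so callers get one flat row array regardless of how the server paginated the result.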

Prerequisites

  • Node.js 18+
  • pnpm
  • A Databricks workspace with credentials (PAT or OAuth Service Principal)

Installation

For Development

git clone https://github.com/kyudahq/n8n.git
cd n8n/@kyuda/n8n-nodes-databricks
pnpm install
pnpm build

Link to your n8n installation:

npm link
cd ~/.n8n/custom
npm link @kyuda/n8n-nodes-databricks

From npm

npm install @kyuda/n8n-nodes-databricks

Credentials

Two authentication methods are supported:

Personal Access Token (default)

  1. Host -- Your Databricks workspace URL (e.g. https://adb-1234567890123456.7.azuredatabricks.net)
  2. Token -- A Databricks personal access token (generate at User Settings > Access Tokens)

OAuth M2M (Service Principal)

  1. Host -- Your Databricks workspace URL
  2. Client ID -- The application (client) ID of a Databricks service principal
  3. Client Secret -- The secret associated with the service principal

The OAuth flow uses grant_type=client_credentials against your workspace's /oidc/v1/token endpoint. Tokens are cached and refreshed automatically.
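A minimal sketch of that client-credentials exchange with an expiry-based cache is shown below. The endpoint and grant type come from the description above; the `scope` value, 60-second refresh skew, and helper names are assumptions for illustration, not necessarily what this package does internally.

```typescript
// Hedged sketch of the OAuth M2M flow: exchange client credentials for a
// bearer token at <host>/oidc/v1/token and cache it until shortly before
// expiry. Scope and skew values are illustrative assumptions.
interface CachedToken {
  accessToken: string;
  expiresAt: number; // epoch milliseconds
}

// Build the form body for the client_credentials grant.
function tokenRequestBody(): string {
  return new URLSearchParams({
    grant_type: "client_credentials",
    scope: "all-apis",
  }).toString();
}

// A token counts as stale once it is within the skew window of expiry.
function isExpired(token: CachedToken, nowMs: number, skewMs = 60_000): boolean {
  return nowMs >= token.expiresAt - skewMs;
}

async function getToken(
  host: string,
  clientId: string,
  clientSecret: string,
  cache: { token?: CachedToken },
): Promise<string> {
  const now = Date.now();
  if (cache.token && !isExpired(cache.token, now)) return cache.token.accessToken;
  const res = await fetch(`${host}/oidc/v1/token`, {
    method: "POST",
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      Authorization:
        "Basic " + Buffer.from(`${clientId}:${clientSecret}`).toString("base64"),
    },
    body: tokenRequestBody(),
  });
  const data = (await res.json()) as { access_token: string; expires_in: number };
  cache.token = {
    accessToken: data.access_token,
    expiresAt: now + data.expires_in * 1000,
  };
  return cache.token.accessToken;
}
```

Keeping the cache keyed per credential (rather than a single global) is what allows tokens to be "cached and refreshed automatically" without one credential's token leaking into another's requests.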

Resources and Operations

Databricks (main node)

| Resource | Operations |
|---|---|
| Genie | Start Conversation, Create Message, Get Message, Execute Message Query, Get Query Results, Get Space |
| Databricks SQL | Execute Query |
| Unity Catalog | Create/Get/Update/Delete Catalog, List Catalogs, Create/Get/Update/Delete Schema, List Schemas, Create/Get/Delete Volume, List Volumes, Get/List Tables, Create/Get/Delete Function, List Functions |
| Model Serving | Query Endpoint (auto-detects chat/completions/embeddings/dataframe formats) |
| Files | Upload File, Download File, Delete File, Get File Info, List Directory, Create Directory, Delete Directory |
| Vector Search | Query Index, Get Index, List Indexes |
| Jobs | Run Job, Get Run Status, Get Run Output, Create Job, List Jobs, Cancel Run, List Runs |
| Clusters | List Clusters, Get Cluster, Create Cluster, Start Cluster, Stop Cluster, Delete Cluster |

LangChain Nodes

| Node | Description |
|---|---|
| LmChatDatabricks | Chat model for Databricks Foundation Model APIs or external model endpoints |
| EmbeddingsDatabricks | Text embeddings via Databricks serving endpoints |
| VectorStoreDatabricks | Vector store backed by Databricks Vector Search indexes |
| DatabricksAiAgent | AI agent with tool calling, memory, structured output, streaming, and optional MLflow tracing |

Architecture

@kyuda/n8n-nodes-databricks/
├── credentials/
│   └── Databricks.credentials.ts      # PAT + OAuth M2M
├── nodes/
│   ├── Databricks/
│   │   ├── Databricks.node.ts          # Main node (slim dispatcher)
│   │   ├── helpers.ts                  # Shared HTTP helper, cache, security utils
│   │   └── resources/
│   │       ├── index.ts                # Barrel exports
│   │       ├── files/                  # operations, parameters, handler
│   │       ├── genie/                  # operations, parameters, handler
│   │       ├── databricksSql/          # operations, parameters, handler
│   │       ├── modelServing/           # operations, parameters, handler
│   │       ├── unityCatalog/           # operations, parameters, handler
│   │       ├── vectorSearch/           # operations, parameters, handler
│   │       ├── jobs/                   # operations, parameters, handler
│   │       └── clusters/               # operations, parameters, handler
│   ├── agents/DatabricksAiAgent/       # AI Agent with MLflow
│   ├── llms/LmChatDatabricks/          # LangChain Chat Model
│   ├── embeddings/EmbeddingsDatabricks/ # LangChain Embeddings
│   └── vector_store/VectorStoreDatabricks/ # LangChain Vector Store
└── utils/
    ├── DatabricksVectorStoreLangChain.ts
    ├── N8nTool.ts
    ├── N8nBinaryLoader.ts
    └── descriptions.ts

Execution Flow

User selects Resource → Operation → fills Parameters
        ↓
    execute() dispatcher (Databricks.node.ts)
        ↓
    handlers[resource](ctx, operation, itemIndex)
        ↓
    Per-resource handler file (e.g. resources/jobs/handler.ts)
        ↓
    databricksApiRequest() (shared helper with auth, host normalization)
        ↓
    Databricks REST API
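The dispatch step in this flow amounts to a resource-to-handler map lookup. The sketch below models that pattern; the handler signature and the stub handler bodies are hypothetical, simplified from the `handlers[resource](ctx, operation, itemIndex)` shape shown above.

```typescript
// Illustrative sketch of the "slim dispatcher" pattern: execute() resolves a
// per-resource handler from a map instead of switching over every operation.
// Handler names and return values here are hypothetical stubs.
type Handler = (operation: string, itemIndex: number) => Promise<unknown>;

const handlers: Record<string, Handler> = {
  jobs: async (operation, itemIndex) => ({ resource: "jobs", operation, itemIndex }),
  clusters: async (operation, itemIndex) => ({ resource: "clusters", operation, itemIndex }),
};

// Look up the handler for the selected resource and delegate to it.
async function dispatch(
  resource: string,
  operation: string,
  itemIndex: number,
): Promise<unknown> {
  const handler = handlers[resource];
  if (!handler) throw new Error(`Unsupported resource: ${resource}`);
  return handler(operation, itemIndex);
}
```

The payoff of this layout is that adding a resource means adding one handler file and one map entry, leaving the dispatcher itself untouched.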

Security

  • Token masking -- Bearer tokens and PAT values are scrubbed from all error messages and logs
  • URL encoding -- All user-provided path segments are passed through encodeURIComponent
  • Path traversal protection -- File operations reject paths containing ..
  • Cache isolation -- Cache keys include a hash of the credential token, preventing cross-user cache leaks
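The safeguards above can be sketched as small helpers. These are hedged illustrations of the stated behaviors; the package's actual regexes, helper names, and hash parameters may differ.

```typescript
import { createHash } from "node:crypto";

// Token masking: scrub bearer tokens and Databricks PATs (dapi-prefixed)
// from text before it reaches error messages or logs. Patterns illustrative.
function maskSecrets(text: string): string {
  return text
    .replace(/Bearer\s+[\w.~+/=-]+/g, "Bearer ***")
    .replace(/dapi[0-9a-f]+/gi, "dapi***");
}

// Path traversal protection: reject any path whose segments contain "..".
function isSafePath(path: string): boolean {
  return !path.split("/").includes("..");
}

// Cache isolation: fold a hash of the credential token into the cache key
// so users with different tokens never share cached entries.
function cacheKey(resource: string, token: string): string {
  const tokenHash = createHash("sha256").update(token).digest("hex").slice(0, 16);
  return `${resource}:${tokenHash}`;
}
```

Hashing the token (rather than embedding it) keeps the secret out of the cache key itself while still partitioning the cache per credential.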

Databricks AI Agent

The Databricks AI Agent node provides a LangChain-based agent with:

  • Tool Calling -- Supports any LangChain tool or MCP toolkit
  • Memory -- Conversation history via BaseChatMemory
  • Structured Output -- Optional output parser for validated JSON
  • Streaming -- Real-time token streaming support
  • Fallback Models -- Automatic failover to a secondary model
  • MLflow Tracing -- Optional observability with automatic experiment management

When MLflow is enabled, traces appear under /Shared/n8n-workflows-{workflow-id} with spans for agent execution, LLM calls, and tool invocations.
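That naming convention can be expressed as a one-line helper; the function name is hypothetical, but the path template matches the pattern stated above.

```typescript
// The MLflow experiment path convention described above: one experiment per
// workflow under /Shared. Helper name is illustrative.
function experimentPath(workflowId: string): string {
  return `/Shared/n8n-workflows-${workflowId}`;
}
```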

Development

pnpm build      # Build the node
pnpm test       # Run tests
pnpm lint       # Lint
pnpm lintfix    # Auto-fix lint issues

Contributing

Contributions are welcome. Please submit a Pull Request at github.com/kyudahq/n8n.

License

MIT