
@databricks/lakebase

PostgreSQL driver for Databricks Lakebase Autoscaling with automatic OAuth token refresh.

Overview

@databricks/lakebase provides a drop-in replacement for the standard pg connection pool that automatically handles OAuth authentication for Databricks Lakebase Autoscaling (OLTP) databases.

It:

  • Returns a standard pg.Pool, so it works with any PostgreSQL library or ORM (see the transaction sketch below)
  • Automatically refreshes OAuth tokens (1-hour lifetime, refreshed with a 2-minute buffer before expiry)
  • Caches tokens to minimize API calls
  • Requires zero configuration when the standard environment variables are set
  • Offers optional OpenTelemetry instrumentation

NOTE: This package is NOT compatible with Databricks Lakebase Provisioned databases.
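
Because the returned pool is a plain pg.Pool, standard pg patterns such as checking out a client for a transaction work unchanged. A minimal sketch (the users table is hypothetical):

import { createLakebasePool } from "@databricks/lakebase";

const pool = createLakebasePool();

// Check out a dedicated client for a transaction, exactly as with pg.Pool.
const client = await pool.connect();
try {
  await client.query("BEGIN");
  await client.query("INSERT INTO users (name) VALUES ($1)", ["Ada"]);
  await client.query("COMMIT");
} catch (err) {
  await client.query("ROLLBACK");
  throw err;
} finally {
  client.release(); // Return the client to the pool.
}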

Installation

npm install @databricks/lakebase

Quick Start

Using Environment Variables

Set the following environment variables:

export PGHOST=your-lakebase-host.databricks.com
export PGDATABASE=your_database_name
export LAKEBASE_ENDPOINT=projects/{project-id}/branches/{branch-id}/endpoints/{endpoint-id}
export PGUSER=your-service-principal-id
export PGSSLMODE=require

Then use the driver:

import { createLakebasePool } from "@databricks/lakebase";

const pool = createLakebasePool();
const result = await pool.query("SELECT * FROM users");
console.log(result.rows);

With Explicit Configuration

import { createLakebasePool } from "@databricks/lakebase";

const pool = createLakebasePool({
  host: "your-lakebase-host.databricks.com",
  database: "your_database_name",
  endpoint:
    "projects/{project-id}/branches/{branch-id}/endpoints/{endpoint-id}",
  user: "service-principal-id", // Optional, defaults to DATABRICKS_CLIENT_ID
  max: 10, // Connection pool size
});

Authentication

The driver supports Databricks authentication via:

  1. Default auth chain (.databrickscfg, environment variables)
  2. Service principal (DATABRICKS_CLIENT_ID + DATABRICKS_CLIENT_SECRET)
  3. OAuth tokens (via Databricks SDK)

See Databricks authentication docs for configuration.
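
For example, service-principal auth (option 2) is typically configured entirely through environment variables, with no code changes. A sketch, assuming the standard Databricks SDK variables (the host value is illustrative):

export DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
export DATABRICKS_CLIENT_ID=your-service-principal-id
export DATABRICKS_CLIENT_SECRET=your-secret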

Configuration

| Option                  | Environment Variable           | Description                      | Default         |
| ----------------------- | ------------------------------ | -------------------------------- | --------------- |
| host                    | PGHOST                         | Lakebase host                    | Required        |
| database                | PGDATABASE                     | Database name                    | Required        |
| endpoint                | LAKEBASE_ENDPOINT              | Endpoint resource path           | Required        |
| user                    | PGUSER or DATABRICKS_CLIENT_ID | Username or service principal ID | Auto-detected   |
| port                    | PGPORT                         | Port number                      | 5432            |
| sslMode                 | PGSSLMODE                      | SSL mode                         | require         |
| max                     | -                              | Max pool connections             | 10              |
| idleTimeoutMillis       | -                              | Idle connection timeout (ms)     | 30000           |
| connectionTimeoutMillis | -                              | Connection timeout (ms)          | 10000           |
| logger                  | -                              | Logger instance or config        | { error: true } |
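
As a rough sketch of how these options combine, here is a pool with explicit size and timeout settings (the values are illustrative, not recommendations):

import { createLakebasePool } from "@databricks/lakebase";

const pool = createLakebasePool({
  host: "your-lakebase-host.databricks.com",
  database: "your_database_name",
  endpoint: "projects/{project-id}/branches/{branch-id}/endpoints/{endpoint-id}",
  max: 20, // Allow up to 20 concurrent connections
  idleTimeoutMillis: 60000, // Close connections idle for more than 60s
  connectionTimeoutMillis: 5000, // Fail fast if no connection is available within 5s
});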

Logging

By default, the driver logs errors only. You can configure logging in three ways:

1. Config-Based Logger (Simple)

Enable/disable specific log levels using boolean flags:

import { createLakebasePool } from "@databricks/lakebase";

// Development mode: enable debug and error logs
const pool = createLakebasePool({
  logger: { debug: true, error: true },
});

// Production mode: errors only (same as default)
const pool = createLakebasePool({
  logger: { error: true },
});

// Verbose mode: all logs enabled
const pool = createLakebasePool({
  logger: { debug: true, info: true, warn: true, error: true },
});

// Silent mode: all logs disabled
const pool = createLakebasePool({
  logger: { debug: false, info: false, warn: false, error: false },
});

2. Custom Logger (Advanced)

Inject your own logger implementation for custom formatting or integrations:

const logger = {
  debug: (msg: string, ...args: unknown[]) => console.debug(msg, ...args),
  info: (msg: string, ...args: unknown[]) => console.log(msg, ...args),
  warn: (msg: string, ...args: unknown[]) => console.warn(msg, ...args),
  error: (msg: string, ...args: unknown[]) => console.error(msg, ...args),
};

const pool = createLakebasePool({ logger });

3. Default Behavior

If no logger is provided, the driver defaults to error-only logging:

// These are equivalent:
const pool1 = createLakebasePool();
const pool2 = createLakebasePool({ logger: { error: true } });

When used with AppKit, logging is configured automatically; see the AppKit Integration section below.

ORM Examples

Drizzle ORM

import { drizzle } from "drizzle-orm/node-postgres";
import { createLakebasePool } from "@databricks/lakebase";
import { usersTable } from "./schema"; // Your Drizzle table definition

const pool = createLakebasePool();
const db = drizzle(pool);

const users = await db.select().from(usersTable);

Prisma

import { PrismaPg } from "@prisma/adapter-pg";
import { PrismaClient } from "@prisma/client";
import { createLakebasePool } from "@databricks/lakebase";

const pool = createLakebasePool();
const adapter = new PrismaPg(pool);
const prisma = new PrismaClient({ adapter });

const users = await prisma.user.findMany();

TypeORM

import { DataSource } from "typeorm";
import { getLakebaseOrmConfig } from "@databricks/lakebase";

const dataSource = new DataSource({
  type: "postgres",
  synchronize: true,
  ...getLakebaseOrmConfig(),
  entities: [
    // Your entity classes
  ],
});

await dataSource.initialize();

Sequelize

import { Sequelize } from "sequelize";
import { getLakebaseOrmConfig } from "@databricks/lakebase";

const sequelize = new Sequelize({
  dialect: "postgres",
  ...getLakebaseOrmConfig(),
});

OpenTelemetry Integration

The driver automatically uses OpenTelemetry's global registry when available. If your application initializes OpenTelemetry providers, the driver will automatically instrument queries and metrics with no additional configuration needed.

Setup

Install OpenTelemetry in your application:

npm install @opentelemetry/api @opentelemetry/sdk-node

Initialize OpenTelemetry in your application:

import { NodeSDK } from "@opentelemetry/sdk-node";

const sdk = new NodeSDK({
  // Your OTEL configuration
});

sdk.start(); // Registers global providers

// Now create your pool - it automatically uses the global providers
import { createLakebasePool } from "@databricks/lakebase";
const pool = createLakebasePool();

The driver calls trace.getTracer('@databricks/lakebase') and metrics.getMeter('@databricks/lakebase') internally. If no global providers are registered, these calls are no-ops.

Metrics Exported

  • lakebase.token.refresh.duration - OAuth token refresh duration (histogram, ms)
  • lakebase.query.duration - Query execution duration (histogram, ms)
  • lakebase.pool.connections.total - Total connections in pool (gauge)
  • lakebase.pool.connections.idle - Idle connections (gauge)
  • lakebase.pool.connections.waiting - Clients waiting for connection (gauge)
  • lakebase.pool.errors - Pool errors by error code (counter)
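
To see these metrics locally, one option is to register a periodic console exporter with the NodeSDK before creating the pool. A minimal sketch, assuming the standard @opentelemetry/sdk-metrics APIs:

import { NodeSDK } from "@opentelemetry/sdk-node";
import {
  PeriodicExportingMetricReader,
  ConsoleMetricExporter,
} from "@opentelemetry/sdk-metrics";

const sdk = new NodeSDK({
  // Print collected metrics to stdout every 10 seconds.
  metricReader: new PeriodicExportingMetricReader({
    exporter: new ConsoleMetricExporter(),
    exportIntervalMillis: 10000,
  }),
});

sdk.start();

// Pools created after sdk.start() pick up the global meter provider.
import { createLakebasePool } from "@databricks/lakebase";
const pool = createLakebasePool();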

AppKit Integration

This driver is also available as part of @databricks/appkit:

import { createLakebasePool } from "@databricks/appkit";

const pool = createLakebasePool();

Differences between the standalone and AppKit versions:

  • Standalone (@databricks/lakebase): defaults to error-only logging, as described above
  • AppKit (@databricks/appkit): automatically injects AppKit's logger with scope appkit:connectors:lakebase

Learn more about Lakebase Autoscaling

For Lakebase Autoscaling documentation, see docs.databricks.com/aws/en/oltp/projects.