@mothership/rebill-model-api

TypeScript API for querying Rebill model predictions from Databricks.

Prerequisites

1. Create a Service Principal in Databricks

You'll need OAuth M2M credentials (service principal) to authenticate with Databricks:

  1. Go to your Databricks workspace
  2. Navigate to Settings → Identity and Access → Service Principals
  3. Click Add Service Principal
  4. Note the Application ID (this is your clientId)
  5. Generate an OAuth Secret (this is your clientSecret)
  6. Grant the service principal appropriate permissions:
    • CAN USE on the SQL Warehouse
    • SELECT permission on rebill_model.v1.predictions table

2. Get Your SQL Warehouse ID

  1. Navigate to SQL Warehouses in Databricks
  2. Select the rebill_model warehouse
  3. Copy the Warehouse ID from the URL or settings
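
The Usage examples below read these four values from environment variables. As a minimal sketch (the variable names mirror Basic Usage below; the helper itself is hypothetical and not part of this package), you can fail fast if any are missing:

// env-check.ts – hypothetical helper, not exported by @mothership/rebill-model-api
const REQUIRED_VARS = [
  'DATABRICKS_WORKSPACE_URL',  // e.g. https://<your-workspace>.cloud.databricks.com
  'DATABRICKS_CLIENT_ID',      // service principal Application ID
  'DATABRICKS_CLIENT_SECRET',  // service principal OAuth secret
  'DATABRICKS_WAREHOUSE_ID',   // SQL Warehouse ID from step 2 above
] as const;

export function assertDatabricksEnv(): void {
  const missing = REQUIRED_VARS.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
}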

Usage

Basic Usage

import { RebillModelClient } from '@mothership/rebill-model-api';

async function main() {
  // Initialize client with configuration
  const client = new RebillModelClient({
    workspaceUrl: process.env.DATABRICKS_WORKSPACE_URL!,
    clientId: process.env.DATABRICKS_CLIENT_ID!,
    clientSecret: process.env.DATABRICKS_CLIENT_SECRET!,
    warehouseId: process.env.DATABRICKS_WAREHOUSE_ID!,
  });

  try {
    const predictions = await client.getLatestByEdiInboundIds(['12345', '67890']);
    
    for (const prediction of predictions) {
      console.log('Rebill Line Items:', prediction.rebill_line_items.length);
      console.log('Summary:', prediction.customer_summary);
    }
  } finally {
    // Clean up connection when done
    await client.close();
  }
}

main().catch(console.error);

Available Methods

// As additional documents are received by carriers, or the model or its configuration
// is changed, the model will re-run to update results for rebills that are not in a
// terminal state.

// Get the latest result from the model for each EDI Inbound ID
await client.getLatestByEdiInboundIds([ediInboundId1, ediInboundId2]);

// Get all results from the model by EDI Inbound IDs
await client.getByEdiInboundIds([ediInboundId1, ediInboundId2]);


// Get all results from the model by Inference IDs (a unique ID created for each result)
await client.getByInferenceIds([inferenceId1, inferenceId2]);
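
The two EDI-based lookups differ in whether re-runs are included: getLatestByEdiInboundIds returns only the most recent result per rebill, while getByEdiInboundIds returns every recorded run. A small sketch of comparing the two (the ID is made up; run it inside an async function, as in Basic Usage):

// Compare the latest prediction against the full run history for one EDI Inbound ID.
const ediId = '12345'; // hypothetical ID for illustration

const [latest] = await client.getLatestByEdiInboundIds([ediId]);
const history = await client.getByEdiInboundIds([ediId]);

console.log('Latest inference:', latest?.inference_id);
console.log(`Model runs recorded for ${ediId}:`, history.length);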

Query Options

// Custom timeout in milliseconds (default: 30000)
const predictions = await client.getByEdiInboundIds(ediIds, { 
  timeoutMs: 60000 // 60 seconds
});

Types

interface RebillPrediction {
  inference_id: string;
  edi_invoice_inbound_id: string;
  processed_at: Date;
  rebill_line_items: RebillLineItem[];
  customer_summary: string;
  unstructured_response: string;
}

interface RebillLineItem {
  type: string;  // e.g., "INCORRECT_WEIGHT", "LIFT_GATE_AT_DELIVERY"
  amount: number;
  line_item_description: string;
  root_cause: string;
  supporting_documentation: string;
}

interface QueryOptions {
  timeoutMs?: number;  // Default: 30000 (30 seconds)
}
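
As an illustration of how these types compose, here is a small sketch that totals line-item amounts by type for a single prediction (a downstream reporting step, not something the package provides; it assumes the RebillPrediction type is exported from the package entry point):

import type { RebillPrediction } from '@mothership/rebill-model-api'; // export assumed

// Sum rebill line-item amounts by type, e.g. { INCORRECT_WEIGHT: 42.5, ... }
function totalsByType(prediction: RebillPrediction): Record<string, number> {
  const totals: Record<string, number> = {};
  for (const item of prediction.rebill_line_items) {
    totals[item.type] = (totals[item.type] ?? 0) + item.amount;
  }
  return totals;
}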

Error Handling

The API throws RebillException for all errors:

Error Codes:

  • DATABRICKS_CONNECTION_FAILED - Failed to connect to Databricks
  • SESSION_NOT_INITIALIZED - Internal session state error
  • QUERY_EXECUTION_FAILED - Query execution failed
  • QUERY_TIMEOUT - Query exceeded timeout duration
  • RESPONSE_PARSE_FAILED - Failed to parse JSON returned in query
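
A hedged sketch of handling these errors, assuming RebillException is exported by the package and carries the error code on a code property (neither is confirmed above); client and ediIds are as in the earlier examples, and the block runs inside an async function:

import { RebillException } from '@mothership/rebill-model-api'; // export assumed

try {
  const predictions = await client.getByEdiInboundIds(ediIds, { timeoutMs: 60000 });
  console.log('Fetched', predictions.length, 'predictions');
} catch (error) {
  if (error instanceof RebillException) {
    // code is assumed to hold one of the error codes listed above
    if (error.code === 'QUERY_TIMEOUT') {
      console.warn('Query timed out; raise timeoutMs or narrow the ID list.');
    } else {
      console.error(`Rebill API error (${error.code}): ${error.message}`);
    }
  } else {
    throw error; // not an API error; rethrow
  }
}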

Development

Setup

# Install dependencies
pnpm install

# Set up git hooks (husky)
pnpm add-hooks

The pre-commit hook will:

  • Run pnpm lint:fix to automatically fix linting and formatting issues
  • Prevent the commit if there are unfixable issues

Building

# Build the API
pnpm build

# Watch mode for development
pnpm watch

Testing

# Run tests once
pnpm test

# Run tests in watch mode
pnpm test:watch

# Run tests with coverage report
pnpm test:coverage

Linting & Formatting

The API uses Biome for linting and formatting:

# Check/fix formatting
pnpm format:check
pnpm format:fix

# Check/fix linting issues
pnpm lint:check
pnpm lint:fix

# Run both linting and formatting checks
pnpm lint

Clean

# Remove build artifacts and coverage reports
pnpm clean