@mothership/rebill-sdk

v0.1.4

TypeScript SDK for querying Rebill model predictions from Databricks.
Installation
```sh
pnpm add @mothership/rebill-sdk
```

Or with npm:

```sh
npm install @mothership/rebill-sdk
```

Prerequisites
1. Create a Service Principal in Databricks
You'll need OAuth M2M credentials (service principal) to authenticate with Databricks:
- Go to your Databricks workspace
- Navigate to Settings → Identity and Access → Service Principals
- Click Add Service Principal
- Note the Application ID (this is your `clientId`)
- Generate an OAuth Secret (this is your `clientSecret`)
- Grant the service principal appropriate permissions:
  - CAN USE on the SQL Warehouse
  - SELECT permission on the `rebill_model.v1.predictions` table
2. Get Your SQL Warehouse ID
- Navigate to SQL Warehouses in Databricks
- Select the `rebill_model` warehouse
- Copy the Warehouse ID from the URL or settings
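If you prefer to script this step, the warehouse ID is the path segment after `/sql/warehouses/` in the warehouse's URL. A small hypothetical helper, assuming the usual Databricks URL shape:

```typescript
// Hypothetical helper: extract the warehouse ID from a SQL Warehouse URL.
// Assumes the usual shape https://<workspace>/sql/warehouses/<warehouse-id>.
function warehouseIdFromUrl(url: string): string | undefined {
  const match = new URL(url).pathname.match(/\/sql\/warehouses\/([^/?#]+)/);
  return match?.[1];
}
```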
Usage
Basic Example
```typescript
import { RebillClient, getRecordByEdiInvoiceInboundId } from '@mothership/rebill-sdk';

async function main() {
  // Create client with OAuth M2M credentials
  const client = new RebillClient({
    workspaceUrl: 'https://mothership-production.cloud.databricks.com',
    clientId: process.env.DATABRICKS_CLIENT_ID!,
    clientSecret: process.env.DATABRICKS_CLIENT_SECRET!,
    warehouseId: process.env.DATABRICKS_WAREHOUSE_ID!,
  });

  try {
    // Connect to Databricks
    await client.connect();

    // Query a prediction by EDI invoice inbound ID
    const record = await getRecordByEdiInvoiceInboundId(client, '12345');

    if (record) {
      console.log('Model:', record.model_name, 'v' + record.model_version);
      console.log('Customer Summary:', record.prediction.customer_summary);
      console.log('Line Items:', record.prediction.rebill_line_items.length);

      // Show details of each rebill line item
      record.prediction.rebill_line_items.forEach((item, index) => {
        console.log(`\nLine Item ${index + 1}:`);
        console.log('  Type:', item.type);
        console.log('  Amount: $', item.amount);
        console.log('  Description:', item.line_item_description);
        console.log('  Root Cause:', item.root_cause);
      });

      // Calculate total amount from line items
      const totalAmount = record.prediction.rebill_line_items.reduce(
        (sum, item) => sum + item.amount,
        0
      );
      console.log('\nTotal Amount: $', totalAmount.toFixed(2));
    } else {
      console.log('No prediction found');
    }
  } finally {
    // Always close the connection
    await client.close();
  }
}

main().catch(console.error);
```

Query Multiple Records
The SDK batches multiple IDs into a single Databricks query, avoiding one round trip per ID:
```typescript
import { RebillClient, getRecordsByEdiInvoiceInboundIds } from '@mothership/rebill-sdk';

const client = new RebillClient({ /* config */ });
await client.connect();

const ids = ['12345', '67890', '11111'];

// Makes a single batched query to Databricks
const records = await getRecordsByEdiInvoiceInboundIds(client, ids);

console.log(`Found ${records.length} predictions`);

records.forEach(record => {
  const types = record.prediction.rebill_line_items.map(item => item.type).join(', ');
  const total = record.prediction.rebill_line_items.reduce((sum, item) => sum + item.amount, 0);
  console.log(`${record.edi_invoice_inbound_id}: ${types} ($${total.toFixed(2)})`);
});

await client.close();
```

Query Options
```typescript
// Don't parse the prediction JSON (return it as a string)
const record = await getRecordByEdiInvoiceInboundId(client, '12345', {
  parsePrediction: false,
});
```

```typescript
// Custom timeout
const record = await getRecordByEdiInvoiceInboundId(client, '12345', {
  timeout: 60000, // 60 seconds
});
```

API Reference
RebillClient
Main client class for interacting with Databricks.
Constructor
```typescript
new RebillClient(config: OAuthConfig)
```

Methods

- `connect(): Promise<void>` - Initialize the connection to Databricks
- `close(): Promise<void>` - Close the connection
Query Functions
- `getRecordByEdiInvoiceInboundId(client, ediInvoiceInboundId, options?)` - Query a single prediction by EDI invoice inbound ID
- `getRecordsByEdiInvoiceInboundIds(client, ediInvoiceInboundIds, options?)` - Query multiple records in a single batched query
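The batched lookup presumably expands the ID list into a single `IN` clause. A hypothetical sketch of that query construction (illustrative only; the SDK's actual SQL and parameter binding may differ):

```typescript
// Hypothetical sketch: collapse many IDs into one parameterized IN clause,
// so a single round trip to the warehouse serves all lookups.
function buildBatchQuery(ids: string[]): { sql: string; params: string[] } {
  if (ids.length === 0) {
    throw new Error('at least one ID is required');
  }
  const placeholders = ids.map(() => '?').join(', ');
  return {
    sql: `SELECT * FROM rebill_model.v1.predictions WHERE edi_invoice_inbound_id IN (${placeholders})`,
    params: ids,
  };
}
```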
Types
```typescript
interface OAuthConfig {
  workspaceUrl: string;
  clientId: string;
  clientSecret: string;
  warehouseId: string;
}

interface PredictionRecord {
  inference_id: string;
  edi_invoice_inbound_id: string;
  model_name: string;
  model_version: string;
  input_hash: string;
  prediction_json: string;
  processed_at: Date;
}

interface ParsedPrediction extends Omit<PredictionRecord, 'prediction_json'> {
  prediction: RebillPrediction;
}
```
```typescript
interface RebillLineItem {
  type: string;
  amount: number;
  line_item_description: string;
  root_cause: string;
  supporting_documentation: string;
}

interface RebillPrediction {
  rebill_line_items: RebillLineItem[];
  customer_summary: string;
}
```
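The relationship between the raw `prediction_json` string and the parsed `prediction` object can be illustrated with a sketch of what the default `parsePrediction: true` behavior amounts to (a hypothetical helper with a minimal local type; the SDK's own parsing may add validation):

```typescript
// Illustration of the default parsePrediction behavior:
// the raw prediction_json string becomes a structured prediction object.
// Minimal local type so this sketch is self-contained.
interface RebillPredictionShape {
  rebill_line_items: { type: string; amount: number }[];
  customer_summary: string;
}

function parsePredictionJson(prediction_json: string): RebillPredictionShape {
  // JSON.parse alone does not guarantee the shape; the SDK surfaces
  // invalid JSON as a PARSE_ERROR instead.
  return JSON.parse(prediction_json) as RebillPredictionShape;
}
```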
```typescript
interface QueryOptions {
  parsePrediction?: boolean;
  timeout?: number;
}
```

Error Handling
The SDK throws `RebillSDKError` for all errors:

```typescript
import { RebillClient, getRecordByEdiInvoiceInboundId, RebillSDKError } from '@mothership/rebill-sdk';

try {
  const record = await getRecordByEdiInvoiceInboundId(client, '12345');
} catch (error) {
  if (error instanceof RebillSDKError) {
    console.error('SDK Error:', error.code, error.message);
    console.error('Original Error:', error.originalError);
  }
}
```

Error codes:

- `CONNECTION_ERROR` - Failed to connect to Databricks
- `NOT_CONNECTED` - Attempted to query without connecting
- `QUERY_ERROR` - Query execution failed
- `PARSE_ERROR` - Failed to parse prediction JSON
- `CLOSE_ERROR` - Failed to close the connection
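Connection and query failures are sometimes transient, so it can be worth retrying only those codes. A hypothetical retry helper (the `RebillSDKError` class below is a minimal local stand-in so the sketch is self-contained; in real code, import it from the SDK):

```typescript
// Minimal local stand-in for the SDK's RebillSDKError, for illustration only;
// in real code, import RebillSDKError from '@mothership/rebill-sdk'.
class RebillSDKError extends Error {
  constructor(public code: string, message: string, public originalError?: unknown) {
    super(message);
  }
}

// Hypothetical helper: retry only the error codes that may be transient.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  const transient = new Set(['CONNECTION_ERROR', 'QUERY_ERROR']);
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      // NOT_CONNECTED, PARSE_ERROR, etc. will not succeed on retry
      if (!(error instanceof RebillSDKError) || !transient.has(error.code)) {
        throw error;
      }
    }
  }
  throw lastError;
}
```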
Environment Variables
It's recommended to store credentials in environment variables:
```sh
# .env
DATABRICKS_CLIENT_ID=your-service-principal-id
DATABRICKS_CLIENT_SECRET=your-service-principal-secret
DATABRICKS_WAREHOUSE_ID=your-warehouse-id
DATABRICKS_WORKSPACE_URL=https://mothership-production.cloud.databricks.com
```

Then use with dotenv:
```typescript
import 'dotenv/config';
import { RebillClient } from '@mothership/rebill-sdk';

const client = new RebillClient({
  workspaceUrl: process.env.DATABRICKS_WORKSPACE_URL!,
  clientId: process.env.DATABRICKS_CLIENT_ID!,
  clientSecret: process.env.DATABRICKS_CLIENT_SECRET!,
  warehouseId: process.env.DATABRICKS_WAREHOUSE_ID!,
});
```

Development
Setup
```sh
# Install dependencies
pnpm install

# Set up git hooks (husky)
pnpm prepare
```

Building
```sh
# Build the SDK
pnpm build

# Type check without emitting files
pnpm build:check

# Watch mode for development
pnpm watch
```

Testing
```sh
# Run tests once
pnpm test

# Run tests in watch mode
pnpm test:watch

# Run tests with coverage report
pnpm test:coverage
```

Linting & Formatting
The SDK uses Biome for linting and formatting:
```sh
# Check formatting
pnpm format:check

# Fix formatting issues
pnpm format:fix

# Check for linting issues
pnpm lint:check

# Fix linting issues automatically
pnpm lint:fix

# Run both linting and formatting checks
pnpm lint
```

Pre-commit Hooks
The project uses Husky to run linting automatically before commits. The pre-commit hook will:
- Run `pnpm lint:fix` to automatically fix linting and formatting issues
- Prevent the commit if there are unfixable issues
Clean
```sh
# Remove build artifacts and coverage reports
pnpm clean
```