# @assistant-billing/sdk

Client SDK for the Assistant Billing AI Gateway: an OpenAI-compatible API with smart routing, billing, and analytics.

A drop-in replacement for the OpenAI SDK: just change the base URL and API key.
## Install

```sh
npm install @assistant-billing/sdk
```

## Environment Variables
The SDK reads these automatically, so no further configuration is needed:

```sh
ASSISTANT_BILLING_GATEWAY_API_KEY=sk_ab_...
ASSISTANT_BILLING_GATEWAY_API_URL=https://your-gateway.example.com # optional
```

## Quick Start
```ts
import { GatewayClient } from "@assistant-billing/sdk";

// Reads ASSISTANT_BILLING_GATEWAY_API_KEY and ASSISTANT_BILLING_GATEWAY_API_URL from env
const client = new GatewayClient();

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);
```

## Vercel AI SDK Integration
Use `@assistant-billing/sdk/ai-sdk` for seamless integration with the Vercel AI SDK:

```ts
import { createGateway } from "@assistant-billing/sdk/ai-sdk";
import { generateText } from "ai";

// Reads env vars automatically
const gateway = createGateway();

const { text } = await generateText({
  model: gateway("gpt-4o"),
  prompt: "Hello!",
});
```

Or use the pre-configured default instance:

```ts
import { gateway } from "@assistant-billing/sdk/ai-sdk";
import { streamText } from "ai";

const result = streamText({
  model: gateway("claude-sonnet-4-20250514"),
  prompt: "Write a haiku",
});
```

You can also pass options explicitly:

```ts
const gateway = createGateway({
  apiKey: "sk_ab_...",
  baseURL: "https://your-gateway.example.com/v1",
});
```

## Streaming
```ts
const stream = await client.chat.completions.create({
  model: "claude-sonnet-4-20250514",
  messages: [{ role: "user", content: "Write a haiku" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

## Embeddings
```ts
const embeddings = await client.embeddings.create({
  model: "text-embedding-3-small",
  input: "Hello world",
});
```

## List Models
```ts
const models = await client.models.list();
```
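The shape of the `models.list()` response isn't documented here; since the gateway is OpenAI-compatible, it presumably follows OpenAI's `{ object: "list", data: [...] }` payload. A minimal sketch under that assumption, using a sample payload in place of a live call:

```typescript
// Hypothetical response shape, assuming the gateway mirrors OpenAI's
// GET /v1/models payload -- not verified against this SDK.
interface ModelList {
  object: string;
  data: { id: string; owned_by?: string }[];
}

// Sample payload standing in for `await client.models.list()`.
const models: ModelList = {
  object: "list",
  data: [{ id: "gpt-4o", owned_by: "openai" }, { id: "claude-sonnet-4-20250514" }],
};

// Collect just the IDs, e.g. to populate a model picker.
const ids = models.data.map((m) => m.id);
console.log(ids.join(", "));
```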
## Context Builders

Chain context for request tracking:

```ts
const response = await client
  .withUser("user-123")
  .withMetadata({ session: "abc" })
  .withProvider("openai")
  .chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Hello" }],
  });
```

## Feedback & Scores
Submit a single feedback score:

```ts
await client.feedback("hrl_xxx", 5);
```

Submit multiple scores at once (including boolean scores):

```ts
await client.scores("hrl_xxx", {
  quality: 5,
  relevance: 3,
  thumbs_up: true,
});
```

## Configuration
```ts
const client = new GatewayClient({
  apiKey: "sk_ab_...", // or ASSISTANT_BILLING_GATEWAY_API_KEY env var
  baseURL: "https://your-gateway.example.com", // or ASSISTANT_BILLING_GATEWAY_API_URL env var
  timeout: 30_000, // optional, default 60s
  defaultHeaders: { "X-Custom": "value" }, // optional
});
```
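The comments in the configuration block suggest the env vars serve as fallbacks when options are omitted. A minimal sketch of that assumed precedence (the `resolveConfig` helper below is illustrative only, not part of the SDK, and `env` stands in for `process.env`):

```typescript
// Illustrative only: sketches the precedence you'd typically expect,
// explicit options winning over env vars. Not taken from the SDK source.
const env: Record<string, string | undefined> = {
  ASSISTANT_BILLING_GATEWAY_API_KEY: "sk_ab_from_env",
};

function resolveConfig(opts: { apiKey?: string; baseURL?: string } = {}) {
  return {
    apiKey: opts.apiKey ?? env.ASSISTANT_BILLING_GATEWAY_API_KEY,
    baseURL: opts.baseURL ?? env.ASSISTANT_BILLING_GATEWAY_API_URL,
  };
}

console.log(resolveConfig().apiKey); // "sk_ab_from_env" (env fallback)
console.log(resolveConfig({ apiKey: "sk_ab_explicit" }).apiKey); // "sk_ab_explicit"
```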
## Works with OpenAI SDK

You can also use the official OpenAI SDK directly:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.ASSISTANT_BILLING_GATEWAY_API_KEY,
  baseURL: "https://your-gateway.example.com/v1",
});
```

## License
AGPL-3.0
