# @assistant-hub/sdk

Client SDK for Assistant Hub AI Gateway: an OpenAI-compatible API with smart routing, caching, and analytics.
Drop-in replacement for the OpenAI SDK — just change the base URL and API key.
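
Conceptually, "drop-in" means only the client construction changes; requests and responses stay OpenAI-compatible. As a rough sketch of the resolution order the client appears to follow (explicit options over environment variables over the documented default base URL) — the helper and option names here are hypothetical, not part of the SDK's public API:

```ts
// Hypothetical sketch: explicit options take precedence over env vars,
// and the base URL falls back to the documented default.
type HubOptions = { apiKey?: string; baseURL?: string };

const DEFAULT_BASE_URL = "https://api.aichat.new";

function resolveConfig(
  options: HubOptions,
  env: Record<string, string | undefined>
): { apiKey: string; baseURL: string } {
  const apiKey = options.apiKey ?? env.ASSISTANT_HUB_API_KEY;
  if (!apiKey) throw new Error("Missing API key");
  const baseURL =
    options.baseURL ?? env.ASSISTANT_HUB_BASE_URL ?? DEFAULT_BASE_URL;
  return { apiKey, baseURL };
}

// Env var only — base URL falls back to the default:
console.log(resolveConfig({}, { ASSISTANT_HUB_API_KEY: "sk_hub_123" }));
// Explicit option wins over the env var:
console.log(
  resolveConfig(
    { baseURL: "https://your-gateway.example.com/v1" },
    {
      ASSISTANT_HUB_API_KEY: "sk_hub_123",
      ASSISTANT_HUB_BASE_URL: "https://other.example.com",
    }
  )
);
```

The Environment Variables and Configuration sections below list the actual option and variable names.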
## Install

```bash
npm install @assistant-hub/sdk
```

## Environment Variables
The SDK reads these automatically — no configuration needed:
```
ASSISTANT_HUB_API_KEY=sk_hub_...
ASSISTANT_HUB_BASE_URL=https://api.aichat.new # optional, this is the default
```

## Quick Start
```ts
import { HubClient } from "@assistant-hub/sdk";

// Reads ASSISTANT_HUB_API_KEY and ASSISTANT_HUB_BASE_URL from env
const hub = new HubClient();

const response = await hub.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);
```

## Vercel AI SDK Integration
Use `@assistant-hub/sdk/ai-sdk` for seamless integration with the Vercel AI SDK:
```ts
import { createHub } from "@assistant-hub/sdk/ai-sdk";
import { generateText } from "ai";

// Reads env vars automatically
const hub = createHub();

const { text } = await generateText({
  model: hub("gpt-4o"),
  prompt: "Hello!",
});
```

Or use the pre-configured default instance:
```ts
import { hub } from "@assistant-hub/sdk/ai-sdk";
import { streamText } from "ai";

const result = streamText({
  model: hub("claude-sonnet-4-20250514"),
  prompt: "Write a haiku",
});
```

You can also pass options explicitly:
```ts
const hub = createHub({
  apiKey: "sk_hub_...",
  baseURL: "https://your-gateway.example.com/v1",
});
```

## Streaming
```ts
const stream = await hub.chat.completions.create({
  model: "claude-sonnet-4-20250514",
  messages: [{ role: "user", content: "Write a haiku" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

## Embeddings
```ts
const embeddings = await hub.embeddings.create({
  model: "text-embedding-3-small",
  input: "Hello world",
});
```

## List Models
```ts
const models = await hub.models.list();
```

## Context Builders
Chain context for request tracking:
```ts
const response = await hub
  .withUser("user-123")
  .withMetadata({ session: "abc" })
  .withProvider("openai")
  .chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Hello" }],
  });
```

## Feedback & Scores
Submit a single feedback score:
```ts
await hub.feedback("hrl_xxx", 5);
```

Submit multiple scores at once (including boolean scores):
```ts
await hub.scores("hrl_xxx", {
  quality: 5,
  relevance: 3,
  thumbs_up: true,
});
```

## Configuration
```ts
const hub = new HubClient({
  apiKey: "sk_hub_...", // or ASSISTANT_HUB_API_KEY env var
  baseURL: "https://your-gateway.aichat.new", // or ASSISTANT_HUB_BASE_URL env var
  timeout: 30_000, // optional, default 60s
  defaultHeaders: { "X-Custom": "value" }, // optional
});
```

## Works with OpenAI SDK
Since the gateway exposes an OpenAI-compatible API, you can also use the official OpenAI SDK directly:
```ts
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.ASSISTANT_HUB_API_KEY,
  baseURL: "https://api.aichat.new/v1",
});
```

## License
MIT
