AgentBill Vercel AI SDK Integration
OpenTelemetry-based provider wrapper for automatically tracking and billing Vercel AI SDK usage.
Installation
npm install @agentbill/vercel-sdk ai
With specific AI SDK providers:
npm install @ai-sdk/openai @ai-sdk/anthropic
Quick Start
import { AgentBillVercel } from '@agentbill/vercel-sdk';
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';
// 1. Initialize AgentBill wrapper
const agentBill = new AgentBillVercel({
  apiKey: 'agb_your_api_key_here', // Get from AgentBill dashboard
  baseUrl: 'https://bgwyprqxtdreuutzpbgw.supabase.co',
  customerId: 'customer-123',
  debug: true
});
// 2. Wrap your AI SDK provider
const trackedOpenAI = agentBill.wrapProvider(openai);
// 3. Use normally - everything auto-tracked!
const { text } = await generateText({
  model: trackedOpenAI('gpt-4o-mini'),
  prompt: 'What is the capital of France?',
});
console.log(text);
// ✅ Automatically captured:
// - Prompt text (hashed for privacy)
// - Model name (gpt-4o-mini)
// - Provider (openai)
// - Token usage (prompt + completion)
// - Latency (ms)
// - Costs (calculated automatically)
Features
- ✅ Zero-config instrumentation - Just wrap the provider
- ✅ Automatic token tracking - Captures all AI SDK calls
- ✅ Multi-provider support - OpenAI, Anthropic, any AI SDK provider
- ✅ Streaming support - Tracks token-by-token streaming
- ✅ Cost calculation - Auto-calculates costs per model
- ✅ Prompt profitability - Compare costs vs revenue
- ✅ OpenTelemetry compatible - Standard observability
Advanced Usage
Streaming with Tracking
import { streamText } from 'ai';
const result = await streamText({
  model: trackedOpenAI('gpt-4o-mini'),
  prompt: 'Write a poem about code',
});
// Stream to client - all tokens tracked!
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
// Automatically tracked:
// - Total streaming tokens
// - Time to first token
// - Total latency
Track Custom Revenue
// Track revenue for profitability analysis
agentBill.trackRevenue({
  event_name: 'ai_completion',
  revenue: 0.50, // What you charged the customer
  metadata: { subscription_tier: 'pro' }
});
Use with Multiple Providers
import { anthropic } from '@ai-sdk/anthropic';
const trackedAnthropic = agentBill.wrapProvider(anthropic);
// Mix and match providers - all tracked!
const openaiResult = await generateText({
  model: trackedOpenAI('gpt-4o'),
  prompt: 'Hello from OpenAI'
});
const anthropicResult = await generateText({
  model: trackedAnthropic('claude-3-5-sonnet-20241022'),
  prompt: 'Hello from Anthropic'
});
Next.js API Route Example
// app/api/chat/route.ts
import { AgentBillVercel } from '@agentbill/vercel-sdk';
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
const agentBill = new AgentBillVercel({
  apiKey: process.env.AGENTBILL_API_KEY!,
  baseUrl: process.env.AGENTBILL_BASE_URL!,
  debug: process.env.NODE_ENV === 'development'
});
export async function POST(req: Request) {
  const { messages } = await req.json();
  const trackedOpenAI = agentBill.wrapProvider(openai);
  // Stream response with automatic tracking
  const result = await streamText({
    model: trackedOpenAI('gpt-4o-mini'),
    messages,
  });
  return result.toDataStreamResponse();
}
Configuration
const agentBill = new AgentBillVercel({
  apiKey: 'agb_...',          // Required - get from dashboard
  baseUrl: 'https://...',     // Required - your AgentBill instance
  customerId: 'customer-123', // Optional - for multi-tenant apps
  accountId: 'account-456',   // Optional - for account-level tracking
  debug: true,                // Optional - enable debug logging
  batchSize: 10,              // Optional - batch signals before sending
  flushInterval: 5000         // Optional - flush interval in ms
});
How It Works
The provider wrapper intercepts AI SDK calls:
- Wrap Provider - Creates a proxy around the AI SDK provider
- Track Call - Captures model, prompt, start time
- Execute - Runs the actual AI SDK call
- Capture Response - Extracts tokens, latency, response
- Send Signal - Sends data to AgentBill via the record-signals API
All data is batched and sent efficiently to minimize overhead.
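For intuition, the wrapping step is roughly equivalent to putting a JavaScript Proxy around the provider's model factory. The sketch below is illustrative only, not the package's actual internals: the Signal shape and onSignal callback are invented for the example, and it assumes the AI SDK's model objects expose a doGenerate method (as in the LanguageModelV1 spec).
// Illustrative sketch of the wrapping idea - not @agentbill/vercel-sdk internals.
// It proxies the provider's model factory, times each doGenerate call, reads the
// usage object the AI SDK returns, and hands a signal to a callback.
type Signal = {
  model: string;
  provider: string;
  promptTokens: number;
  completionTokens: number;
  latencyMs: number;
};
type ModelFactory = (modelId: string) => any;

function wrapProviderSketch(
  factory: ModelFactory,
  providerName: string,
  onSignal: (signal: Signal) => void // hypothetical sink; the real SDK batches and sends signals
): ModelFactory {
  return (modelId: string) => {
    const model = factory(modelId);
    return new Proxy(model, {
      get(target, prop, receiver) {
        const value = Reflect.get(target, prop, receiver);
        if (prop !== 'doGenerate' || typeof value !== 'function') return value;
        return async (...args: unknown[]) => {
          const start = Date.now();
          const result = await value.apply(target, args);
          onSignal({
            model: modelId,
            provider: providerName,
            promptTokens: result?.usage?.promptTokens ?? 0,
            completionTokens: result?.usage?.completionTokens ?? 0,
            latencyMs: Date.now() - start,
          });
          return result;
        };
      },
    });
  };
}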
Supported Models
Auto-cost calculation for:
- OpenAI: GPT-4o, GPT-4o-mini, GPT-4-turbo, GPT-3.5-turbo
- Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku
- Any AI SDK-compatible provider
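Cost is derived from the token counts captured on each call. The sketch below shows the arithmetic only; the rates and the estimateCostUsd helper are placeholders for illustration, not AgentBill's actual pricing table or API.
// Illustrative only: deriving a per-call cost from token usage.
// Rates are placeholders (USD per 1M tokens), not AgentBill's real pricing table.
const RATES_PER_1M_TOKENS: Record<string, { input: number; output: number }> = {
  'gpt-4o-mini': { input: 0.15, output: 0.60 },
  'claude-3-5-sonnet-20241022': { input: 3.00, output: 15.00 },
};

function estimateCostUsd(model: string, promptTokens: number, completionTokens: number): number {
  const rate = RATES_PER_1M_TOKENS[model];
  if (!rate) return 0; // unknown model: no auto-cost
  return (promptTokens / 1_000_000) * rate.input + (completionTokens / 1_000_000) * rate.output;
}

// e.g. 1,200 prompt tokens + 300 completion tokens on gpt-4o-mini:
// 1200/1e6 * 0.15 + 300/1e6 * 0.60 = 0.00018 + 0.00018 = 0.00036 USD
console.log(estimateCostUsd('gpt-4o-mini', 1200, 300));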
TypeScript Support
Full TypeScript support with proper typing:
import type { AgentBillConfig } from '@agentbill/vercel-sdk';
const config: AgentBillConfig = {
  apiKey: 'agb_...',
  baseUrl: 'https://...',
  customerId: 'customer-123'
};
Troubleshooting
Not seeing data in dashboard?
- Check API key is correct
- Enable debug: true to see logs
- Verify baseUrl matches your instance
- Check network connectivity to AgentBill
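If the checklist doesn't surface the problem, a minimal end-to-end check is to make one call with debug: true and watch the console. The sketch below uses only the API shown in this README; run it as an ES module (for example with tsx).
// Minimal sanity check: one tracked call with debug logging enabled.
import { AgentBillVercel } from '@agentbill/vercel-sdk';
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const agentBill = new AgentBillVercel({
  apiKey: process.env.AGENTBILL_API_KEY!,   // should start with agb_
  baseUrl: process.env.AGENTBILL_BASE_URL!, // your AgentBill instance URL
  debug: true,                              // prints what is captured and sent
});

const trackedOpenAI = agentBill.wrapProvider(openai);
const { text, usage } = await generateText({
  model: trackedOpenAI('gpt-4o-mini'),
  prompt: 'ping',
});
console.log({ text, usage }); // if usage is populated here but nothing appears in the
                              // dashboard, check the API key, baseUrl, and network path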
TypeScript errors?
Make sure you have the latest version:
npm install @agentbill/vercel-sdk@latest
Streaming not working?
The wrapper supports streaming. Make sure you're using the latest AI SDK version.
License
MIT
