# @sentrial/sdk

v0.4.4

TypeScript SDK for Sentrial - AI agent observability and monitoring.
Track sessions, tool calls, and metrics to power:

- **Signal detection**: Auto-detect patterns and anomalies
- **Root cause analysis**: Understand *why* agents fail
- **Experiments**: A/B test different system prompts
- **Code fixer**: AI-suggested fixes with GitHub PRs
## Installation

```bash
npm install @sentrial/sdk
# or
pnpm add @sentrial/sdk
# or
yarn add @sentrial/sdk
```

## Quick Start (30 seconds)

### Option 1: Auto-Wrap LLM Clients (Recommended)
```ts
import OpenAI from 'openai';
import { wrapOpenAI, withSession, configure } from '@sentrial/sdk';

configure({ apiKey: 'sentrial_live_xxx' });

// Wrap once - all calls auto-tracked!
const openai = wrapOpenAI(new OpenAI());

const myAgent = withSession('support-agent', async (userId: string, message: string) => {
  // LLM calls are automatically tracked with tokens, cost, and latency
  const response = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: message }],
  });
  return response.choices[0].message.content;
});

await myAgent('user_123', 'How do I reset my password?');
```

### Option 2: Decorators / Higher-Order Functions
```ts
import { withTool, withSession, configure } from '@sentrial/sdk';

configure({ apiKey: 'sentrial_live_xxx' });

// Track tools
const searchKB = withTool('search_kb', async (query: string) => {
  return { articles: ['KB-001', 'KB-002'] };
});

// Track sessions
const supportAgent = withSession('support-agent', async (userId: string, message: string) => {
  const results = await searchKB(message); // Auto-tracked!
  return `Found ${results.articles.length} articles`;
});
```

### Option 3: Vercel AI SDK
```ts
import { configure, wrapAISDK } from '@sentrial/sdk';
import * as ai from 'ai';
import { openai } from '@ai-sdk/openai';

configure({ apiKey: process.env.SENTRIAL_API_KEY, defaultAgent: 'my-agent' });

const { generateText, streamText } = wrapAISDK(ai);

// All calls automatically traced!
const { text } = await generateText({
  model: openai('gpt-4'),
  prompt: 'What is the capital of France?',
});
```

## Features
### LLM Auto-Wrappers

Wrap once, track everything automatically:

```ts
import { wrapOpenAI, wrapAnthropic, wrapGoogle, wrapLLM } from '@sentrial/sdk';

// OpenAI
const openai = wrapOpenAI(new OpenAI());

// Anthropic
const anthropic = wrapAnthropic(new Anthropic());

// Google Gemini
const model = wrapGoogle(genAI.getGenerativeModel({ model: 'gemini-2.0-flash' }));

// Auto-detect
const client = wrapLLM(new OpenAI()); // Detects OpenAI, Anthropic, or Google
```

### Decorators
```ts
import { withTool, withSession, Tool, TrackSession } from '@sentrial/sdk';

// Higher-order functions
const searchWeb = withTool('web_search', async (query: string) => { /* ... */ });
const myAgent = withSession('my-agent', async (userId, message) => { /* ... */ });

// Class decorators (with experimentalDecorators)
class MyAgent {
  @Tool('search')
  async searchWeb(query: string) { /* ... */ }

  @TrackSession('my-agent')
  async handleRequest(userId: string, message: string) { /* ... */ }
}
```

### Experiments
A/B test different system prompts:

```ts
import { Experiment, getSystemPrompt, isExperimentMode } from '@sentrial/sdk';

const experiment = new Experiment('exp_abc123');
await experiment.load();

await experiment.run(async (testCase, variant, tracker) => {
  const systemPrompt = getSystemPrompt('Default prompt');
  // runAgent is your own code; here it returns the Sentrial session id
  const sessionId = await runAgent(testCase.userInput, systemPrompt);
  tracker.setResultSessionId(sessionId);
});
```

### Vercel AI SDK Integration
Full support for Vercel AI SDK v3-v6:

```ts
import { wrapAISDK } from '@sentrial/sdk';
import * as ai from 'ai';
import { z } from 'zod';

const { generateText, streamText, generateObject, streamObject } = wrapAISDK(ai);

// With tools
const { text } = await generateText({
  model: openai('gpt-4'),
  prompt: "What's the weather?",
  tools: {
    getWeather: {
      description: 'Get weather',
      parameters: z.object({ location: z.string() }),
      execute: async ({ location }) => ({ temp: 72 }), // Auto-tracked!
    },
  },
});
```

## Configuration
### Environment Variables

```bash
SENTRIAL_API_URL=https://api.sentrial.com
SENTRIAL_API_KEY=sentrial_live_xxx
```

### Programmatic
```ts
import { configure } from '@sentrial/sdk';

configure({
  apiKey: 'sentrial_live_xxx',
  apiUrl: 'https://api.sentrial.com', // Optional
});
```

## Full API
For maximum control:

```ts
import { SentrialClient } from '@sentrial/sdk';

const client = new SentrialClient({ apiKey: '...' });

// Create session
const sessionId = await client.createSession({
  name: 'Support Request',
  agentName: 'support-agent',
  userId: 'user_123',
});

// Track tool calls
await client.trackToolCall({
  sessionId,
  toolName: 'search_kb',
  toolInput: { query: 'password reset' },
  toolOutput: { articles: ['KB-001'] },
  reasoning: 'User asked about password reset',
});

// Track decisions
await client.trackDecision({
  sessionId,
  reasoning: 'Found relevant article, sharing with user',
  alternatives: ['Escalate', 'Ask clarifying question'],
  confidence: 0.9,
});

// Complete session
await client.completeSession({
  sessionId,
  success: true,
  customMetrics: { satisfaction: 4.5 },
  promptTokens: 1500,
  completionTokens: 500,
});
```

## Cost Calculation
```ts
import { calculateOpenAICost, calculateAnthropicCost, calculateGoogleCost } from '@sentrial/sdk';

const cost = calculateOpenAICost({
  model: 'gpt-4o',
  inputTokens: 1000,
  outputTokens: 500,
});
// Returns: 0.0075
```

## What Gets Tracked
| Data | Auto-tracked | Manual |
|------|--------------|--------|
| LLM calls | ✔️ via wrappers | `trackToolCall()` |
| Token usage | ✔️ via wrappers | `promptTokens` param |
| Cost (USD) | ✔️ calculated | `estimatedCost` param |
| Latency | ✔️ always | - |
| Tool calls | ✔️ via `withTool` | `trackToolCall()` |
| Errors | ✔️ always | `trackError()` |
| User ID | ✔️ via session | `userId` param |
| Custom metrics | - | `customMetrics` param |
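The auto-tracked columns above (latency, errors, cost) can be illustrated with a small stand-alone wrapper. This is a sketch of the general technique, not the SDK's internals; the per-token prices are assumptions (check current provider pricing), and all names here are hypothetical:

```ts
// Sketch: auto-tracking latency, errors, and cost with a higher-order
// wrapper. Not the SDK's implementation; prices are assumed (USD per
// 1M tokens) and may be outdated.
const PRICES = { inputPerMTok: 2.5, outputPerMTok: 10 }; // assumed gpt-4o pricing

interface Usage { inputTokens: number; outputTokens: number }
interface Tracked<T> {
  result?: T;
  error?: string;
  latencyMs: number;
  costUsd?: number;
}

function costUsd(u: Usage): number {
  return (u.inputTokens / 1e6) * PRICES.inputPerMTok +
         (u.outputTokens / 1e6) * PRICES.outputPerMTok;
}

// Wrap an async call that reports token usage; record latency and errors.
function withTracking<A extends unknown[], R extends { usage: Usage }>(
  fn: (...args: A) => Promise<R>,
): (...args: A) => Promise<Tracked<R>> {
  return async (...args) => {
    const start = Date.now();
    try {
      const result = await fn(...args);
      return { result, latencyMs: Date.now() - start, costUsd: costUsd(result.usage) };
    } catch (err) {
      // Errors are captured rather than rethrown, so they can be reported.
      return { error: String(err), latencyMs: Date.now() - start };
    }
  };
}

// The usage from the Cost Calculation section (1000 input / 500 output
// tokens on gpt-4o) comes out to 0.0075 USD under the assumed prices.
const fakeCall = withTracking(async () => ({
  usage: { inputTokens: 1000, outputTokens: 500 },
}));
```

The real wrappers presumably pull `usage` out of the provider response in the same spirit, which is why token counts and cost come for free once a client is wrapped.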
## Framework Compatibility

| Framework | Integration | Status |
|-----------|-------------|--------|
| Direct OpenAI | `wrapOpenAI()` | ✔️ |
| Direct Anthropic | `wrapAnthropic()` | ✔️ |
| Direct Gemini | `wrapGoogle()` | ✔️ |
| Vercel AI SDK | `wrapAISDK()` | ✔️ |
| Express/Fastify | Decorators | ✔️ |
| Next.js | Decorators | ✔️ |
| Custom agents | Full API | ✔️ |
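`wrapLLM`'s auto-detect (see LLM Auto-Wrappers above) presumably distinguishes the client libraries in this table by their shape. A duck-typing sketch of that idea, based on the public surface of each SDK (illustrative only; not Sentrial's actual detection logic):

```ts
// Illustrative duck-typing provider detection, in the spirit of wrapLLM.
// Shapes: OpenAI exposes chat.completions.create, Anthropic exposes
// messages.create, and a Gemini model exposes generateContent.
type Provider = 'openai' | 'anthropic' | 'google' | 'unknown';

function detectProvider(client: any): Provider {
  if (typeof client?.chat?.completions?.create === 'function') return 'openai';
  if (typeof client?.messages?.create === 'function') return 'anthropic';
  if (typeof client?.generateContent === 'function') return 'google';
  return 'unknown';
}
```

Checking for methods rather than class names keeps detection working across SDK versions and mocks, which is likely why wrapper libraries favor this style.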
## Documentation

See the `docs/` folder in the repository:

- `docs/sdks/typescript-sdk.md` - Full SDK reference
- `docs/integrations/vercel-ai-sdk.md` - Vercel AI SDK integration
- `docs/features/experiments.md` - Experiments guide
## Support

## License

MIT
