@mode-7/exceptra
TypeScript SDK for Exceptra - LLM observability and logging.
Installation
npm install @mode-7/exceptra
Quick Start
import { Exceptra } from "@mode-7/exceptra";
const exceptra = new Exceptra({
  apiKey: process.env.EXCEPTRA_API_KEY!,
  applicationId: process.env.EXCEPTRA_APP_ID!,
});
// Simple logging
await exceptra.info("User signed up");
await exceptra.error("Something went wrong");
// LLM observability
await exceptra.agent("Chat completion", {
  model: "gpt-4",
  provider: "openai",
  messages: [
    { role: "user", content: "Hello" },
    { role: "assistant", content: "Hi!" },
  ],
});
Configuration
const exceptra = new Exceptra({
  // Required
  apiKey: "your-api-key",
  applicationId: "your-app-id",
  // Optional
  baseUrl: "https://api.exceptra.com", // Custom API URL
  environment: "production", // Environment name
  timeout: 10000, // Request timeout (ms)
  throwOnError: false, // Throw on API errors
  batching: false, // Enable event batching
  batchInterval: 1000, // Batch flush interval (ms)
  batchSize: 100, // Max batch size
});
Logging Methods
All logging methods accept a message (required) and an optional options object.
Method Signatures
exceptra.debug(message: string, options?: SimpleLogOptions)
exceptra.info(message: string, options?: SimpleLogOptions)
exceptra.warn(message: string, options?: SimpleLogOptions)
exceptra.error(message: string | Error, options?: SimpleLogOptions)
exceptra.fatal(message: string | Error, options?: SimpleLogOptions)
exceptra.agent(message: string, llm: ExceptraLLM, options?: AgentLogOptions)
Basic Examples
// No options - just a message
await exceptra.info("Server started");
// With options
await exceptra.error("Database connection failed", {
metadata: { host: "db.example.com", port: 5432 },
});
// Error objects are supported
try {
await riskyOperation();
} catch (err) {
await exceptra.error(err);
}Optional Parameters (SimpleLogOptions)
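In strict TypeScript (strict enables useUnknownInCatchVariables), the caught value is typed unknown rather than Error, so you may want to narrow it before passing it to error(). A minimal sketch, assuming the signatures above:
try {
  await riskyOperation();
} catch (err) {
  // Narrow unknown to the string | Error union accepted by error()
  await exceptra.error(err instanceof Error ? err : String(err), {
    metadata: { operation: "riskyOperation" },
  });
}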
Optional Parameters (SimpleLogOptions)
All fields below are optional. Only include what you need.
interface SimpleLogOptions {
  environment?: string;
  trace_id?: string;
  session_id?: string;
  metadata?: Record<string, unknown>;
  file?: string;
  line?: number;
  user?: ExceptraUser;
  request?: ExceptraRequest;
  security?: ExceptraSecurity;
  user_feedback_score?: number;
  user_feedback_text?: string;
}
Full Example with All Options
await exceptra.info("User performed action", {
  // Environment (optional) - defaults to config.environment
  environment: "production",
  // Trace ID (optional) - for correlating related events
  trace_id: "req-abc-123",
  // Session ID (optional) - for grouping user sessions
  session_id: "sess-xyz-789",
  // Metadata (optional) - any additional structured data
  metadata: {
    action: "purchase",
    amount: 99.99,
    currency: "USD",
  },
  // Source location (optional)
  file: "src/handlers/purchase.ts",
  line: 42,
  // User info (optional) - all fields except id are optional
  user: {
    id: "user_123", // required if user is provided
    display_name: "John Doe", // optional
    email: "john@example.com", // optional
    metadata: { plan: "pro" }, // optional
  },
  // HTTP request context (optional) - all fields are optional
  request: {
    method: "POST",
    path: "/api/purchase",
    status_code: 200,
    duration_ms: 150,
    ip: "192.168.1.1",
    user_agent: "Mozilla/5.0...",
  },
  // Security classification (optional) - all fields are optional
  security: {
    event_type: "data_access", // "auth" | "access" | "rate_limit" | "data_access" | "admin"
    outcome: "success", // "success" | "failure" | "denied"
    resource: "orders/123",
    action: "create",
  },
  // User feedback (optional)
  user_feedback_score: 5,
  user_feedback_text: "Great experience!",
});
User Object
interface ExceptraUser {
  id: string; // Required - unique user identifier
  display_name?: string; // Optional
  email?: string; // Optional
  metadata?: Record<string, unknown>; // Optional
}
Example:
await exceptra.info("User logged in", {
user: { id: "user_123" },
});
await exceptra.info("User updated profile", {
user: {
id: "user_123",
display_name: "John Doe",
email: "[email protected]",
metadata: { plan: "enterprise", company: "Acme Inc" },
},
});Request Object
interface ExceptraRequest {
  method?: string; // Optional - "GET", "POST", etc.
  path?: string; // Optional - "/api/users"
  status_code?: number; // Optional - 200, 404, 500, etc.
  duration_ms?: number; // Optional - request duration
  ip?: string; // Optional - client IP
  user_agent?: string; // Optional - browser/client user agent
}
Example with Express/Next.js:
// In an API route handler
await exceptra.info("API request completed", {
  request: {
    method: req.method,
    path: req.url,
    status_code: 200,
    duration_ms: Date.now() - startTime,
    ip: req.headers["x-forwarded-for"]?.toString().split(",")[0] || req.socket.remoteAddress,
    user_agent: req.headers["user-agent"],
  },
});
Security Object
interface ExceptraSecurity {
  event_type?: "auth" | "access" | "rate_limit" | "data_access" | "admin";
  outcome?: "success" | "failure" | "denied";
  resource?: string; // Optional - resource being accessed
  action?: string; // Optional - action being performed
}
Example:
// Authentication event
await exceptra.info("User login attempt", {
security: {
event_type: "auth",
outcome: "success",
},
user: { id: "user_123" },
});
// Access denied
await exceptra.warn("Unauthorized access attempt", {
security: {
event_type: "access",
outcome: "denied",
resource: "admin/settings",
action: "read",
},
});
// Rate limiting
await exceptra.warn("Rate limit exceeded", {
security: {
event_type: "rate_limit",
outcome: "denied",
},
request: {
ip: "192.168.1.1",
path: "/api/expensive-operation",
},
});LLM/Agent Events
For logging LLM interactions, use the agent() method.
ExceptraLLM Object
interface ExceptraLLM {
  model: string; // Required - "gpt-4", "claude-3-opus", etc.
  provider: string; // Required - "openai", "anthropic", etc.
  input_tokens?: number; // Optional - auto-estimated if not provided
  output_tokens?: number; // Optional - auto-estimated if not provided
  cost?: number; // Optional - auto-calculated if not provided
  latency_ms?: number; // Optional
  time_to_first_token_ms?: number; // Optional - for streaming
  messages?: LLMMessage[]; // Optional - conversation history
  output?: string; // Optional - final response text
  prompt_preview?: string; // Optional - auto-generated from messages
  output_preview?: string; // Optional - auto-generated from output
  spans?: ExceptraSpan[]; // Optional - auto-extracted from tool calls
}
Minimal Agent Event
Only model and provider are required:
await exceptra.agent("Chat completion", {
model: "gpt-4",
provider: "openai",
});With Messages
await exceptra.agent("Chat completion", {
model: "gpt-4",
provider: "openai",
messages: [
{ role: "system", content: "You are a helpful assistant." },
{ role: "user", content: "Hello" },
{ role: "assistant", content: "Hi there! How can I help?" },
],
});With Full Metrics
const startTime = Date.now();
const response = await openai.chat.completions.create({...});
await exceptra.agent("Chat completion", {
  model: response.model,
  provider: "openai",
  input_tokens: response.usage?.prompt_tokens,
  output_tokens: response.usage?.completion_tokens,
  latency_ms: Date.now() - startTime,
  messages: messages,
  output: response.choices[0].message.content,
}, {
  user: { id: currentUser.id },
  trace_id: conversationId,
});
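The interface above also has a time_to_first_token_ms field for streaming responses. A sketch of capturing it with the OpenAI SDK's streaming mode (the openai client and messages array are assumed from the previous example; only the Exceptra fields shown are documented here):
const startTime = Date.now();
let firstTokenAt: number | undefined;
let output = "";
const stream = await openai.chat.completions.create({
  model: "gpt-4",
  messages,
  stream: true,
});
for await (const chunk of stream) {
  const delta = chunk.choices[0]?.delta?.content ?? "";
  // Record the moment the first token arrives
  if (delta && firstTokenAt === undefined) firstTokenAt = Date.now();
  output += delta;
}
await exceptra.agent("Streaming chat completion", {
  model: "gpt-4",
  provider: "openai",
  latency_ms: Date.now() - startTime,
  time_to_first_token_ms: firstTokenAt ? firstTokenAt - startTime : undefined,
  messages,
  output,
});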
With Tool Calls
Spans are auto-extracted from messages containing tool calls:
await exceptra.agent("Tool-assisted response", {
model: "gpt-4",
provider: "openai",
messages: [
{ role: "user", content: "What's the weather in London?" },
{
role: "assistant",
content: null,
tool_calls: [{
id: "call_1",
type: "function",
function: { name: "get_weather", arguments: '{"city":"London"}' },
}],
},
{ role: "tool", content: '{"temp":18}', tool_call_id: "call_1" },
{ role: "assistant", content: "It's 18°C in London." },
],
});Context Helpers
Chain context methods to attach default values to all events:
// Create a logger with user context
const userLogger = exceptra.withUser({ id: "user_123" });
await userLogger.info("Action 1"); // includes user
await userLogger.info("Action 2"); // includes user
// Chain multiple contexts
const logger = exceptra
  .withUser({ id: "user_123" })
  .withTrace("request-abc")
  .withSession("session-xyz")
  .withMetadata({ version: "1.2.3" });
await logger.info("Event"); // includes all context
await logger.agent("LLM call", { model: "gpt-4", provider: "openai" }); // includes all context
Available Context Methods
exceptra.withUser(user: ExceptraUser) // Attach user to all events
exceptra.withTrace(traceId: string) // Attach trace_id to all events
exceptra.withSession(sessionId: string) // Attach session_id to all events
exceptra.withMetadata(metadata: Record<...>) // Merge metadata into all events
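A common way to use these helpers is to build one request-scoped logger per incoming request. A sketch for Express (the auth middleware that sets req.user and the use of randomUUID for trace IDs are assumptions about your app, not part of the SDK):
import { randomUUID } from "crypto";
import express from "express";
const app = express();
// Build a request-scoped logger once and share it via res.locals
app.use((req, res, next) => {
  let logger = exceptra.withTrace(randomUUID());
  const user = (req as any).user; // populated by your own auth middleware
  if (user) logger = logger.withUser({ id: user.id });
  res.locals.logger = logger;
  next();
});
app.post("/api/purchase", async (req, res) => {
  // Every event from this logger carries the trace and user context
  await res.locals.logger.info("Purchase started", {
    request: { method: req.method, path: req.path },
  });
  res.sendStatus(200);
});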
Batching
For high-volume applications, enable batching to reduce API calls:
const exceptra = new Exceptra({
apiKey: "...",
applicationId: "...",
batching: true,
batchSize: 50, // Flush when 50 events queued
batchInterval: 2000, // Or flush every 2 seconds
});
// Events are batched automatically
exceptra.info("Event 1");
exceptra.info("Event 2");
// Flush before shutdown
await exceptra.flush();TypeScript Types
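Because batched events sit in memory until they are flushed, it is worth calling flush() in your shutdown path. A minimal sketch for a Node process (the signal handling belongs to your app, not the SDK):
// Flush any queued events before the process exits
async function shutdown(signal: string) {
  console.log(`Received ${signal}, flushing Exceptra events...`);
  await exceptra.flush();
  process.exit(0);
}
process.on("SIGTERM", () => void shutdown("SIGTERM"));
process.on("SIGINT", () => void shutdown("SIGINT"));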
TypeScript Types
All types are exported:
import type {
  ExceptraConfig,
  ExceptraEvent,
  ExceptraResponse,
  ExceptraUser,
  ExceptraRequest,
  ExceptraSecurity,
  ExceptraLLM,
  LLMMessage,
  ToolCall,
  ExceptraSpan,
  SpanType,
  LogLevel,
  SimpleLogOptions,
  AgentLogOptions,
} from "@mode-7/exceptra";
What Exceptra Computes Automatically
If you don't provide these fields, Exceptra computes them server-side:
| Field | How it's computed |
|-------|-------------------|
| input_tokens | Estimated via tiktoken from messages |
| output_tokens | Estimated via tiktoken from output |
| cost | Calculated from provider pricing tables |
| prompt_preview | Extracted from first user message |
| output_preview | Extracted from output/last assistant message |
| spans | Auto-extracted from messages with tool calls |
| trace_id | Auto-generated UUID if not provided |
| Security flags | PII detection, injection detection on content |
This means you can send minimal payloads:
// Minimal - Exceptra computes tokens, cost, previews, spans
await exceptra.agent("Chat", {
model: "gpt-4",
provider: "openai",
messages: [...],
});License
MIT
