# @triage-sec/sdk
Security-focused observability for AI agents. Captures telemetry from LLM calls automatically using OpenLLMetry and sends traces to the Triage backend via standard OTLP/HTTP.
## Install
```bash
npm install @triage-sec/sdk
# or
pnpm add @triage-sec/sdk
```

Requires Node.js >= 18.
## Quick start
```ts
import * as triage from "@triage-sec/sdk";
import OpenAI from "openai";

// Initialize once at app startup
triage.init({
  apiKey: "tsk_...",
  appName: "my-chatbot",
  environment: "production",
});

// Annotate application context before LLM calls
triage.setUser("user_123", "admin");
triage.setTenant("org_456");
triage.setSession("session_789", 1);

// Use any LLM provider normally — telemetry is automatic
const client = new OpenAI();
const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
});
```

OpenLLMetry auto-instruments OpenAI, Anthropic, Cohere, Mistral, Groq, Bedrock, Vertex AI, LangChain, LlamaIndex, CrewAI, Pinecone, ChromaDB, and 20+ other providers — zero provider-specific code needed.
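`triage.setSession` also accepts an optional `historyHash` (see Context helpers); this README doesn't prescribe a scheme for it, so the following is only one possible sketch, assuming a SHA-256 digest over the ordered conversation turns:

```typescript
import { createHash } from "node:crypto";

type Message = { role: string; content: string };

// One possible historyHash: SHA-256 over the ordered turns.
// The scheme is an assumption, not something the SDK mandates.
function historyHash(messages: Message[]): string {
  const h = createHash("sha256");
  for (const m of messages) h.update(`${m.role}\u0000${m.content}\u0000`);
  return h.digest("hex");
}

// e.g. triage.setSession("session_789", messages.length, historyHash(messages));
```

A deterministic digest lets the backend detect when two requests share the same conversation prefix without shipping the full history.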
## Configuration
```ts
triage.init({
  apiKey: "tsk_...",         // or TRIAGE_API_KEY env var
  endpoint: "https://...",   // or TRIAGE_ENDPOINT (default: https://api.triageai.dev)
  appName: "my-app",         // or TRIAGE_APP_NAME
  environment: "production", // or TRIAGE_ENVIRONMENT (default: "development")
  enabled: true,             // or TRIAGE_ENABLED (default: true)
  traceContent: true,        // or TRIAGE_TRACE_CONTENT — capture prompts/completions (default: true)
});
```

All options can be set via environment variables. Explicit arguments take precedence over env vars.
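The same configuration expressed entirely through environment variables (values here are placeholders):

```shell
export TRIAGE_API_KEY="tsk_..."
export TRIAGE_ENDPOINT="https://api.triageai.dev"
export TRIAGE_APP_NAME="my-app"
export TRIAGE_ENVIRONMENT="production"
export TRIAGE_ENABLED="true"
export TRIAGE_TRACE_CONTENT="true"
```

With these set, `triage.init({})` needs no arguments.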
## Context helpers
Six helpers annotate traces with application-layer context that OpenLLMetry can't see:
```ts
// User identity (from your auth layer)
triage.setUser(userId: string, role?: string)

// Multi-tenancy
triage.setTenant(tenantId: string, name?: string)

// Conversation tracking
triage.setSession(sessionId: string, turnNumber?: number, historyHash?: string)

// Input before/after sanitization
triage.setInput(raw: string, sanitized?: string)

// Prompt template tracking
triage.setTemplate(templateId: string, version?: string)

// RAG chunk access control
triage.setChunkAcls(acls: Record<string, unknown>[])
```
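`setChunkAcls` accepts free-form records (`Record<string, unknown>[]`), so the shape is up to you. A sketch of one possible shape for RAG retrieval results; the `ChunkAcl` fields and `toChunkAcls` helper are hypothetical, not an SDK contract:

```typescript
// Hypothetical ACL record shape — the SDK only requires Record<string, unknown>[]
type ChunkAcl = {
  chunkId: string;
  allowedRoles: string[];
  source: string;
};

// Build ACL records for retrieved chunks before the LLM call
function toChunkAcls(
  chunks: { id: string; roles: string[]; doc: string }[]
): ChunkAcl[] {
  return chunks.map((c) => ({
    chunkId: c.id,
    allowedRoles: c.roles,
    source: c.doc,
  }));
}

// e.g. triage.setChunkAcls(toChunkAcls(retrieved));
```

Recording per-chunk ACLs alongside the trace lets the backend flag responses that surfaced chunks the current user's role should not see.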
## Scoped context

Use `withContext` to scope annotations to a callback:
```ts
triage.withContext({ userId: "user_123", tenantId: "org_456" }, async () => {
  // All LLM calls here get these annotations
  await client.chat.completions.create({ ... });
});
// Context is restored after the callback
```
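Scoping like this is typically built on Node's `AsyncLocalStorage`. A minimal standalone sketch of the pattern, not the SDK's actual implementation:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

type Ctx = { userId?: string; tenantId?: string };
const storage = new AsyncLocalStorage<Ctx>();

// Run fn with ctx merged over any enclosing context; the previous
// context is visible again once fn resolves
async function withContext<T>(ctx: Ctx, fn: () => Promise<T>): Promise<T> {
  return storage.run({ ...storage.getStore(), ...ctx }, fn);
}

// Read the context active for the current async call chain
const current = (): Ctx => storage.getStore() ?? {};
```

Because `AsyncLocalStorage` follows async continuations, every `await`ed LLM call inside the callback sees the same context, and concurrent requests never leak annotations into each other.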
## Shutdown

```ts
// Flush pending traces before process exit
await triage.shutdown();
```
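To avoid losing buffered traces when the process is killed, call `shutdown` from signal handlers. A sketch; `registerShutdownHook`, the double-invoke guard, and the exit code are illustrative choices, not SDK API:

```typescript
// Flush telemetry on termination signals; pass triage.shutdown as flush
function registerShutdownHook(flush: () => Promise<void>): void {
  let flushed = false;
  const handler = async () => {
    if (flushed) return; // guard against SIGINT + SIGTERM both firing
    flushed = true;
    await flush(); // e.g. () => triage.shutdown()
    process.exit(0);
  };
  process.once("SIGINT", handler);
  process.once("SIGTERM", handler);
}
```

Container orchestrators such as Kubernetes send SIGTERM before SIGKILL, so flushing in the handler gives pending spans a window to reach the backend.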
## License

See LICENSE.
