# @osohq/langchain
Oso observability integration for LangChain agents.
A callback handler that automatically captures all LangChain agent events and sends them to Oso's observability platform for monitoring, debugging, and security analysis.
## Installation
```bash
npm install @osohq/langchain
```

Or with yarn:

```bash
yarn add @osohq/langchain
```

## Quick Start
```typescript
import { OsoObservabilityCallback } from "@osohq/langchain";
import { createOpenAIToolsAgent, AgentExecutor } from "langchain/agents";
import { ChatOpenAI } from "@langchain/openai";

// Create the callback (reads OSO_AUTH_TOKEN from environment)
const callback = new OsoObservabilityCallback({
  agentId: "my-support-agent",
});

// Add to your agent (`tools` and `prompt` are defined elsewhere in your application)
const llm = new ChatOpenAI({ model: "gpt-4" });
const agent = await createOpenAIToolsAgent({ llm, tools, prompt });
const agentExecutor = new AgentExecutor({
  agent,
  tools,
  callbacks: [callback],
});

// Use your agent - all events are automatically captured
const result = await agentExecutor.invoke({
  input: "Hello, how can I help?",
});

// Clean up
await callback.close();
```
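Because `close()` should run even when the agent call throws, you may want to wrap the invocation. A minimal sketch reusing `callback` and `agentExecutor` from the snippet above:

```typescript
// Ensure the callback is always cleaned up, even if the agent call fails
try {
  const result = await agentExecutor.invoke({
    input: "Hello, how can I help?",
  });
  console.log(result.output);
} finally {
  await callback.close();
}
```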
## Configuration

### Environment Variables
Set these in your environment or .env file:
```bash
# Required: Your Oso authentication token
OSO_AUTH_TOKEN=your-token-here

# Optional: Custom Oso endpoint (defaults to https://cloud.osohq.com/api/events)
OSO_ENDPOINT=https://cloud.osohq.com/api/events

# Optional: Enable/disable observability (defaults to true)
OSO_OBSERVABILITY_ENABLED=true
```
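If you keep these values in a `.env` file, load them before constructing the callback. A minimal sketch, assuming the `dotenv` package is installed in your project (it is not a dependency of this package):

```typescript
// Load .env into process.env before anything reads OSO_AUTH_TOKEN
import "dotenv/config";

import { OsoObservabilityCallback } from "@osohq/langchain";

// With OSO_AUTH_TOKEN set in .env, the token does not need to be passed explicitly
const callback = new OsoObservabilityCallback({ agentId: "my-agent" });
```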
### Constructor Parameters

```typescript
new OsoObservabilityCallback({
  endpoint: "https://cloud.osohq.com/api/events", // Oso endpoint URL
  authToken: "your-token", // Oso auth token
  enabled: true, // Enable/disable sending events
  sessionId: "unique-session-id", // Group related conversations
  metadata: { userId: "123", env: "prod" }, // Custom metadata for all events
  agentId: "my-agent", // Agent identifier
});
```

All parameters are optional and fall back to environment variables or defaults.
## What Gets Captured
The callback automatically captures all LangChain events:
### LLM Events
- Model name and configuration
- Prompts sent to the LLM
- Generated responses
- Token usage (prompt, completion, total)
- Errors and failures
### Tool Events
- Tool name and description
- Input parameters
- Output/results
- Execution duration (milliseconds)
- Errors and stack traces
### Agent Events
- Agent reasoning and thought process
- Tool selection decisions
- Tool input parameters
- Final outputs
- Complete execution flow
### Chain Events
- Chain type and name
- Input parameters
- Output values
- Nested chain execution
### Execution Summary
At the end of each agent execution, a summary event is sent with:
- Total execution duration
- Number of LLM calls, tool calls, and agent steps
- Total token usage
- Error count
- Complete execution trace
## Event Structure
Every event sent to Oso has this structure:
```json
{
  "event_type": "tool.completed",
  "execution_id": "unique-execution-id",
  "session_id": "conversation-session-id",
  "timestamp": "2024-02-15T10:30:45.123Z",
  "agent_id": "my-agent",
  "data": {
    /* event-specific data */
  },
  "metadata": {
    /* your custom metadata */
  }
}
```
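For reference, the same structure expressed as a TypeScript shape. This is an illustrative sketch only (the interface name is made up here); the package exports its own type definitions, which are authoritative:

```typescript
// Illustrative shape of an event payload, mirroring the JSON above
interface OsoEvent {
  event_type: string;   // e.g. "tool.completed"
  execution_id: string; // unique per agent execution
  session_id: string;   // groups related conversations
  timestamp: string;    // ISO 8601
  agent_id: string;
  data: Record<string, unknown>;     // event-specific data
  metadata: Record<string, unknown>; // your custom metadata
}
```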
## Event Types

- `llm.started` / `llm.completed` / `llm.error`
- `tool.started` / `tool.completed` / `tool.error`
- `agent.action` / `agent.finished`
- `chain.started` / `chain.completed` / `chain.error`
- `execution.summary` - Final summary with all accumulated data
## Error Handling
The callback is designed to fail gracefully:
- Network errors or timeouts won't crash your agent
- Failed event sends are logged but don't interrupt execution
- Comprehensive error logging for debugging
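If you prefer to switch event delivery off entirely rather than rely on graceful failure (for example in unit tests or offline CI), use the `enabled` option or the `OSO_OBSERVABILITY_ENABLED` variable described above. A brief sketch:

```typescript
import { OsoObservabilityCallback } from "@osohq/langchain";

// Disable sending events entirely, e.g. in unit tests
const callback = new OsoObservabilityCallback({
  agentId: "support-agent",
  enabled: false,
});
```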
## Logging
The callback uses Node.js console logging. You can configure it in your application or use a custom logger:
```typescript
// Set NODE_DEBUG environment variable to see debug logs
process.env.NODE_DEBUG = "langchain-oso";
```

## Examples
### Basic Agent with Tools
```typescript
import { OsoObservabilityCallback } from "@osohq/langchain";
import { createOpenAIToolsAgent, AgentExecutor } from "langchain/agents";
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

const searchOrders = tool(
  async ({ customerId }: { customerId: string }) => {
    return `Orders for ${customerId}: ORD001, ORD002`;
  },
  {
    name: "search_orders",
    description: "Search for customer orders.",
    schema: z.object({
      customerId: z.string().describe("The customer ID"),
    }),
  }
);

// Example prompt; createOpenAIToolsAgent expects an agent_scratchpad placeholder
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful customer support agent."],
  ["human", "{input}"],
  ["placeholder", "{agent_scratchpad}"],
]);

async function main() {
  const callback = new OsoObservabilityCallback({ agentId: "support-agent" });
  const llm = new ChatOpenAI({ model: "gpt-4o-mini" });
  const tools = [searchOrders];

  const agent = await createOpenAIToolsAgent({ llm, tools, prompt });
  const agentExecutor = new AgentExecutor({
    agent,
    tools,
    callbacks: [callback],
  });

  const result = await agentExecutor.invoke({
    input: "Find orders for customer CUST001",
  });
  console.log(result.output);

  await callback.close();
}

main().catch(console.error);
```

### With Custom Metadata
```typescript
const callback = new OsoObservabilityCallback({
  agentId: "support-agent",
  sessionId: "user-session-123",
  metadata: {
    userId: "user-456",
    environment: "production",
    version: "1.2.3",
  },
});
```

### Multiple Agents in Same Session
```typescript
import { randomUUID } from "crypto";

const sessionId = randomUUID();

// Agent 1 (agent1Executor is assumed to be an AgentExecutor created with callbacks: [callback1])
const callback1 = new OsoObservabilityCallback({
  agentId: "agent-1",
  sessionId,
});
const result1 = await agent1Executor.invoke({ input: "..." });
await callback1.close();

// Agent 2 - same session (agent2Executor created with callbacks: [callback2])
const callback2 = new OsoObservabilityCallback({
  agentId: "agent-2",
  sessionId,
});
const result2 = await agent2Executor.invoke({ input: "..." });
await callback2.close();
```

## TypeScript Support
This package is written in TypeScript and includes full type definitions. All types are exported for your convenience.
## Requirements
- Node.js 16.0.0 or higher
- @langchain/core >= 0.1.0
## License
Apache License 2.0
## Support
- Documentation: https://www.osohq.com/docs
- Website: https://www.osohq.com
