@raindrop-ai/langchain
v0.0.2
Raindrop integration for LangChain. Automatically captures LLM calls, tool usage, chains, retrievers, and agent actions via LangChain's callback system.
Installation
```bash
npm install @raindrop-ai/langchain @langchain/core
```

Usage
```ts
import { createRaindropLangChain } from "@raindrop-ai/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";

const raindrop = createRaindropLangChain({
  writeKey: "rk_...",
  userId: "user-123",
});

const model = new ChatOpenAI({ model: "gpt-4o" });
const result = await model.invoke(
  [new HumanMessage("Hello!")],
  { callbacks: [raindrop.handler] },
);

await raindrop.flush();
```

What gets captured
- LLM calls: model name, input, output, token usage
- Tool calls: tool name, input arguments, output
- Chains: execution spans with parent-child nesting
- Retrievers: query and document count
- Errors: captured with OTLP error status
Options
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| writeKey | string | - | Raindrop API write key (omit to disable telemetry) |
| endpoint | string | https://api.raindrop.ai/v1/ | API endpoint |
| userId | string | - | Associate all events with a user |
| convoId | string | - | Group events into a conversation |
| debug | boolean | false | Enable verbose logging |
| traceChains | boolean | true | Create spans for chain execution |
| traceRetrievers | boolean | true | Create spans for retriever calls |
| filterLangGraphInternals | boolean | true | Filter LangGraph-internal chain events and deduplicate LLM callbacks |
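A typical configuration combining several of these options might look like the following. The interface below only mirrors the table for illustration; the package's actual exported type name may differ.

```typescript
// Illustrative interface mirroring the options table above.
// The real exported type in @raindrop-ai/langchain may be named differently.
interface RaindropLangChainOptions {
  writeKey?: string;                  // omit to disable telemetry
  endpoint?: string;                  // defaults to https://api.raindrop.ai/v1/
  userId?: string;                    // associate all events with a user
  convoId?: string;                   // group events into a conversation
  debug?: boolean;                    // defaults to false
  traceChains?: boolean;              // defaults to true
  traceRetrievers?: boolean;          // defaults to true
  filterLangGraphInternals?: boolean; // defaults to true
}

const options: RaindropLangChainOptions = {
  writeKey: "rk_...",
  userId: "user-123",
  convoId: "conversation-456",
  debug: true, // verbose logging while integrating
};
```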
LangGraph Support
Works with LangGraph out of the box. The handler automatically:
- Filters LangGraph-internal chain events (graph executor, __start__, __end__, channel nodes)
- Deduplicates LLM callbacks that LangGraph fires multiple times with the same runId
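The two behaviors above can be sketched in isolation. This is an illustration of the idea, not the package's actual implementation; the internal-name list and the `channel_` prefix are assumptions.

```typescript
// Sketch of the filtering and deduplication behaviors described above.
// Names and patterns here are assumptions, not the package's real logic.
const LANGGRAPH_INTERNALS = new Set(["__start__", "__end__", "LangGraph"]);

function isInternalChain(name: string): boolean {
  // Channel-node naming is an assumed pattern for illustration.
  return LANGGRAPH_INTERNALS.has(name) || name.startsWith("channel_");
}

const seen = new Set<string>();
function shouldCapture(runId: string): boolean {
  if (seen.has(runId)) return false; // duplicate callback, same runId
  seen.add(runId);
  return true;
}

console.log(isInternalChain("__start__"));    // true
console.log(isInternalChain("my_rag_chain")); // false
console.log(shouldCapture("run-1"));          // true
console.log(shouldCapture("run-1"));          // false (deduplicated)
```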
Pass the handler either at graph.invoke() or inside your LLM node. See examples/langchain-langgraph-basic/ for a full example.
Best practices:
- Pass callbacks to the model inside your node, not to graph.invoke(); this avoids duplicate events from LangGraph's internal callback propagation
- Create a new handler instance per request in server environments
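The per-request recommendation can be sketched with plain TypeScript. Here `createHandler` is a stand-in for createRaindropLangChain, used only to show why concurrent requests should not share one handler's event state.

```typescript
// Sketch only: `createHandler` stands in for createRaindropLangChain,
// showing why each server request should get its own handler instance.
interface Handler {
  events: string[];
  record(event: string): void;
}

function createHandler(): Handler {
  return {
    events: [],
    record(event: string) {
      this.events.push(event);
    },
  };
}

function handleRequest(userInput: string): Handler {
  // A fresh handler per request keeps each trace's events isolated;
  // a shared instance would interleave events from concurrent requests.
  const handler = createHandler();
  handler.record(`llm_call:${userInput}`);
  return handler;
}

const a = handleRequest("first");
const b = handleRequest("second");
console.log(a.events); // [ 'llm_call:first' ] — isolated from b
```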
LangSmith Coexistence
Raindrop and LangSmith can run simultaneously — both receive the same LangChain callbacks independently. Set LANGSMITH_TRACING=false to disable LangSmith if you only want Raindrop.
Testing
```bash
pnpm test
```

Tests use MSW to intercept HTTP requests; no real LLM calls are made.
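The interception idea behind those tests can be sketched without MSW by swapping the network layer for a stub that records requests. The URL path below is a hypothetical example, not the package's real route.

```typescript
// Sketch of HTTP interception for tests, in the spirit of the MSW setup
// described above (the real tests use msw; this stub only illustrates
// the idea). The /events path is hypothetical.
type FetchLike = (url: string, init?: { body?: string }) => Promise<{ status: number }>;

const captured: string[] = [];

const fakeFetch: FetchLike = async (url) => {
  captured.push(url); // record the request instead of hitting the network
  return { status: 200 };
};

async function sendEvent(fetchImpl: FetchLike, event: object): Promise<void> {
  await fetchImpl("https://api.raindrop.ai/v1/events", {
    body: JSON.stringify(event),
  });
}

(async () => {
  await sendEvent(fakeFetch, { type: "llm_call" });
  console.log(captured); // [ 'https://api.raindrop.ai/v1/events' ]
})();
```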
