# imean-service-engine-plugin-langsmith-trace

v1.0.0
LangSmith tracing plugin for imean-service-engine.
This plugin provides seamless integration with LangSmith for distributed tracing and observability of your microservices.
## Features
- 🔍 Automatic Tracing: Trace method calls with a simple decorator
- 🌐 Distributed Tracing: Propagate trace context across service boundaries
- ⚙️ Configurable: Module-level and method-level configuration
- 🚀 Zero-Config: Works out of the box with sensible defaults
- 📊 Rich Context: Captures method arguments and results
## Installation

```bash
pnpm add imean-service-engine-plugin-langsmith-trace langsmith
```

Prerequisites:

- `imean-service-engine` >= 2.5.0
- `langsmith` >= 0.4.10
- Node.js >= 18.0.0
## Quick Start
### 1. Configure LangSmith
Set the required environment variables:
```bash
export LANGSMITH_API_KEY="your-api-key"
export LANGSMITH_TRACING="true"
export LANGSMITH_PROJECT="your-project-name"
```

### 2. Register the Plugin
```typescript
import { Factory } from "imean-service-engine";
import { LangsmithPlugin } from "imean-service-engine-plugin-langsmith-trace";

const engine = Factory.create({
  name: "my-service",
  version: "1.0.0",
  plugins: [new LangsmithPlugin()],
});
```

### 3. Use the @Traceable Decorator
```typescript
import { Traceable } from "imean-service-engine-plugin-langsmith-trace";

@engine.Module("UserService", { tracePrefix: "MyApp" })
class UserService {
  @Traceable()
  async getUser(userId: string) {
    // Your logic here
    return { id: userId, name: "John" };
  }

  @Traceable({ name: "fetch-user-profile" })
  async getUserProfile(userId: string) {
    // Custom trace name
    return { userId, profile: "..." };
  }
}

await engine.start();
```

## API Reference
### LangsmithPlugin
The main plugin class that integrates with imean-service-engine.
```typescript
const plugin = new LangsmithPlugin();
```

### @Traceable Decorator
Marks a method for automatic tracing.
```typescript
@Traceable(options?: TraceableOptions)
```

#### Options
```typescript
interface TraceableOptions {
  /** Custom trace name (default: `${tracePrefix}.${methodName}`) */
  name?: string;
  /** Enable/disable tracing for this method (default: true) */
  enabled?: boolean;
  /** Additional RunTreeConfig options from LangSmith */
  [key: string]: any;
}
```

### Module Configuration
Configure tracing at the module level:
```typescript
@engine.Module("MyModule", {
  tracePrefix: "MyApp", // Prefix for all traces in this module
  traceEnabled: true, // Enable/disable tracing for the entire module
})
class MyModule {
  // ...
}
```

### traceFetch Utility
Automatically inject trace context into HTTP requests:
```typescript
import { traceFetch } from "imean-service-engine-plugin-langsmith-trace";

// Wrap the global fetch
globalThis.fetch = traceFetch(globalThis.fetch);

// All fetch calls now propagate trace context
const response = await fetch("https://api.example.com/data");
```

## Usage Examples
### Basic Tracing
```typescript
@engine.Module("OrderService")
class OrderService {
  @Traceable()
  async createOrder(order: Order) {
    return await this.db.orders.create(order);
  }
}
```

### Custom Trace Names
```typescript
@Traceable({ name: "process-payment-with-stripe" })
async processPayment(amount: number) {
  return await stripe.charges.create({ amount, currency: "usd" });
}
```

### Conditional Tracing
```typescript
@Traceable({ enabled: process.env.NODE_ENV === "production" })
async debugMethod() {
  // Only traced in production
}
```

### Module-Level Configuration
```typescript
@engine.Module("AIService", {
  tracePrefix: "AI",
  traceEnabled: true,
})
class AIService {
  @Traceable() // Trace name: "AI.generateResponse"
  async generateResponse(prompt: string) {
    return await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: prompt }],
    });
  }

  @Traceable({ name: "embed-text" }) // Trace name: "embed-text"
  async embedText(text: string) {
    return await openai.embeddings.create({
      model: "text-embedding-3-small",
      input: text,
    });
  }
}
```

### Cross-Service Tracing
```typescript
import { traceFetch } from "imean-service-engine-plugin-langsmith-trace";

// Service A
globalThis.fetch = traceFetch(globalThis.fetch);

@engine.Module("ServiceA")
class ServiceA {
  @Traceable()
  async callServiceB() {
    // Trace context is automatically propagated
    const response = await fetch("http://service-b/api/data");
    return response.json();
  }
}

// Service B (with LangsmithPlugin registered) automatically
// picks up trace context from incoming headers
@engine.Module("ServiceB")
class ServiceB {
  @Traceable()
  async getData() {
    return { data: "..." };
  }
}
```

## How It Works
- Middleware Integration: The plugin registers a Hono middleware that extracts LangSmith trace context from incoming HTTP headers
- Method Wrapping: Methods decorated with `@Traceable` are automatically wrapped to create trace spans
- Context Propagation: Trace context is stored in async local storage and automatically propagated to child operations
- Header Injection: The `traceFetch` utility injects trace headers into outgoing HTTP requests
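The propagation and header-injection steps above can be sketched in plain Node.js. This is an illustrative stand-in, not the plugin's actual API: `withSpan`, `injectTraceHeaders`, and the `x-trace-id` header name are invented for the sketch, whereas the real plugin uses LangSmith's RunTree and header format.

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Hypothetical span context; the real plugin carries a LangSmith RunTree.
interface SpanContext {
  traceId: string;
  name: string;
}

const traceStore = new AsyncLocalStorage<SpanContext>();

// Run a function inside a trace context, inheriting the parent's traceId
// if one exists — roughly what a @Traceable-style wrapper does.
function withSpan<T>(name: string, fn: () => T): T {
  const parent = traceStore.getStore();
  const ctx: SpanContext = {
    traceId: parent?.traceId ?? Math.random().toString(16).slice(2),
    name,
  };
  return traceStore.run(ctx, fn);
}

// Copy the current trace context into outgoing request headers,
// roughly what a traceFetch-style wrapper does before calling fetch.
function injectTraceHeaders(
  headers: Record<string, string>,
): Record<string, string> {
  const ctx = traceStore.getStore();
  return ctx ? { ...headers, "x-trace-id": ctx.traceId } : headers;
}
```

Child operations started inside `withSpan` inherit the parent's `traceId` automatically, because `AsyncLocalStorage` scopes the context to the callback and everything it awaits.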
## Best Practices
- Use Descriptive Trace Names: Help identify operations in your trace viewer

  ```typescript
  @Traceable({ name: "fetch-user-recommendations-ml" })
  ```

- Set Module Prefixes: Organize traces by service/module

  ```typescript
  @engine.Module("RecommendationEngine", { tracePrefix: "ML.Recommendations" })
  ```

- Trace at the Right Level: Don't trace every tiny function; focus on meaningful operations

- Use Environment Variables: Control tracing behavior without code changes

  ```bash
  LANGSMITH_TRACE_PREFIX="MyApp"
  ```

- Wrap fetch Early: Apply `traceFetch` at application startup

  ```typescript
  globalThis.fetch = traceFetch(globalThis.fetch);
  await engine.start();
  ```
## Troubleshooting
### Traces Not Appearing in LangSmith
1. Verify the environment variables are set:

   ```bash
   echo $LANGSMITH_API_KEY
   echo $LANGSMITH_TRACING
   ```

2. Check that the plugin is registered before starting the engine
3. Ensure HTTP requests include trace headers (use `traceFetch`)
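The first check can also be done in code, failing fast at startup when tracing is enabled but misconfigured. This is a minimal sketch; the `assertTracingConfig` helper below is hypothetical and not part of the plugin.

```typescript
// Throw at startup if tracing is enabled without credentials (illustrative).
function assertTracingConfig(
  env: Record<string, string | undefined> = process.env,
): void {
  if (env.LANGSMITH_TRACING === "true" && !env.LANGSMITH_API_KEY) {
    throw new Error(
      "LANGSMITH_TRACING is true but LANGSMITH_API_KEY is not set",
    );
  }
}
```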
### Type Errors
Make sure you have the correct peer dependencies installed:
```bash
pnpm add imean-service-engine@^2.5.0 langsmith@^0.4.10
```

## License
MIT
## Contributing
Contributions are welcome! Please open an issue or submit a pull request.
