Crucible SDK for Node.js
A high-performance Node.js SDK for logging and monitoring OpenAI and Anthropic API calls to the Crucible warehouse.
Features
- Seamless Integration: Drop-in replacement for OpenAI's and Anthropic's Node.js clients
- Background Logging: Non-blocking, batched logging with configurable intervals
- Streaming Support: Efficient handling of streaming responses with memory optimization
- Error Resilience: Logging failures never break your application
- Performance Optimized: <1ms overhead per request, <50MB memory usage
- Async Support: Full async/await support for high-concurrency applications
- Rich Tagging: Flexible metadata system for organizing and filtering logs
- Circuit Breaker: Automatic failure detection and recovery
- Compression: Optional request/response compression for network efficiency
- TypeScript Support: Full TypeScript support with comprehensive type definitions
- LangChain Integration: Drop-in replacement for LangChain's ChatOpenAI and ChatAnthropic
- Vercel AI SDK Integration: Drop-in replacement for Vercel AI SDK functions with automatic Crucible tracing
Installation
npm install crucible-ai-sdk
# or
yarn add crucible-ai-sdk
Quick Start
Basic Usage (OpenAI)
import { CrucibleOpenAI } from 'crucible-ai-sdk';
// Initialize client (uses warehouse.usecrucible.ai by default)
const client = new CrucibleOpenAI('your-crucible-api-key');
// Make API calls (automatically logged)
const response = await client.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'Hello, world!' }],
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456',
userId: 'user_789'
}
});
console.log(response.choices[0].message.content);
// Clean up
await client.close();
Basic Usage (Anthropic)
import { CrucibleAnthropic } from 'crucible-ai-sdk';
// Initialize Anthropic client
const client = new CrucibleAnthropic('your-crucible-api-key');
// Make API calls (automatically logged)
const response = await client.messages.create({
model: 'claude-3-sonnet-20240229',
messages: [{ role: 'user', content: 'Hello, world!' }],
maxTokens: 100,
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456',
userId: 'user_789'
}
});
console.log(response.content[0].text);
// Clean up
await client.close();
Async Usage (OpenAI)
import { CrucibleAsyncOpenAI } from 'crucible-ai-sdk';
async function main() {
const client = new CrucibleAsyncOpenAI('your-crucible-api-key');
const response = await client.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'Hello, async world!' }],
crucibleMetadata: {
threadId: 'async_thread_123',
taskId: 'async_task_456'
}
});
console.log(response.choices[0].message.content);
await client.close();
}
main();
Async Usage (Anthropic)
import { CrucibleAsyncAnthropic } from 'crucible-ai-sdk';
async function main() {
const client = new CrucibleAsyncAnthropic('your-crucible-api-key');
const response = await client.messages.create({
model: 'claude-3-sonnet-20240229',
messages: [{ role: 'user', content: 'Hello, async world!' }],
maxTokens: 100,
crucibleMetadata: {
threadId: 'async_thread_123',
taskId: 'async_task_456'
}
});
console.log(response.content[0].text);
await client.close();
}
main();
Streaming Support (OpenAI)
import { CrucibleOpenAI } from 'crucible-ai-sdk';
const client = new CrucibleOpenAI('your-crucible-api-key');
// Streaming responses are automatically logged
const stream = await client.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'Count from 1 to 5' }],
stream: true,
crucibleMetadata: {
sessionId: 'session_abc',
experiment: 'streaming_test'
}
});
for await (const chunk of stream) {
if (chunk.choices[0].delta.content) {
process.stdout.write(chunk.choices[0].delta.content);
}
}
Streaming Support (Anthropic)
import { CrucibleAnthropic } from 'crucible-ai-sdk';
const client = new CrucibleAnthropic('your-crucible-api-key');
// Streaming responses are automatically logged
const stream = await client.messages.create({
model: 'claude-3-sonnet-20240229',
messages: [{ role: 'user', content: 'Count from 1 to 5' }],
maxTokens: 100,
stream: true,
crucibleMetadata: {
sessionId: 'session_abc',
experiment: 'streaming_test'
}
});
for await (const chunk of stream) {
if (chunk.type === 'content_block_delta') {
process.stdout.write(chunk.delta.text);
}
}
Custom Domain
import { CrucibleOpenAI } from 'crucible-ai-sdk';
// Use custom domain
const client = new CrucibleOpenAI('your-crucible-api-key', 'custom.warehouse.com');
// Make API call with metadata
const response = await client.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'Hello from custom domain!' }],
crucibleMetadata: {
domain: 'custom',
environment: 'production'
}
});
Automatic Cleanup
import { CrucibleOpenAI } from 'crucible-ai-sdk';
// Ensure logs are flushed and the client is closed with try/finally
async function example() {
const client = new CrucibleOpenAI('your-crucible-api-key');
try {
const response = await client.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'Hello!' }],
crucibleMetadata: {
contextTest: true,
sessionId: 'context_session'
}
});
console.log(response.choices[0].message.content);
// Logs are automatically flushed when the client is closed
} finally {
await client.close();
}
}
LangChain Integration
Crucible provides drop-in replacements for LangChain's ChatOpenAI and ChatAnthropic:
import { ChatOpenAI, ChatAnthropic } from 'crucible-ai-sdk/langchain';
// OpenAI with LangChain interface
const openaiClient = new ChatOpenAI({
openaiApiKey: 'your-openai-key',
crucibleDomain: 'your-crucible-domain',
model: 'gpt-3.5-turbo',
maxTokens: 100,
});
// Anthropic with LangChain interface
const anthropicClient = new ChatAnthropic({
anthropicApiKey: 'your-anthropic-key',
crucibleDomain: 'your-crucible-domain',
model: 'claude-3-sonnet-20240229',
maxTokens: 100,
});
// Use exactly like LangChain's ChatOpenAI/ChatAnthropic
const response = await openaiClient.invoke([
{ role: 'user', content: 'Hello from LangChain!' }
], {
crucibleMetadata: {
threadId: 'langchain_thread',
taskId: 'langchain_task'
}
});
console.log(response.content);
Vercel AI SDK Integration
Crucible provides drop-in replacements for Vercel AI SDK functions with automatic Crucible tracing:
import { wrapAISDK } from 'crucible-ai-sdk/vercel-ai-sdk';
import * as ai from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { z } from 'zod';
// Wrap AI SDK with Crucible tracing
const { generateText, generateObject, streamText, streamObject } = wrapAISDK(ai, {
apiKey: 'your-crucible-api-key',
domain: 'your-crucible-domain',
});
// Generate text with OpenAI model
const openaiResponse = await generateText({
model: openai('gpt-3.5-turbo'),
prompt: 'What is the capital of France?',
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456',
}
});
console.log(openaiResponse.text);
// Generate structured object with Anthropic model
const anthropicResponse = await generateObject({
model: anthropic('claude-3-sonnet-20240229'),
prompt: 'Generate a recipe for lasagna',
schema: z.object({
name: z.string(),
ingredients: z.array(z.string()),
steps: z.array(z.string())
}),
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456',
}
});
console.log(anthropicResponse.object);
// Stream text responses
const streamResult = await streamText({
model: openai('gpt-3.5-turbo'),
prompt: 'Write a short poem about AI',
crucibleMetadata: {
sessionId: 'session_abc',
experiment: 'streaming_test'
}
});
for await (const textPart of streamResult.textStream) {
process.stdout.write(textPart);
}
Middleware Integration
You can also use Crucible middleware directly with Vercel AI SDK's wrapLanguageModel:
import { generateText, wrapLanguageModel } from 'ai';
import { openai } from '@ai-sdk/openai';
import { CrucibleMiddleware } from 'crucible-ai-sdk/vercel-ai-sdk';
// Wrap model with Crucible middleware
const model = wrapLanguageModel({
model: openai('gpt-4'),
middleware: CrucibleMiddleware({
apiKey: 'your-crucible-api-key',
domain: 'your-crucible-domain'
})
});
// Use the wrapped model
const result = await generateText({
model,
prompt: 'Hello world',
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456'
}
});
Configuration
Environment Variables
export CRUCIBLE_API_KEY="your-api-key"
export CRUCIBLE_DOMAIN="warehouse.usecrucible.ai" # Optional, defaults to warehouse.usecrucible.ai
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
Configuration Options
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| apiKey | string | None | Crucible API key |
| domain | string | "warehouse.usecrucible.ai" | Crucible warehouse domain |
| batchSize | number | 10 | Number of requests to batch together |
| flushInterval | number | 5.0 | Seconds between batch flushes |
| maxRetries | number | 3 | Number of retry attempts |
| timeout | number | 30.0 | Request timeout in seconds |
| enableLogging | boolean | true | Enable/disable logging |
| enableCompression | boolean | true | Enable request compression |
| maxMemoryMb | number | 50 | Maximum memory usage in MB |
| maxQueueSize | number | 1000 | Maximum queue size for background logging |
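These options correspond to the CrucibleConfig class described under Advanced Usage. Below is a minimal tuning sketch, assuming the CrucibleConfig constructor accepts these options as a single object (only apiKey, domain, batchSize, flushInterval, enableLogging, and enableCompression are documented as properties):
import { CrucibleConfig, CrucibleLogger } from 'crucible-ai-sdk';
// Trade a little latency for throughput: bigger batches, slower flushes
const config = new CrucibleConfig({
  apiKey: 'your-crucible-api-key',
  domain: 'warehouse.usecrucible.ai',
  batchSize: 50,       // default is 10
  flushInterval: 10.0, // seconds; default is 5.0
  enableCompression: true
});
const logger = new CrucibleLogger(config);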
Metadata System
Use metadata to organize and filter your logged requests:
client.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'What is the capital of France?' }],
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456',
userId: 'user_789',
sessionId: 'session_abc',
questionType: 'geography',
difficulty: 'easy',
experiment: 'knowledge_test'
}
});
Error Handling
Crucible SDK is designed to be resilient. Logging failures never break your application:
import { CrucibleOpenAI } from 'crucible-ai-sdk';
const client = new CrucibleOpenAI('your-crucible-api-key');
try {
// This will fail, but error will be logged
const response = await client.chat.completions.create({
model: 'gpt-3.5-turbo-invalid',
messages: [{ role: 'user', content: 'This will fail' }],
crucibleMetadata: {
errorTest: true,
taskId: 'error_test_123'
}
});
} catch (error) {
console.log(`API call failed: ${error.message}`);
// Error was automatically logged to Crucible warehouse
}
Performance Monitoring
Monitor the performance of your logging system:
import { CrucibleOpenAI } from 'crucible-ai-sdk';
const client = new CrucibleOpenAI('your-crucible-api-key');
// Make some API calls...
const response = await client.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'Hello!' }],
crucibleMetadata: { test: 'performance' }
});
// Get logging statistics
const stats = client.getLoggingStats();
console.log('Logging stats:', stats);
// Clean up
await client.close();
Advanced Usage
Manual Logging
import { CrucibleLogger, LogRequest, CrucibleConfig } from 'crucible-ai-sdk';
const config = new CrucibleConfig({
apiKey: 'your-api-key',
domain: 'warehouse.usecrucible.ai'
});
const logger = new CrucibleLogger(config);
// Create log request manually
const logRequest: LogRequest = {
requestedAt: Date.now(),
receivedAt: Date.now(),
reqPayload: { model: 'gpt-3.5-turbo', messages: [] },
respPayload: { choices: [] },
statusCode: 200,
metadata: { manual: 'true', taskId: 'manual_123' },
tags: {},
sdkVersion: '0.1.0',
sdkName: 'node'
};
// Log manually
logger.logRequest(logRequest);
// Clean up
logger.close();
Streaming Statistics
import { StreamingMerger } from 'crucible-ai-sdk';
const merger = new StreamingMerger(100); // 100MB limit
// Process chunks, accumulating them into one assembled response
let assembled = null; // initial accumulator value assumed by this snippet
for await (const chunk of stream) {
assembled = merger.mergeChunk(assembled, chunk);
}
// Get streaming statistics
const stats = merger.getStats();
console.log(`Memory usage: ${stats.currentSize} bytes`);
console.log(`Chunks processed: ${stats.totalChunksProcessed}`);
Development
Running Tests
npm test
# or
yarn test
Running Examples
# Set environment variables
export OPENAI_API_KEY="your-openai-key"
export CRUCIBLE_API_KEY="your-crucible-key"
# Run examples
npm run example:basic
npm run example:async
npm run example:streaming
npm run example:anthropic
npm run example:anthropic-async
npm run example:anthropic-langchain
npm run example:vercel-ai
API Reference
CrucibleOpenAI
Main client class for synchronous operations.
Methods
- chat.completions.create(options): Create a chat completion with logging
- close(): Close the client and flush logs
- flushLogs(): Force flush pending logs
- getLoggingStats(): Get logging statistics
- isHealthy(): Check whether the logger is healthy
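A short sketch exercising the housekeeping methods above; that flushLogs() returns a promise (like close()) is an assumption:
import { CrucibleOpenAI } from 'crucible-ai-sdk';

const client = new CrucibleOpenAI('your-crucible-api-key');
// ... make some chat.completions.create calls ...
if (!client.isHealthy()) {
  console.warn('Crucible logger unhealthy; recent logs may be queued or dropped');
}
await client.flushLogs(); // assumed to return a promise, like close()
console.log(client.getLoggingStats());
await client.close();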
CrucibleAsyncOpenAI
Async client class for asynchronous operations.
Methods
- chat.completions.create(options): Create an async chat completion with logging
- close(): Close the client and flush logs
- flushLogs(): Force flush pending logs
- getLoggingStats(): Get logging statistics
- isHealthy(): Check whether the logger is healthy
CrucibleAnthropic
Main client class for Anthropic synchronous operations.
Methods
- messages.create(options): Create an Anthropic message completion with logging
- close(): Close the client and flush logs
- flushLogs(): Force flush pending logs
- getLoggingStats(): Get logging statistics
- isHealthy(): Check whether the logger is healthy
CrucibleAsyncAnthropic
Async client class for Anthropic asynchronous operations.
Methods
- messages.create(options): Create an async Anthropic message completion with logging
- close(): Close the client and flush logs
- flushLogs(): Force flush pending logs
- getLoggingStats(): Get logging statistics
- isHealthy(): Check whether the logger is healthy
ChatOpenAI (LangChain)
Drop-in replacement for LangChain's ChatOpenAI with automatic Crucible logging.
Methods
- invoke(input, config): LangChain-compatible invoke method
- ainvoke(input, config): LangChain-compatible async invoke method
- stream(input, config): LangChain-compatible stream method
- withMetadata(metadata): Add metadata to requests
- close(): Close the client and flush logs
- flush(): Force flush pending logs
- getLoggingStats(): Get logging statistics
- isHealthy(): Check whether the client is healthy
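A hedged sketch of the streaming and metadata helpers above. It assumes stream() yields LangChain-style chunks with a content field, and that withMetadata() returns a client with the metadata attached; neither behavior is spelled out beyond the method list:
import { ChatOpenAI } from 'crucible-ai-sdk/langchain';

const chat = new ChatOpenAI({
  openaiApiKey: 'your-openai-key',
  crucibleDomain: 'your-crucible-domain',
  model: 'gpt-3.5-turbo',
});
// Assumption: withMetadata() attaches metadata to subsequent requests
const tagged = chat.withMetadata({ threadId: 'stream_thread' });
// Assumption: stream() yields chunks with a content field, as in LangChain
const stream = await tagged.stream([{ role: 'user', content: 'Count to 3' }]);
for await (const chunk of stream) {
  process.stdout.write(String(chunk.content));
}
await chat.close();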
ChatAnthropic (LangChain)
Drop-in replacement for LangChain's ChatAnthropic with automatic Crucible logging.
Methods
- invoke(input, config): LangChain-compatible invoke method
- ainvoke(input, config): LangChain-compatible async invoke method
- stream(input, config): LangChain-compatible stream method
- withMetadata(metadata): Add metadata to requests
- close(): Close the client and flush logs
- flush(): Force flush pending logs
- getLoggingStats(): Get logging statistics
- isHealthy(): Check whether the client is healthy
wrapAISDK
Wraps Vercel AI SDK methods with Crucible tracing.
Parameters
- ai: The AI SDK namespace (e.g., import * as ai from 'ai')
- config: Crucible configuration options
Returns
An object containing the AI SDK methods (generateText, streamText, generateObject, streamObject), each wrapped with Crucible tracing.
Example
import { wrapAISDK } from 'crucible-ai-sdk/vercel-ai-sdk';
import * as ai from 'ai';
const { generateText, streamText, generateObject, streamObject } = wrapAISDK(ai, {
apiKey: 'your-api-key',
domain: 'your-domain'
});
CrucibleMiddleware
Creates a Crucible middleware for AI SDK v5 that automatically traces generateText and streamText calls.
Parameters
- config: Configuration options for the middleware
Returns
A middleware object compatible with AI SDK v5's wrapLanguageModel.
Example
import { CrucibleMiddleware } from 'crucible-ai-sdk/vercel-ai-sdk';
const middleware = CrucibleMiddleware({
apiKey: 'your-api-key',
domain: 'your-domain'
});
CrucibleConfig
Configuration class for Crucible SDK.
Properties
- apiKey: Crucible API key
- domain: Crucible warehouse domain (defaults to warehouse.usecrucible.ai)
- batchSize: Batch size for logging
- flushInterval: Flush interval in seconds
- enableLogging: Enable/disable logging
- enableCompression: Enable/disable compression
TypeScript Support
This SDK is written in TypeScript and provides comprehensive type definitions:
import { CrucibleOpenAI, CrucibleAnthropic, CrucibleMetadata, OpenAIRequestOptions, AnthropicRequestOptions } from 'crucible-ai-sdk';
import { wrapAISDK, CrucibleMiddleware } from 'crucible-ai-sdk/vercel-ai-sdk';
// OpenAI client
const openaiClient = new CrucibleOpenAI('your-api-key');
const openaiOptions: OpenAIRequestOptions = {
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'Hello!' }],
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456'
} as CrucibleMetadata
};
const openaiResponse = await openaiClient.chat.completions.create(openaiOptions);
// Anthropic client
const anthropicClient = new CrucibleAnthropic('your-api-key');
const anthropicOptions: AnthropicRequestOptions = {
model: 'claude-3-sonnet-20240229',
messages: [{ role: 'user', content: 'Hello!' }],
maxTokens: 100,
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456'
} as CrucibleMetadata
};
const anthropicResponse = await anthropicClient.messages.create(anthropicOptions);
// Vercel AI SDK integration
import * as ai from 'ai';
import { openai } from '@ai-sdk/openai';
const { generateText } = wrapAISDK(ai, {
apiKey: 'your-api-key',
domain: 'your-domain'
});
const aiSdkResponse = await generateText({
model: openai('gpt-3.5-turbo'),
prompt: 'Hello from Vercel AI SDK!',
crucibleMetadata: {
threadId: 'thread_123',
taskId: 'task_456'
} as CrucibleMetadata
});
License
MIT License - see LICENSE file for details.
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
Support
- Documentation: https://docs.crucible.ai
- Issues: https://github.com/crucible/crucible-node-sdk/issues
- Email: [email protected]
