@raindrop-ai/ai-sdk
Standalone Vercel AI SDK integration for Raindrop:
- Events: sends a `track_partial` payload to `POST /v1/events/track_partial` when the model finishes
- Standalone traces: ships spans directly to `POST /v1/traces` as OTLP/HTTP JSON
- No OpenTelemetry SDK init: avoids global OTEL registration conflicts
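The package builds and posts the OTLP/HTTP JSON body itself, so there is nothing to configure. Purely to illustrate the wire format involved, an OTLP/HTTP JSON export to `POST /v1/traces` looks roughly like the sketch below; the host, span fields, and values are illustrative, not the exact payload this package emits:

```ts
// Illustrative OTLP/HTTP JSON trace export; not the package's exact payload.
const otlpBody = {
  resourceSpans: [
    {
      resource: { attributes: [{ key: "service.name", value: { stringValue: "my-app" } }] },
      scopeSpans: [
        {
          scope: { name: "ai" },
          spans: [
            {
              traceId: "5b8efff798038103d269b633813fc60c",
              spanId: "eee19b7ec3c1b174",
              name: "ai.generateText",
              startTimeUnixNano: "1700000000000000000",
              endTimeUnixNano: "1700000001000000000",
              attributes: [
                { key: "ai.telemetry.metadata.raindrop.eventId", value: { stringValue: "evt_123" } },
              ],
            },
          ],
        },
      ],
    },
  ],
};

// Shipped with a plain HTTP POST, no OpenTelemetry SDK required (placeholder host):
await fetch("https://api.raindrop.example/v1/traces", {
  method: "POST",
  headers: { "content-type": "application/json" },
  body: JSON.stringify(otlpBody),
});
```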
Install
```bash
yarn add @raindrop-ai/ai-sdk
```

Usage

```ts
import * as ai from "ai";
import { createRaindropAISDK } from "@raindrop-ai/ai-sdk";

const raindrop = createRaindropAISDK({
  writeKey: process.env.RAINDROP_WRITE_KEY!,
});

const { generateText } = raindrop.wrap(ai, {
  context: { userId: "user_123", convoId: "convo_456", eventName: "chat_message" },
  // optional: full control over event input/output, metadata, and attachments
  buildEvent: (messages) => {
    const lastUser = [...messages].reverse().find((m) => m.role === "user");
    const lastAssistant = [...messages].reverse().find((m) => m.role === "assistant");
    return {
      input: typeof lastUser?.content === "string" ? lastUser.content : undefined,
      output: typeof lastAssistant?.content === "string" ? lastAssistant.content : undefined,
    };
  },
});

const result = await generateText({
  model: /* your AI SDK model */,
  prompt: "Hello!",
});

// Identify a user (optional)
await raindrop.users.identify({
  userId: "user_123",
  traits: { plan: "pro" },
});

await raindrop.flush();
```

Runtime support
Node.js (recommended)
Use the default import:
```ts
import { createRaindropAISDK } from "@raindrop-ai/ai-sdk";
```

Cloudflare Workers

Cloudflare Workers can provide AsyncLocalStorage via `node:async_hooks` when the `nodejs_compat` compatibility flag is enabled (see the Cloudflare docs).

Use the Workers entrypoint:

```ts
import { createRaindropAISDK } from "@raindrop-ai/ai-sdk/workers";
```

If `nodejs_compat` is not enabled, AsyncLocalStorage-based context propagation cannot work.
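As a concrete sketch (not taken from the package docs), a minimal Worker using the Workers entrypoint might look like the following, assuming `nodejs_compat` is enabled and `RAINDROP_WRITE_KEY` is configured as a Worker secret; supply your own model, as in the usage example above:

```ts
// worker.ts — illustrative sketch only.
import * as ai from "ai";
import type { LanguageModel } from "ai";
import { createRaindropAISDK } from "@raindrop-ai/ai-sdk/workers";

declare const yourModel: LanguageModel; // your AI SDK model, as in the usage example above

interface Env {
  RAINDROP_WRITE_KEY: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const raindrop = createRaindropAISDK({ writeKey: env.RAINDROP_WRITE_KEY });
    const { generateText } = raindrop.wrap(ai, {
      context: { userId: "user_123", convoId: "convo_456", eventName: "chat_message" },
    });

    const result = await generateText({
      model: yourModel,
      prompt: await request.text(),
    });

    // Flush before returning so queued events/traces are delivered.
    await raindrop.flush();
    return new Response(result.text);
  },
};
```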
Supported AI SDK Versions
This package is tested against multiple Vercel AI SDK versions:
| Version | Status |
|---------|--------|
| v4.x | ✅ Supported |
| v5.x | ✅ Supported |
| v6.x | ✅ Supported |
Version Differences Handled
| Feature | v4 | v5 | v6 |
|---------|----|----|-----|
| finishReason | String ("stop") | String ("stop") | Object ({ unified: "stop" }) |
| usage tokens | promptTokens/completionTokens | inputTokens/outputTokens | inputTokens/outputTokens |
| Output.object().responseFormat | N/A | Plain object | Promise |
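If your own code consumes results directly and must run against more than one AI SDK major version, these are the differences to normalize. A rough sketch follows; the field names come from the table above, but the helper itself is hypothetical and not part of this package:

```ts
// Hypothetical helper: normalize finishReason and usage naming across AI SDK v4/v5/v6.
type AnyUsage = {
  promptTokens?: number;     // v4
  completionTokens?: number; // v4
  inputTokens?: number;      // v5/v6
  outputTokens?: number;     // v5/v6
};

function normalizeResult(result: {
  finishReason: string | { unified: string };
  usage: AnyUsage;
}) {
  // v6 wraps the finish reason in an object ({ unified: "stop" }); v4/v5 use a plain string.
  const finishReason =
    typeof result.finishReason === "string"
      ? result.finishReason
      : result.finishReason.unified;

  return {
    finishReason,
    inputTokens: result.usage.inputTokens ?? result.usage.promptTokens,
    outputTokens: result.usage.outputTokens ?? result.usage.completionTokens,
  };
}
```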
Testing
Tests are organized to verify compatibility across AI SDK versions:
```
packages/ai-sdk/
├── tests/
│   ├── v4/                    # AI SDK v4 (pins ai@^4.1.17)
│   │   ├── ai-sdk.v4.test.ts
│   │   ├── wrapper.test.ts
│   │   └── http-payloads.test.ts
│   ├── v5/                    # AI SDK v5 (pins ai@^5.0.0)
│   │   ├── ai-sdk.v5.test.ts
│   │   ├── wrapper.test.ts
│   │   └── http-payloads.test.ts
│   └── v6/                    # AI SDK v6 (pins ai@^6.0.0)
│       ├── ai-sdk.v6.test.ts
│       ├── wrapper.test.ts
│       └── http-payloads.test.ts
```

Running Tests

```bash
# Run all version tests (requires OPENAI_API_KEY and RAINDROP_WRITE_KEY in .env)
yarn test
# Run specific version
yarn test:v4
yarn test:v5
yarn test:v6
# Quick smoke test (real LLM calls, single version)
yarn smoke:min
```

Test Coverage
Each version runs:
- Wrapper tests - API shape, wrapper creation, tools passthrough
- HTTP payload tests - MSW-based payload validation for each spec version
- Version-specific tests - API differences (finishReason format, usage naming)
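The payload tests are MSW-based. Purely to illustrate that general pattern, intercepting the two endpoints looks roughly like the sketch below; the base URL, handlers, and flow are placeholders, not the package's actual test code:

```ts
// Illustration of MSW-based payload interception; not the package's real tests.
import { http, HttpResponse } from "msw";
import { setupServer } from "msw/node";

const captured: unknown[] = [];

const server = setupServer(
  http.post("https://api.example.com/v1/events/track_partial", async ({ request }) => {
    captured.push(await request.json()); // inspect the track_partial payload in assertions
    return HttpResponse.json({ ok: true });
  }),
  http.post("https://api.example.com/v1/traces", () => HttpResponse.json({ ok: true })),
);

server.listen();
// ...run the wrapped generateText call, then assert on `captured`...
server.close();
```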
Notes
- Spans include `ai.telemetry.metadata.raindrop.eventId` for correlation, and omit `ai.telemetry.metadata.raindrop.userId` to prevent duplicate span→event creation server-side.
