march-ai-sdk v0.5.1
March Agent SDK (TypeScript)
TypeScript SDK for building AI agents in the March AI platform. Provides agent registration, message handling via Kafka (through Agent Gateway), streaming responses, and integrations with LangGraph and Vercel AI SDK.
Quick Start
The fastest way to get started is with the CLI:
npx march-start my-agent
cd my-agent
# Edit .env with your configuration
pnpm dev
Overview
The March Agent SDK (march-ai-sdk) enables developers to build AI agents that integrate with the March AI Management platform. Agents register themselves, receive messages through Kafka topics, and stream responses back to users.
Key Features
- Agent Registration: Automatic registration with AI Inventory
- Message Handling: Function-based handlers with sender filtering
- Response Streaming: Chunked responses via Kafka/Centrifugo
- Conversation History: Access to message history via Conversation Store
- LangGraph Integration: HTTP-based checkpoint saver for LangGraph graphs
- Vercel AI SDK Integration: Persistent message history for Vercel AI SDK
- Attachment Handling: Download images and PDFs for processing
- Long-term Memory: Semantic search across user conversations
- Error Handling: Automatic error responses to users on handler failures
Architecture
┌─────────────────────────────────────────────────────────────────────────────┐
│ Your Agent Application │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ MarchAgentApp │
│ ├── registerMe() → Agent │
│ │ ├── onMessage() handler │
│ │ └── streamer() → Streamer │
│ └── run() → Start consume loop │
│ │
│ Extensions: │
│ ├── HTTPCheckpointSaver (LangGraph) │
│ └── VercelAIMessageStore (Vercel AI SDK) │
│ │
└──────────────────────────────────┬──────────────────────────────────────────┘
│ gRPC (Kafka) + HTTP (APIs)
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ Agent Gateway │
├─────────────────────────────────────────────────────────────────────────────┤
│ gRPC: AgentStream (bidirectional) HTTP: /s/{service}/* proxy │
│ - Auth - /s/ai-inventory/* │
│ - Subscribe/Unsubscribe - /s/conversation-store/* │
│ - Produce/Consume messages - /s/ai-memory/* │
│ - /s/attachment/* │
└─────────────────┬───────────────────────────────────┬───────────────────────┘
│ │
▼ ▼
┌──────────────┐ ┌────────────────────┐
│ Kafka │ │ AI Inventory / │
│ {agent}.inbox │ │ Conversation Store │
└──────────────┘              └────────────────────┘
Message Flow
sequenceDiagram
participant User
participant Provider
participant Router
participant Gateway as Agent Gateway
participant Agent as Your Agent
participant ConvStore as Conversation Store
User->>Provider: Send message
Provider->>Router: Kafka: router.inbox
Router->>Gateway: Kafka: {agent}.inbox
Gateway->>Agent: gRPC stream: message
Agent->>ConvStore: Get conversation history
ConvStore-->>Agent: Message history
loop Streaming Response
Agent->>Gateway: gRPC stream: content chunk
Gateway->>Router: Kafka: router.inbox
Router->>Provider: Kafka: user.inbox
Provider->>User: WebSocket: chunk
end
Agent->>Gateway: gRPC stream: done=true
Installation
# Using pnpm
pnpm add march-ai-sdk
# Using npm
npm install march-ai-sdk
# Using yarn
yarn add march-ai-sdk
Or scaffold a new project with the CLI:
npx march-start my-agent
Peer Dependencies
The SDK requires the following peer dependencies:
pnpm add @grpc/grpc-js @grpc/proto-loader zod
Optional Extensions
# For LangGraph support
pnpm add @langchain/langgraph @langchain/langgraph-checkpoint @langchain/core
# For Vercel AI SDK support
pnpm add ai @ai-sdk/openai
Quick Start
import { config } from 'dotenv'
import { MarchAgentApp, Message } from 'march-ai-sdk'
// Load environment variables
config({ path: '.env', override: true })
// Initialize the app
const app = new MarchAgentApp({
gatewayUrl: process.env.GATEWAY_URL || 'agent-gateway:8080',
apiKey: process.env.GATEWAY_API_KEY || 'your-api-key',
})
async function main() {
// Register an agent
const agent = await app.registerMe({
name: 'my-assistant',
about: 'A helpful AI assistant',
document: 'Answers general questions and provides helpful information.',
representationName: 'My Assistant',
})
// Handle messages
agent.onMessage(async (message: Message, sender: string) => {
// Access conversation history
const history = await message.conversation?.getHistory({ limit: 10 })
// Stream response
const streamer = agent.streamer(message)
streamer.stream('Hello! ')
streamer.stream('How can I help you today?')
await streamer.finish()
})
// Run the agent
await app.run()
}
main().catch(console.error)
Configuration
Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| GATEWAY_URL | Agent Gateway endpoint | Required |
| GATEWAY_API_KEY | API key for authentication | Required |
| HEARTBEAT_INTERVAL | Heartbeat frequency (seconds) | 60 |
| CONNECTION_SECURE | Use TLS for connections | false |
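Putting the variables above together, a minimal `.env` for local development might look like this (the host, port, and key are placeholders; substitute your own deployment values):

```env
GATEWAY_URL=agent-gateway:8080
GATEWAY_API_KEY=your-api-key
HEARTBEAT_INTERVAL=60
CONNECTION_SECURE=false
```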
MarchAgentApp Options
interface AppOptions {
gatewayUrl: string // Gateway endpoint (host:port)
apiKey: string // Authentication key
heartbeatInterval?: number // Heartbeat frequency in seconds (default: 60)
maxConcurrentTasks?: number // Max concurrent message handlers (default: 100)
errorMessageTemplate?: string // Error message sent to users
secure?: boolean // Use TLS for gRPC and HTTPS for HTTP (default: false)
}
const app = new MarchAgentApp({
gatewayUrl: 'agent-gateway:8080',
apiKey: 'your-api-key',
heartbeatInterval: 60,
secure: false,
errorMessageTemplate: 'Sorry, something went wrong. Please try again.',
})
API Reference
MarchAgentApp
Main application class for initializing and running agents.
import { MarchAgentApp } from 'march-ai-sdk'
const app = new MarchAgentApp({
gatewayUrl: 'agent-gateway:8080',
apiKey: 'your-api-key',
})
// Register an agent
const agent = await app.registerMe({
name: 'agent-name', // Unique identifier (used for routing)
about: 'Short description', // Brief description for agent selection
document: 'Full docs...', // Detailed documentation
representationName: 'Name', // Display name (optional)
baseUrl: 'http://...', // Base URL for artifacts (optional)
metadata: { key: 'value' }, // Additional metadata (optional)
relatedPages: [ // Related pages (optional)
{ name: 'Dashboard', endpoint: '/dashboard' }
],
})
// Start consuming messages (blocks until shutdown)
await app.run()
Agent
Handles message registration and response streaming.
// Register message handler (all senders)
agent.onMessage(async (message, sender) => {
// Handle message
})
// Filter by sender
agent.onMessage(async (message, sender) => {
// Only handles messages from 'user'
}, { senders: ['user'] })
// Exclude sender
agent.onMessage(async (message, sender) => {
// Handles all except 'other-agent'
}, { senders: ['~other-agent'] })
Message
Represents an incoming message with conversation context.
agent.onMessage(async (message: Message, sender: string) => {
// Core properties
message.id // Message ID
message.conversationId // Conversation ID
message.userId // User who sent the message
message.content // Message content
message.createdAt // Timestamp
// Metadata
message.metadata // Custom metadata from frontend
message.schema // JSON schema for form responses
// Attachments
message.hasAttachment() // Check if attachment exists
message.attachment // AttachmentInfo object
await message.getAttachmentBytes() // Download as Buffer
await message.getAttachmentBase64() // Download as base64 string
// Conversation history
await message.conversation?.getHistory({ limit: 10 })
await message.conversation?.getAgentHistory({ limit: 10 })
// Long-term memory
await message.memory?.queryAboutUser({ query: '...', limit: 5 })
await message.memory?.getUserSummary()
})
Message Properties
| Property | Type | Description |
|----------|------|-------------|
| id | string | Message ID |
| conversationId | string | Conversation ID |
| userId | string | User who sent the message |
| content | string | Message content |
| createdAt | string | ISO timestamp |
| metadata | Record<string, unknown> \| undefined | Custom metadata |
| schema | Record<string, unknown> \| undefined | JSON schema for forms |
| attachment | AttachmentInfo \| undefined | File attachment metadata |
| conversation | Conversation \| undefined | Conversation helper |
| memory | Memory \| undefined | Long-term memory helper |
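The sender filters accepted by `onMessage` earlier in this reference follow a simple rule: a plain name allows only that sender, and a leading `~` excludes that sender. A standalone predicate capturing those documented semantics might look like this (`senderAllowed` is an illustrative name, not the SDK's internal implementation):

```typescript
// Sketch of the documented onMessage sender-filter semantics.
// - no filters: every sender is handled
// - entries starting with "~": that sender is excluded
// - plain entries: only those senders are handled
function senderAllowed(sender: string, filters?: string[]): boolean {
  if (!filters || filters.length === 0) return true
  const excluded = filters.filter(f => f.startsWith('~')).map(f => f.slice(1))
  const included = filters.filter(f => !f.startsWith('~'))
  if (excluded.includes(sender)) return false
  if (included.length > 0) return included.includes(sender)
  return true
}
```

For example, `senderAllowed('user', ['~other-agent'])` is true, while `senderAllowed('other-agent', ['~other-agent'])` is false.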
Streamer
Streams response chunks back to the user (or another agent).
agent.onMessage(async (message, sender) => {
// Basic streaming
const streamer = agent.streamer(message)
streamer.stream('Chunk 1...')
streamer.stream('Chunk 2...')
await streamer.finish()
// Streaming with options
const streamer2 = agent.streamer(message, {
sendTo: 'user', // or another agent name
awaiting: true, // Set awaiting_route to this agent
})
// Non-persisted content (not saved to DB)
streamer2.stream('Thinking...', { persist: false, eventType: 'thinking' })
// Persisted content
streamer2.stream('Here is my response...')
// Set response schema for forms
streamer2.setResponseSchema({
type: 'object',
properties: {
name: { type: 'string' }
}
})
// Set message metadata
streamer2.setMessageMetadata({
model: 'gpt-4o',
confidence: 0.95,
})
await streamer2.finish()
})
Streamer Methods
| Method | Description |
|--------|-------------|
| stream(content, options?) | Stream a content chunk |
| write(content, persist?) | Alias for stream() |
| finish(awaiting?) | Finish streaming with done=true signal |
| setResponseSchema(schema) | Set JSON Schema for form rendering |
| setMessageMetadata(metadata) | Set message metadata |
| streamBy(structural) | Bind a structural streamer for artifacts, surfaces, etc. |
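A common pattern is forwarding an LLM's token stream into `stream()` chunk by chunk, then signalling completion with `finish()`. The sketch below models that flow against a minimal `StreamerLike` interface mirroring the two methods above (the interface name and `pipeTokens` helper are illustrative; the real Streamer delivers chunks over gRPC rather than collecting them):

```typescript
// Minimal shape mirroring the stream()/finish() methods documented above.
interface StreamerLike {
  stream(content: string): void
  finish(): Promise<void>
}

// Forward each token from an async source as one chunk, then finish.
async function pipeTokens(
  tokens: AsyncIterable<string>,
  streamer: StreamerLike
): Promise<void> {
  for await (const token of tokens) {
    streamer.stream(token) // each token becomes one content chunk
  }
  await streamer.finish() // emits the done=true signal
}
```

This keeps handler code independent of where the tokens come from, whether a model stream or a fixed reply.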
Structural Streaming
The SDK provides structural streaming components for rich content like artifacts, surfaces, text blocks, and steppers.
Artifact
Stream file artifacts (images, documents, iframes) with progress indication:
import { Artifact } from 'march-ai-sdk'
agent.onMessage(async (message, sender) => {
const streamer = agent.streamer(message)
// Create an artifact
const artifact = new Artifact()
// Signal generation started (optional)
streamer.streamBy(artifact).generating('Creating chart...', 0.5)
// ... do work ...
// Signal completion with artifact details
streamer.streamBy(artifact).done({
url: 'https://example.com/chart.png',
type: 'image',
title: 'Sales Chart',
description: 'Q4 2024 sales data',
metadata: { size: 1024, mimeType: 'image/png' }
})
streamer.stream('Here is your chart!')
await streamer.finish()
})
Surface
Stream embedded interactive components (iframes, embeds):
import { Surface } from 'march-ai-sdk'
const surface = new Surface()
streamer.streamBy(surface).generating('Loading calendar...')
streamer.streamBy(surface).done({
url: 'https://cal.com/embed/user',
type: 'iframe', // default
title: 'Schedule a Meeting'
})
TextBlock
Stream collapsible text content with titles and variants:
import { TextBlock } from 'march-ai-sdk'
const block = new TextBlock()
streamer.streamBy(block).setVariant('thinking')
streamer.streamBy(block).streamTitle('Analyzing...')
streamer.streamBy(block).streamBody('Step 1: Checking data\n')
streamer.streamBy(block).streamBody('Step 2: Processing\n')
streamer.streamBy(block).updateTitle('Analysis Complete')
streamer.streamBy(block).done()
Variants: thinking, note, warning, error, success
Stepper
Stream multi-step progress indicators:
import { Stepper } from 'march-ai-sdk'
const stepper = new Stepper({ steps: ['Fetch Data', 'Process', 'Generate Report'] })
streamer.streamBy(stepper).startStep(0)
// ... do work ...
streamer.streamBy(stepper).completeStep(0)
streamer.streamBy(stepper).startStep(1)
streamer.streamBy(stepper).addStep('Verify') // Add dynamic step
streamer.streamBy(stepper).completeStep(1)
streamer.streamBy(stepper).startStep(2)
streamer.streamBy(stepper).completeStep(2)
streamer.streamBy(stepper).done()
Structural Streaming Components
| Component | Purpose | Key Methods |
|-----------|---------|-------------|
| Artifact | Files, images, documents | generating(), done(), error() |
| Surface | Embedded iframes/widgets | generating(), done(), error() |
| TextBlock | Collapsible text content | streamTitle(), streamBody(), setVariant(), done() |
| Stepper | Multi-step progress | startStep(), completeStep(), failStep(), addStep(), done() |
Extensions
LangGraph Integration
Use HTTPCheckpointSaver to persist LangGraph state via the Conversation Store checkpoint API.
import { MarchAgentApp, Message } from 'march-ai-sdk'
import { HTTPCheckpointSaver } from 'march-ai-sdk/extensions/langgraph'
import { StateGraph, START, END, Annotation } from '@langchain/langgraph'
import { ChatOpenAI } from '@langchain/openai'
import { HumanMessage, AIMessage, BaseMessage } from '@langchain/core/messages'
const app = new MarchAgentApp({
gatewayUrl: process.env.GATEWAY_URL!,
apiKey: process.env.GATEWAY_API_KEY!,
})
// Create checkpointer
const checkpointer = new HTTPCheckpointSaver(app)
// Define state
const StateAnnotation = Annotation.Root({
messages: Annotation<BaseMessage[]>({
reducer: (x, y) => x.concat(y),
}),
})
// Create graph
const llm = new ChatOpenAI({ model: 'gpt-4o-mini', streaming: true })
async function respond(state: typeof StateAnnotation.State) {
const response = await llm.invoke(state.messages)
return { messages: [response] }
}
const graph = new StateGraph(StateAnnotation)
.addNode('respond', respond)
.addEdge(START, 'respond')
.addEdge('respond', END)
const compiled = graph.compile({ checkpointer })
// Register agent
const agent = await app.registerMe({
name: 'langgraph-agent',
about: 'Agent using LangGraph for state management',
document: 'Uses LangGraph to maintain conversation state.',
})
agent.onMessage(async (message: Message, sender: string) => {
const config = { configurable: { thread_id: message.conversationId } }
// Add user message and invoke graph
const result = await compiled.invoke(
{ messages: [new HumanMessage(message.content)] },
config
)
// Stream response
const streamer = agent.streamer(message)
const lastMessage = result.messages[result.messages.length - 1]
streamer.stream(lastMessage.content as string)
await streamer.finish()
})
await app.run()
Vercel AI SDK Integration
Use VercelAIMessageStore to persist message history for Vercel AI SDK.
import { MarchAgentApp, Message } from 'march-ai-sdk'
import { VercelAIMessageStore } from 'march-ai-sdk/extensions/vercel-ai'
import { streamText, CoreMessage } from 'ai'
import { openai } from '@ai-sdk/openai'
const app = new MarchAgentApp({
gatewayUrl: process.env.GATEWAY_URL!,
apiKey: process.env.GATEWAY_API_KEY!,
})
// Create message store
const store = new VercelAIMessageStore(app)
const agent = await app.registerMe({
name: 'vercel-ai-agent',
about: 'Agent using Vercel AI SDK',
document: 'Uses Vercel AI SDK for LLM interactions.',
})
agent.onMessage(async (message: Message, sender: string) => {
// Load previous messages
const history = await store.load(message.conversationId)
// Build messages array
const messages: CoreMessage[] = [
{ role: 'system', content: 'You are a helpful assistant.' },
...history,
{ role: 'user', content: message.content }
]
const streamer = agent.streamer(message)
// Stream response
const result = await streamText({
model: openai('gpt-4o'),
messages,
onChunk: ({ chunk }) => {
if (chunk.type === 'text-delta') {
streamer.stream(chunk.textDelta)
}
}
})
await streamer.finish()
// Save updated history
await store.save(message.conversationId, [
...history,
{ role: 'user', content: message.content },
{ role: 'assistant', content: result.text }
])
})
await app.run()
VercelAIMessageStore Methods
| Method | Description |
|--------|-------------|
| load(conversationId) | Load message history |
| save(conversationId, messages) | Save message history |
| clear(conversationId) | Clear message history |
| append(conversationId, messages) | Append to existing history |
| getLastMessages(conversationId, count) | Get last N messages |
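For unit-testing handlers without a running Conversation Store, the method contract above can be satisfied by an in-memory stand-in. The class below is a sketch, not part of the SDK: it is backed by a `Map` instead of HTTP persistence, and `StoredMessage` is a simplified stand-in for the Vercel AI SDK message type:

```typescript
// In-memory sketch of the VercelAIMessageStore contract for local testing.
type StoredMessage = { role: string; content: string }

class InMemoryMessageStore {
  private db = new Map<string, StoredMessage[]>()

  async load(conversationId: string): Promise<StoredMessage[]> {
    return this.db.get(conversationId) ?? []
  }

  async save(conversationId: string, messages: StoredMessage[]): Promise<void> {
    this.db.set(conversationId, messages)
  }

  async clear(conversationId: string): Promise<void> {
    this.db.delete(conversationId)
  }

  async append(conversationId: string, messages: StoredMessage[]): Promise<void> {
    this.db.set(conversationId, [...(await this.load(conversationId)), ...messages])
  }

  async getLastMessages(conversationId: string, count: number): Promise<StoredMessage[]> {
    if (count <= 0) return []
    return (await this.load(conversationId)).slice(-count)
  }
}
```

Swapping such a stand-in for the real store in tests keeps handler logic decoupled from the platform services.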
Advanced Usage
Inter-Agent Communication
Agents can send messages to other agents using the sendTo option.
// Handle user messages - forward to specialist
agent.onMessage(async (message, sender) => {
const streamer = agent.streamer(message, { sendTo: 'specialist-agent' })
streamer.stream('Forwarding your question to the specialist...')
await streamer.finish()
}, { senders: ['user'] })
// Handle specialist responses - send back to user
agent.onMessage(async (message, sender) => {
const streamer = agent.streamer(message, { sendTo: 'user' })
streamer.stream(`The specialist says: ${message.content}`)
await streamer.finish()
}, { senders: ['specialist-agent'] })
Dynamic Forms with Response Schema
Request structured input from users using JSON Schema forms.
agent.onMessage(async (message, sender) => {
// Check if this is a form response
if (message.schema) {
const data = JSON.parse(message.content)
const streamer = agent.streamer(message)
streamer.stream(`Thank you, ${data.name}! I received your message.`)
await streamer.finish()
return
}
// Request form input
const streamer = agent.streamer(message, { awaiting: true })
streamer.setResponseSchema({
type: 'object',
title: 'Contact Information',
properties: {
name: { type: 'string', title: 'Full Name' },
email: { type: 'string', format: 'email', title: 'Email' },
message: { type: 'string', title: 'Message' }
},
required: ['name', 'email']
})
streamer.stream('Please fill out your contact information:')
await streamer.finish()
})
Error Handling
Customize error handling behavior:
const app = new MarchAgentApp({
gatewayUrl: '...',
apiKey: '...',
errorMessageTemplate: 'Sorry, something went wrong. Please try again.',
})
const agent = await app.registerMe({ ... })
// Disable automatic error responses
agent.sendErrorResponses = false
agent.onMessage(async (message, sender) => {
try {
// Your logic
} catch (error) {
// Custom error handling
const streamer = agent.streamer(message)
streamer.stream(`I encountered an issue: ${error}`)
await streamer.finish()
}
})
Project Structure
ai-ts-framework/
├── src/
│ ├── index.ts # Package exports
│ ├── app.ts # MarchAgentApp
│ ├── agent.ts # Agent class
│ ├── message.ts # Message class
│ ├── streamer.ts # Streamer class
│ ├── conversation.ts # Conversation helper
│ ├── memory.ts # Memory helper
│ ├── conversation-client.ts # HTTP client for Conversation Store
│ ├── checkpoint-client.ts # HTTP client for checkpoints
│ ├── agent-state-client.ts # HTTP client for agent state
│ ├── memory-client.ts # HTTP client for AI Memory
│ ├── attachment-client.ts # HTTP client for attachments
│ ├── gateway-client.ts # gRPC client for Agent Gateway
│ ├── heartbeat.ts # Heartbeat manager
│ ├── api-paths.ts # API endpoint configuration
│ ├── types.ts # TypeScript types
│ ├── exceptions.ts # Custom exceptions
│ ├── structural/ # Structural streaming components
│ │ ├── index.ts # Structural exports
│ │ ├── base.ts # StructuralStreamer base class
│ │ ├── artifact.ts # Artifact component
│ │ ├── surface.ts # Surface component
│ │ ├── text-block.ts # TextBlock component
│ │ └── stepper.ts # Stepper component
│ ├── proto/
│ │ └── gateway.proto # gRPC protocol definition
│ └── extensions/
│ ├── index.ts
│ ├── langgraph.ts # LangGraph HTTPCheckpointSaver
│ └── vercel-ai.ts # Vercel AI MessageStore
├── tests/
│ ├── agent.test.ts
│ ├── app.test.ts
│ ├── checkpoint-client.test.ts
│ ├── message.test.ts
│ ├── streamer.test.ts
│ └── structural-streaming.test.ts
├── tsup.config.ts # Build configuration
├── vitest.config.ts # Test configuration
├── tsconfig.json
├── package.json
└── README.md
Development
Running Tests
# Install dependencies
pnpm install
# Run tests
pnpm test
# Run tests in watch mode
pnpm test:watch
# Run tests with coverage
pnpm test:coverage
Building
# Build the package
pnpm build
# Type check without building
pnpm typecheck
Dependencies
| Package | Purpose |
|---------|---------|
| @grpc/grpc-js | gRPC communication with Agent Gateway |
| @grpc/proto-loader | Protobuf loading |
| zod | Schema validation |
Optional Dependencies
| Package | Install Command | Purpose |
|---------|-----------------|---------|
| @langchain/langgraph | pnpm add @langchain/langgraph | LangGraph state management |
| @langchain/langgraph-checkpoint | pnpm add @langchain/langgraph-checkpoint | Checkpoint interface |
| ai | pnpm add ai | Vercel AI SDK |
| @ai-sdk/openai | pnpm add @ai-sdk/openai | OpenAI provider for Vercel AI |
TypeScript Types
The SDK exports all types for TypeScript users:
import type {
// Core types
Message,
Streamer,
Agent,
MarchAgentApp,
// Structural streaming
StructuralStreamer,
Artifact,
Surface,
TextBlock,
Stepper,
// Options
AppOptions,
RegisterOptions,
StreamOptions,
StreamerOptions,
// Data types
KafkaMessage,
AgentRegistrationData,
AttachmentInfo,
ConversationData,
// Memory types
MemoryMessage,
MemorySearchResult,
UserSummary,
// Handler types
MessageHandler,
SenderFilterOptions,
} from 'march-ai-sdk'
Exceptions
The SDK provides typed exceptions:
import {
MarchAgentError, // Base error class
RegistrationError, // Agent registration failed
KafkaError, // Kafka communication error
ConfigurationError, // Invalid configuration
APIException, // HTTP API error
HeartbeatError, // Heartbeat failed
GatewayError, // Gateway communication error
} from 'march-ai-sdk'
License
MIT
