backboard-sdk
v1.4.12
Backboard JavaScript SDK
A developer-friendly JavaScript SDK for the Backboard API. Build conversational AI applications with persistent memory and intelligent document processing.
New to Backboard? We include $5 in free credits to get you started and support 1,800+ LLMs across major providers.
New in v1.4.11
- List threads for an assistant:
client.listThreadsForAssistant(assistantId) (maps to GET /assistants/{assistant_id}/threads)
Installation
npm install backboard-sdk
Or with yarn:
yarn add backboard-sdk
TypeScript
This package ships first-class TypeScript types: install and import, and typings come bundled with the published package. No separate @types install is required.
Quick Start
import { BackboardClient } from 'backboard-sdk';
// Initialize the client
const client = new BackboardClient({
apiKey: 'your_api_key_here'
});
// Create an assistant
const assistant = await client.createAssistant({
name: 'Support Bot',
system_prompt: 'You are a helpful customer support assistant'
});
// Create a conversation thread
const thread = await client.createThread(assistant.assistantId);
// Send a message
const response = await client.addMessage(thread.threadId, {
content: 'Hello! Can you help me with my account?',
llm_provider: 'openai',
model_name: 'gpt-4o'
});
console.log(response.content);
TypeScript usage
Types are bundled; you can rely on autocompletion and type checking:
import { BackboardClient, MessageRole } from 'backboard-sdk';
const client = new BackboardClient({ apiKey: process.env.BACKBOARD_API_KEY! });
const thread = await client.createThread('assistant-id');
const resp = await client.addMessage(thread.threadId, { content: 'Ping?', stream: false });
if (resp.role === MessageRole.ASSISTANT) {
console.log(resp.content);
}
Features
Memory (NEW in v1.4.0)
- Persistent Memory: Store and retrieve information across conversations
- Automatic Context: Enable memory to automatically search and use relevant context
- Manual Management: Full control with add, update, delete, and list operations
- Memory Modes: Auto (search + write), Readonly (search only), or off
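In practice the mode travels with each message via the memory field of addMessage (see the Memory section below for the full API). A minimal sketch of a helper that validates the mode string before sending; withMemory is an illustrative helper built from the mode values above, not part of the SDK:

```javascript
// Per-message memory modes (values documented in the Memory section):
//   'Auto'     – search stored memories and write new ones
//   'Readonly' – search only, never write
//   'off'      – ignore memory entirely (the default)
function withMemory(message, mode = 'off') {
  const allowed = ['Auto', 'Readonly', 'off'];
  if (!allowed.includes(mode)) {
    throw new Error(`Unknown memory mode: ${mode} (expected one of ${allowed.join(', ')})`);
  }
  return { ...message, memory: mode };
}
```

Note the exact casing: the documented values are 'Auto', 'Readonly', and 'off'.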
Assistants
- Create, list, get, update, and delete assistants
- Configure custom tools and capabilities
- Upload documents for assistant-level context
Threads
- Create conversation threads under assistants
- Maintain persistent conversation history
- Support for message attachments
Documents
- Upload documents to assistants or threads
- Automatic processing and indexing for RAG
- Support for PDF, Office files, text, and more
- Real-time processing status tracking
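The processing-status bullet above can be wired into a simple polling loop. The sketch below only assumes that a status getter (e.g. wrapping getDocumentStatus from the API reference further down) returns a document object; the terminal condition is supplied by the caller, since the exact status values aren't specified here:

```javascript
// Poll a document until a caller-supplied predicate says processing is done.
// `getStatus` is any async function returning a document object, e.g.
// (id) => client.getDocumentStatus(id).
async function waitForDocument(getStatus, documentId, isDone, { intervalMs = 2000, maxAttempts = 30 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const doc = await getStatus(documentId);
    if (isDone(doc)) return doc;
    // Wait before the next poll to avoid hammering the API.
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Document ${documentId} still processing after ${maxAttempts} attempts`);
}
```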
Messages
- Send messages with optional file attachments
- Streaming and non-streaming responses
- Tool calling support
- Custom LLM provider and model selection
API Reference
Client Initialization
import { BackboardClient } from 'backboard-sdk';
const client = new BackboardClient({
apiKey: 'your_api_key'
});
Assistants
// Create assistant
const assistant = await client.createAssistant({
name: 'My Assistant',
system_prompt: 'System prompt that guides your assistant',
tools: [toolDefinition], // Optional
// Embedding configuration (optional - defaults to OpenAI text-embedding-3-large with 3072 dims)
embedding_provider: 'cohere', // Optional: openai, google, cohere, etc.
embedding_model_name: 'embed-english-v3.0', // Optional
embedding_dims: 1024 // Optional
});
// List assistants
const assistants = await client.listAssistants({ skip: 0, limit: 100 });
// Get assistant
const assistant = await client.getAssistant(assistantId);
// Update assistant
const assistant = await client.updateAssistant(assistantId, {
name: 'New Name',
system_prompt: 'Updated system prompt'
});
// Delete assistant
const result = await client.deleteAssistant(assistantId);
Threads
// Create thread
const thread = await client.createThread(assistantId);
// List threads for a specific assistant
const assistantThreads = await client.listThreadsForAssistant(assistantId, { skip: 0, limit: 100 });
// List threads
const threads = await client.listThreads({ skip: 0, limit: 100 });
// Get thread with messages
const thread = await client.getThread(threadId);
// Delete thread
const result = await client.deleteThread(threadId);
Messages
// Send message
const response = await client.addMessage(threadId, {
content: 'Your message here',
files: ['path/to/file.pdf'], // Optional attachments
llm_provider: 'openai', // Optional
model_name: 'gpt-4o', // Optional
stream: false, // Set to true for streaming
memory: 'Auto', // Optional: "Auto", "Readonly", or "off" (default)
role: 'user' // Optional: "user" (default) or "assistant"
});
// Streaming messages
const stream = await client.addMessage(threadId, {
content: 'Hello',
stream: true
});
for await (const chunk of stream) {
if (chunk.type === 'content_streaming') {
process.stdout.write(chunk.content || '');
}
}
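The chunk loop above generalizes to a small helper that accumulates the streamed text into one string. collectStream is an illustrative helper, not part of the SDK; it assumes only the { type, content } chunk shape shown above:

```javascript
// Collect streamed content chunks into a single string.
// Works with any async iterable yielding { type, content } objects,
// such as the stream returned by addMessage({ ..., stream: true }).
async function collectStream(stream, onChunk = () => {}) {
  let text = '';
  for await (const chunk of stream) {
    if (chunk.type === 'content_streaming' && chunk.content) {
      onChunk(chunk.content); // e.g. process.stdout.write for live output
      text += chunk.content;
    }
  }
  return text;
}
```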
// Import conversation history (add assistant messages without LLM invocation)
await client.addMessage(threadId, {
content: 'This was the assistant\'s previous response',
role: 'assistant' // No LLM invocation when role is "assistant"
});
Memory
// Add a memory
await client.addMemory(assistantId, {
content: 'User prefers JavaScript programming',
metadata: { category: 'preference' }
});
// Get all memories
const memories = await client.getMemories(assistantId);
for (const memory of memories.memories) {
console.log(`${memory.id}: ${memory.content}`);
}
// Get specific memory
const memory = await client.getMemory(assistantId, memoryId);
// Update memory
await client.updateMemory(assistantId, memoryId, {
content: 'Updated content'
});
// Delete memory
await client.deleteMemory(assistantId, memoryId);
// Get memory stats
const stats = await client.getMemoryStats(assistantId);
console.log(`Total memories: ${stats.totalMemories}`);
// Use memory in conversation
const response = await client.addMessage(threadId, {
content: 'What do you know about me?',
memory: 'Auto' // Enable memory search and automatic updates
});
Tool Integration (Simplified in v1.3.3)
Tool Definitions
// Use plain JSON objects (no verbose SDK classes needed!)
const tools = [
{
"type": "function",
"function": {
"name": "get_weather",
"description": "Get current weather",
"parameters": {
"type": "object",
"properties": {
"location": {"type": "string", "description": "City name"}
},
"required": ["location"]
}
}
}
];
const assistant = await client.createAssistant({
name: 'Weather Assistant',
system_prompt: 'You are a helpful weather assistant',
tools: tools
});
Tool Call Handling
// Enhanced object-oriented access with automatic JSON parsing
const response = await client.addMessage(threadId, {
content: "What's the weather in San Francisco?"
});
if (response.status === 'REQUIRES_ACTION' && response.toolCalls) {
const toolOutputs = [];
// Process each tool call
for (const tc of response.toolCalls) {
if (tc.function.name === 'get_weather') {
// Get parsed arguments (required parameters are guaranteed by API)
const args = tc.function.parsedArguments;
const location = args.location;
// Execute your function and format the output
const weatherData = {
temperature: '68°F',
condition: 'Sunny',
location: location
};
toolOutputs.push({
tool_call_id: tc.id,
output: JSON.stringify(weatherData)
});
}
}
// Submit the tool outputs back to continue the conversation
const finalResponse = await client.submitToolOutputs(
threadId,
response.runId,
toolOutputs
);
console.log(finalResponse.content);
}
Documents
// Upload document to assistant
const document = await client.uploadDocumentToAssistant(
assistantId,
'path/to/document.pdf'
);
// Upload document to thread
const document = await client.uploadDocumentToThread(
threadId,
'path/to/document.pdf'
);
// List assistant documents
const documents = await client.listAssistantDocuments(assistantId);
// List thread documents
const documents = await client.listThreadDocuments(threadId);
// Get document status
const document = await client.getDocumentStatus(documentId);
// Delete document
const result = await client.deleteDocument(documentId);
Error Handling
The SDK includes comprehensive error handling:
import {
BackboardAPIError,
BackboardValidationError,
BackboardNotFoundError,
BackboardRateLimitError,
BackboardServerError
} from 'backboard-sdk';
try {
const assistant = await client.getAssistant('invalid_id');
} catch (error) {
if (error instanceof BackboardNotFoundError) {
console.log('Assistant not found');
} else if (error instanceof BackboardValidationError) {
console.log(`Validation error: ${error.message}`);
} else if (error instanceof BackboardAPIError) {
console.log(`API error: ${error.message}`);
}
}
Advanced Tool Example
Here's a more comprehensive tool definition example:
const weatherTool = {
type: 'function',
function: {
name: 'get_weather',
description: 'Get current weather for a location',
parameters: {
type: 'object',
properties: {
location: {
type: 'string',
description: 'The city and state, e.g. San Francisco, CA'
},
unit: {
type: 'string',
enum: ['celsius', 'fahrenheit'],
description: 'Temperature unit'
}
},
required: ['location']
}
}
};
const assistant = await client.createAssistant({
name: 'Weather Assistant',
system_prompt: 'You are an AI assistant that can check weather information',
tools: [weatherTool]
});
// Handle tool calls with the new simplified approach
const response = await client.addMessage(threadId, {
content: "What's the weather in San Francisco?"
});
if (response.toolCalls) {
for (const tc of response.toolCalls) {
if (tc.function.name === 'get_weather') {
const { location, unit } = tc.function.parsedArguments;
const weather = await getWeatherData(location, unit);
await client.submitToolOutputs(threadId, response.runId, [{
tool_call_id: tc.id,
output: JSON.stringify(weather)
}]);
}
}
}
Supported File Types
The SDK supports uploading the following file types:
- PDF files (.pdf)
- Microsoft Office files (.docx, .xlsx, .pptx, .doc, .xls, .ppt)
- Text files (.txt, .csv, .md, .markdown)
- Code files (.py, .js, .html, .css, .xml)
- JSON files (.json, .jsonl)
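A pre-flight extension check against this list lets an app fail fast before attempting an upload. isSupportedFile is an illustrative helper built from the extensions above, not part of the SDK:

```javascript
// File extensions listed as supported in this README.
const SUPPORTED_EXTENSIONS = new Set([
  '.pdf',
  '.docx', '.xlsx', '.pptx', '.doc', '.xls', '.ppt',
  '.txt', '.csv', '.md', '.markdown',
  '.py', '.js', '.html', '.css', '.xml',
  '.json', '.jsonl',
]);

// Case-insensitive check on the file's extension.
function isSupportedFile(filePath) {
  const dot = filePath.lastIndexOf('.');
  if (dot === -1) return false;
  return SUPPORTED_EXTENSIONS.has(filePath.slice(dot).toLowerCase());
}
```

For example, guard a call like uploadDocumentToAssistant with this check and surface a clear error locally instead of a failed API request.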
Requirements
- Node.js 16.0.0 or higher
- ES modules support
- TypeScript users:
npm run build emits dist/ with .d.ts files
Local Development
# install deps
npm install
# build TypeScript -> dist/ (runs on publish via npm prepare)
npm run build
# lint/tests (if present)
npm run lint
npm test
License
MIT License - see LICENSE file for details.
Support
- Documentation: https://app.backboard.io/docs
- Email: [email protected]
