# @bodhiapp/bodhijs

v0.0.6
A TypeScript/JavaScript library for interacting with local LLM capabilities via the Bodhi Browser Extension. This library provides a clean, typed interface for extension detection, API communication, and streaming responses.
## Features
- 🔍 Extension Detection: Automatic detection and initialization of the Bodhi browser extension
- 🌊 Streaming Support: Real-time streaming responses for chat completions and other long-running operations
- 🛡️ Type Safety: Comprehensive TypeScript support with detailed type definitions
- 🔄 OpenAI Compatible: Chat completion API compatible with OpenAI's format for easy migration
- ⚡ Error Handling: Robust error handling with specific error types for different failure scenarios
- 🧪 Well Tested: Comprehensive integration tests with real browser automation
## Installation
Install the library via npm:
```bash
npm install @bodhiapp/bodhijs
```

## Quick Start
### Basic Usage
```typescript
import { loadBodhiExtClient, ExtensionNotFoundError } from '@bodhiapp/bodhijs';

async function main() {
  try {
    // Initialize the client (detects extension and gets ID with default 10s timeout)
    const client = await loadBodhiExtClient();

    // Test connectivity
    const ping = await client.ping();
    console.log('Server status:', ping.message);

    // Make a simple API request
    const response = await client.sendApiRequest('POST', '/v1/chat/completions', {
      model: 'llama3',
      messages: [{ role: 'user', content: 'What is the capital of France?' }],
    });
    console.log('Response:', response.body);
  } catch (error) {
    if (error instanceof ExtensionNotFoundError) {
      console.log('Please install the Bodhi browser extension');
    } else {
      console.error('Error:', error.message);
    }
  }
}

main();
```

### Streaming Responses
For real-time streaming responses (great for chat applications):
```typescript
import { loadBodhiExtClient } from '@bodhiapp/bodhijs';

async function streamingChat() {
  const client = await loadBodhiExtClient();

  // Create a streaming request
  const stream = await client.sendStreamRequest('POST', '/v1/chat/completions', {
    model: 'llama3',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Write a short poem about TypeScript' },
    ],
    stream: true,
    temperature: 0.7,
  });

  // Process chunks as they arrive
  console.log('Assistant: ');
  for await (const chunk of stream) {
    if (chunk.body?.choices?.[0]?.delta?.content) {
      process.stdout.write(chunk.body.choices[0].delta.content);
    }
  }
  console.log('\n');
}

streamingChat();
```

### Error Handling
The library provides specific error types for different scenarios:
```typescript
import { loadBodhiExtClient, ExtensionNotFoundError, ExtensionTimeoutError } from '@bodhiapp/bodhijs';

async function robustInitialization() {
  try {
    // Try to load with a custom timeout (15 seconds)
    const client = await loadBodhiExtClient({ timeout: 15000 });

    // Verify server connectivity
    await client.ping();
    return client;
  } catch (error) {
    if (error instanceof ExtensionNotFoundError) {
      // Extension not installed or not enabled
      showInstallationInstructions();
    } else if (error instanceof ExtensionTimeoutError) {
      // Extension found but not ready
      showRetryOption();
    } else {
      // Other errors (network, server, etc.)
      showGenericError(error);
    }
    throw error;
  }
}

function showInstallationInstructions() {
  console.log(`
The Bodhi browser extension is required but not detected.
Please:
1. Install the extension from the Chrome Web Store
2. Enable the extension in your browser settings
3. Refresh this page
`);
}

function showRetryOption() {
  console.log(`
Extension found but not ready. This can happen if:
1. The extension is still initializing
2. The page loaded before the extension was ready
Please refresh the page or try again.
`);
}

function showGenericError(error: unknown) {
  // Placeholder for app-specific error UI
  console.error('Error:', error);
}
```

## API Reference
### Core Classes
#### BodhiExtClient
The main client class for interacting with the browser extension.
```typescript
class BodhiExtClient {
  constructor(extensionId: string);

  getExtensionId(): string;

  sendApiRequest(
    method: string,
    endpoint: string,
    body?: any,
    headers?: Record<string, string>
  ): Promise<ApiResponse>;

  sendStreamRequest(
    method: string,
    endpoint: string,
    body?: any,
    headers?: Record<string, string>
  ): Promise<AsyncIterable<StreamChunk>>;

  ping(): Promise<{ message: string }>;
}
```

**Methods:**

- `getExtensionId()`: Returns the cached extension ID
- `sendApiRequest()`: Sends a standard HTTP request through the extension
- `sendStreamRequest()`: Sends a streaming HTTP request and returns an `AsyncIterable`
- `ping()`: Tests connectivity with the extension and backend server
### Factory Functions
#### `loadBodhiExtClient(config?: BodhiExtClientConfig): Promise<BodhiExtClient>`
The recommended way to create a BodhiExtClient instance. Handles extension detection and initialization.
**Parameters:**
- `config` (optional): Configuration object with the following options:
  - `timeout` (optional): Maximum time to wait for extension detection, in milliseconds (default: 10000)
**Returns:** A `Promise` resolving to a fully initialized `BodhiExtClient` instance
**Throws:**
- `ExtensionNotFoundError`: If the extension is not detected within the timeout
- `ExtensionTimeoutError`: If the extension is detected but fails to provide its ID within the timeout
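An `ExtensionTimeoutError` is often transient (the extension may simply still be initializing), so a retry wrapper can be useful. The `retry` helper below is a hypothetical sketch, not part of the library:

```typescript
// Hypothetical helper (not part of @bodhiapp/bodhijs): retry an async factory
// a few times before giving up, e.g. when the extension is still initializing.
async function retry<T>(
  factory: () => Promise<T>,
  attempts: number,
  delayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await factory();
    } catch (error) {
      lastError = error;
      // Wait before the next attempt (no delay after the final failure)
      if (i < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, delayMs));
      }
    }
  }
  throw lastError;
}

// Usage sketch: const client = await retry(() => loadBodhiExtClient({ timeout: 5000 }), 3);
```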
### Error Classes
#### ExtensionNotFoundError
Thrown when the browser extension is not detected within the specified timeout.
```typescript
class ExtensionNotFoundError extends Error {
  constructor(public timeout: number);
}
```

#### ExtensionTimeoutError
Thrown when the extension is detected but fails to provide its ID within the timeout.
```typescript
class ExtensionTimeoutError extends Error {
  constructor(public timeout: number);
}
```

### Type Definitions
#### ApiResponse
```typescript
interface ApiResponse {
  body: any;                       // Response body data
  headers: Record<string, string>; // HTTP headers
  status: number;                  // HTTP status code
}
```

#### StreamChunk
```typescript
interface StreamChunk {
  body: any;                        // Chunk body data
  headers?: Record<string, string>; // HTTP headers (first chunk only)
  status?: number;                  // HTTP status code (first chunk only)
}
```

#### ChatRequest (OpenAI Compatible)
```typescript
interface ChatRequest {
  model: string;               // Model identifier
  messages: ChatMessage[];     // Conversation messages
  stream?: boolean;            // Enable streaming
  temperature?: number;        // Randomness (0.0 to 2.0)
  max_tokens?: number;         // Maximum tokens to generate
  top_p?: number;              // Nucleus sampling (0.0 to 1.0)
  frequency_penalty?: number;  // Frequency penalty (-2.0 to 2.0)
  presence_penalty?: number;   // Presence penalty (-2.0 to 2.0)
  stop?: string | string[];    // Stop sequences
}
```

#### ChatMessage
```typescript
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}
```

## Usage Patterns
### Chat Application Pattern
```typescript
import { loadBodhiExtClient, type BodhiExtClient, type ChatMessage } from '@bodhiapp/bodhijs';

class ChatApp {
  private client: BodhiExtClient | null = null;
  private messages: ChatMessage[] = [];

  async initialize() {
    try {
      this.client = await loadBodhiExtClient();
      await this.client.ping(); // Verify connectivity
      console.log('Chat app ready!');
    } catch (error) {
      console.error('Failed to initialize:', error.message);
      throw error;
    }
  }

  async sendMessage(content: string): Promise<string> {
    if (!this.client) throw new Error('Client not initialized');

    // Add user message
    this.messages.push({ role: 'user', content });

    // Get streaming response
    const stream = await this.client.sendStreamRequest('POST', '/v1/chat/completions', {
      model: 'llama3',
      messages: this.messages,
      stream: true,
      temperature: 0.7,
    });

    let assistantResponse = '';
    for await (const chunk of stream) {
      const content = chunk.body?.choices?.[0]?.delta?.content;
      if (content) {
        assistantResponse += content;
        this.onStreamChunk(content); // Update UI in real-time
      }
    }

    // Add assistant response to conversation
    this.messages.push({ role: 'assistant', content: assistantResponse });
    return assistantResponse;
  }

  private onStreamChunk(content: string) {
    // Update your UI here
    console.log(content);
  }
}
```

### Generic API Client Pattern
```typescript
import { loadBodhiExtClient, type BodhiExtClient } from '@bodhiapp/bodhijs';

class BodhiApiClient {
  private client: BodhiExtClient;

  constructor(client: BodhiExtClient) {
    this.client = client;
  }

  static async create(): Promise<BodhiApiClient> {
    const client = await loadBodhiExtClient();
    return new BodhiApiClient(client);
  }

  async get<T = any>(endpoint: string, headers?: Record<string, string>): Promise<T> {
    const response = await this.client.sendApiRequest('GET', endpoint, undefined, headers);
    return response.body;
  }

  async post<T = any>(endpoint: string, data?: any, headers?: Record<string, string>): Promise<T> {
    const response = await this.client.sendApiRequest('POST', endpoint, data, {
      'Content-Type': 'application/json',
      ...headers,
    });
    return response.body;
  }

  async postStream(endpoint: string, data?: any, headers?: Record<string, string>) {
    return this.client.sendStreamRequest('POST', endpoint, data, {
      'Content-Type': 'application/json',
      ...headers,
    });
  }

  async healthCheck(): Promise<boolean> {
    try {
      await this.client.ping();
      return true;
    } catch {
      return false;
    }
  }
}
```
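For GET endpoints that accept query parameters, a small path builder keeps call sites tidy. The `withQuery` helper below is a hypothetical addition, not part of the library:

```typescript
// Hypothetical helper (not part of the library): append URL-encoded query
// parameters to an endpoint path before passing it to get().
function withQuery(endpoint: string, params: Record<string, string | number>): string {
  const query = Object.entries(params)
    .map(([key, value]) => `${encodeURIComponent(key)}=${encodeURIComponent(String(value))}`)
    .join('&');
  return query ? `${endpoint}?${query}` : endpoint;
}

// Usage sketch: const models = await api.get(withQuery('/v1/models', { limit: 10 }));
```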
```typescript
// Usage
const api = await BodhiApiClient.create();
const models = await api.get('/v1/models');
console.log('Available models:', models);
```

## Best Practices
### 1. Always Handle Extension Detection Errors
```typescript
// ✅ Good
try {
  const client = await loadBodhiExtClient();
  // Use client...
} catch (error) {
  if (error instanceof ExtensionNotFoundError) {
    // Show installation instructions
  }
  // Handle error appropriately
}

// ❌ Bad
const client = await loadBodhiExtClient(); // Unhandled errors
```

### 2. Test Connectivity Before Making Requests
```typescript
// ✅ Good
const client = await loadBodhiExtClient();
await client.ping(); // Verify server is reachable
const response = await client.sendApiRequest('POST', '/v1/chat/completions', data);

// ❌ Bad
const client = await loadBodhiExtClient();
const response = await client.sendApiRequest('POST', '/v1/chat/completions', data); // May fail if server is down
```

### 3. Use Appropriate Timeouts
```typescript
// ✅ Good - Custom timeout for slow networks
const client = await loadBodhiExtClient({ timeout: 15000 }); // 15 seconds

// ✅ Good - Quick timeout for fast feedback
const client = await loadBodhiExtClient({ timeout: 5000 }); // 5 seconds

// ❌ Bad - Very short timeout may cause false negatives
const client = await loadBodhiExtClient({ timeout: 1000 }); // 1 second (too short)
```

### 4. Handle Streaming Errors Gracefully
```typescript
// ✅ Good
try {
  const stream = await client.sendStreamRequest('POST', '/v1/chat/completions', data);
  for await (const chunk of stream) {
    // Process chunk
  }
} catch (error) {
  console.error('Streaming failed:', error.message);
  // Fall back to a non-streaming request
  const response = await client.sendApiRequest('POST', '/v1/chat/completions', {
    ...data,
    stream: false,
  });
}
```

## Development
### Building the Library
```bash
# Install dependencies
npm install

# Build the library
npm run build

# Build in watch mode for development
npm run dev
```

### Testing
The library includes comprehensive integration tests that run against real browser instances with the extension loaded:
```bash
# Run all tests
npm test

# Run tests in watch mode
npm run test:watch

# Run linting
npm run lint

# Fix linting issues
npm run lint:fix

# Run validation (lint + test)
npm run validate
```

### Project Structure
```
bodhi-js/
├── src/
│   ├── index.ts    # Main entry point and exports
│   ├── core.ts     # BodhiExtClient class and factory function
│   └── types.ts    # TypeScript type definitions
├── tests/          # Integration tests
├── dist/           # Built library files
└── README.md       # This file
```

## Browser Compatibility
- Chrome/Chromium-based browsers (Chrome, Edge, Brave, etc.)
- Requires the Bodhi browser extension to be installed and enabled
- Modern JavaScript features (ES2020+)
- TypeScript 4.5+ for development
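Because the library depends on browser-only globals, code that may also run outside the browser (server-side rendering, unit tests) can guard initialization so it skips extension detection instead of waiting out the detection timeout. A minimal sketch:

```typescript
// Sketch (not part of the library): detect whether browser globals exist
// before attempting extension detection.
function isBrowserEnvironment(): boolean {
  const g = globalThis as Record<string, unknown>;
  return typeof g.window !== 'undefined' && typeof g.document !== 'undefined';
}

// Usage sketch:
// if (isBrowserEnvironment()) {
//   const client = await loadBodhiExtClient();
// }
```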
## Migration from Legacy API
If you're migrating from an older version of the library that used function-based exports:
### Before (Legacy API)
```typescript
import { isInstalled, ping, chat } from '@bodhiapp/bodhijs';

if (isInstalled()) {
  const response = await chat.completions.create({...});
}
```

### After (New BodhiExtClient API)
```typescript
import { loadBodhiExtClient, ExtensionNotFoundError } from '@bodhiapp/bodhijs';

try {
  const client = await loadBodhiExtClient();
  const response = await client.sendApiRequest('POST', '/v1/chat/completions', {...});
} catch (error) {
  if (error instanceof ExtensionNotFoundError) {
    // Handle extension not found
  }
}
```

## Contributing
We welcome contributions! Please see our contributing guidelines for more information.
## License
MIT License - see LICENSE file for details.
