@majkapp/majk-chat-core
A powerful, provider-agnostic Node.js library for interacting with multiple LLM providers through a unified interface. Built with TypeScript, featuring a fluent API, plugin architecture, and comprehensive tool support.
Features
- 🌐 Multi-Provider Support: OpenAI, Anthropic, Azure OpenAI, AWS Bedrock
- 🔧 Tool Orchestration: Automatic tool execution with configurable permissions
- 📦 Content Blocks: Support for text, images, and documents
- 🔌 Plugin Architecture: Extensible design for custom providers and tools
- 🎯 Type-Safe: Full TypeScript support with comprehensive type definitions
- ⚡ Fluent API: Beautiful, chainable configuration
- 🔐 Flexible Authentication: Multiple credential sources (env vars, files, config)
- 🎭 Provider Abstraction: Write once, run with any provider
Installation
npm install @majkapp/majk-chat-core
Quick Start
Simple Setup
import { quickSetup } from '@majkapp/majk-chat-core';
// Quick setup with OpenAI (uses OPENAI_API_KEY env var)
const chat = quickSetup.openai();
// Send a message
const session = chat.createSession();
const response = await session.send('Hello, how are you?', {
model: 'gpt-4'
});
console.log(response.choices[0].message.content);
Advanced Configuration
import { MajkChatBuilder } from '@majkapp/majk-chat-core';
const chat = new MajkChatBuilder()
// Add providers
.withOpenAI({ apiKey: 'your-api-key' })
.withAnthropic({ apiKey: 'your-anthropic-key' })
.withAzureOpenAI({
apiKey: 'your-azure-key',
endpoint: 'https://your-resource.openai.azure.com',
deploymentName: 'your-deployment'
})
.withBedrock({
region: 'us-east-1',
// Bearer token authentication (preferred)
bearerToken: 'bedrock-api-key-...',
// OR traditional AWS credentials:
// accessKeyId: 'your-access-key',
// secretAccessKey: 'your-secret-key'
})
// Set default provider
.setDefaultProvider('anthropic')
// Add tools
.withTool(new CalculatorTool()) // any ToolExecutor implementation (see Tool Support below)
.withAutoToolExecution(5) // Max 5 tool execution steps
// Enable features
.withTranscript() // Record conversation transcript
.withEnvironmentCredentials() // Load from env vars
.build();
Provider Support
OpenAI
const chat = new MajkChatBuilder()
.withOpenAI({
apiKey: process.env.OPENAI_API_KEY,
organizationId: process.env.OPENAI_ORG_ID
})
.build();
const session = chat.createSession('openai');
const response = await session.send('Generate a haiku', {
model: 'gpt-4',
temperature: 0.7,
max_tokens: 100
});
Anthropic
const chat = new MajkChatBuilder()
.withAnthropic({
apiKey: process.env.ANTHROPIC_API_KEY
})
.build();
const session = chat.createSession('anthropic');
const response = await session.send('Explain quantum computing', {
model: 'claude-3-sonnet-20240229',
max_tokens: 1024
});
Azure OpenAI
const chat = new MajkChatBuilder()
.withAzureOpenAI({
apiKey: process.env.AZURE_OPENAI_API_KEY,
endpoint: process.env.AZURE_OPENAI_ENDPOINT,
deploymentName: 'gpt-4'
})
.build();
AWS Bedrock
Bearer Token Authentication (Recommended)
const chat = new MajkChatBuilder()
.withBedrock({
region: 'us-east-1',
bearerToken: 'bedrock-api-key-...' // Your Bedrock API key
})
.build();
const session = chat.createSession('anthropic-bedrock');
const response = await session.send('Hello!', {
model: 'anthropic.claude-3-sonnet-20240229-v1:0'
});
Traditional AWS Credentials
const chat = new MajkChatBuilder()
.withBedrock({
region: 'us-east-1',
accessKeyId: 'AKIA...',
secretAccessKey: 'your-secret-key',
sessionToken: 'optional-session-token' // For temporary credentials
})
.build();
Environment Variables
// Set environment variables:
// AWS_BEARER_TOKEN_BEDROCK (for API keys)
// OR AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION (for traditional credentials)
const chat = new MajkChatBuilder()
.withBedrock() // Automatically uses environment variables
.withEnvironmentCredentials()
.build();
Content Blocks
Support for multi-modal content including text, images, and documents:
const response = await session.send('', {
messages: [
{
role: 'user',
content: [
{ type: 'text', text: 'What do you see in this image?' },
{
type: 'image',
source: {
type: 'url',
media_type: 'image/jpeg',
data: 'https://example.com/image.jpg'
}
}
]
}
]
});
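The example above uses a URL image source. Documents are listed as supported too, but their exact block shape is not shown here; by analogy with the image block, a base64-encoded PDF might be attached like this (the 'document' type, the 'base64' source type, and the media_type value are assumptions, not confirmed API):
// Hypothetical document block, mirroring the image block above.
const docResponse = await session.send('', {
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Summarize this document.' },
        {
          type: 'document', // assumed block type
          source: {
            type: 'base64',
            media_type: 'application/pdf',
            data: base64Pdf // e.g. fs.readFileSync('report.pdf').toString('base64')
          }
        }
      ]
    }
  ]
});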
Tool Support
Creating Custom Tools
import { ToolExecutor, ToolDefinition, ToolResult, RequestContext } from '@majkapp/majk-chat-core';
class WeatherTool implements ToolExecutor {
readonly name = 'get_weather';
readonly definition: ToolDefinition = {
type: 'function',
function: {
name: 'get_weather',
description: 'Get the current weather for a location',
parameters: {
type: 'object',
properties: {
location: {
type: 'string',
description: 'City and state, e.g. San Francisco, CA'
}
},
required: ['location']
}
},
serverInvocable: true,
sideEffects: 'read'
};
async execute(args: any, context: RequestContext): Promise<ToolResult> {
// Implement weather fetching logic
return {
success: true,
output: {
temperature: 72,
conditions: 'Sunny',
location: args.location
}
};
}
validateArgs(args: any): boolean {
return !!args && typeof args.location === 'string';
}
}
// Use the tool
const chat = new MajkChatBuilder()
.withOpenAI()
.withTool(new WeatherTool())
.withAutoToolExecution(3)
.build();
Tool Orchestration
Enable automatic tool execution:
const chat = new MajkChatBuilder()
.withOpenAI()
.withTools(calculatorTool, weatherTool, searchTool)
.withAutoToolExecution(5) // Max 5 steps
.withTranscript() // Record tool execution transcript
.build();
const session = chat.createSession();
const response = await session.send(
'What is the weather in NYC and calculate 15% tip on $85?',
{ max_steps: 3 }
);
// Access the transcript
console.log(response.transcript);
Session Management
const session = chat.createSession();
// Send messages
const response1 = await session.send('Hello!');
const response2 = await session.send('Tell me a joke');
// Get conversation history
const history = session.getHistory();
// Clear history
session.clearHistory();
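Each session is bound to a single provider, so one chat instance can hold independent conversations side by side. A sketch, assuming both OpenAI and Anthropic were registered on the builder as in Advanced Configuration:
const openaiSession = chat.createSession('openai');
const anthropicSession = chat.createSession('anthropic');

// Run the same prompt through both providers and compare the answers.
const [fromOpenAI, fromAnthropic] = await Promise.all([
  openaiSession.send('Summarize Hamlet in one sentence.', { model: 'gpt-4' }),
  anthropicSession.send('Summarize Hamlet in one sentence.', {
    model: 'claude-3-sonnet-20240229',
    max_tokens: 256
  })
]);

console.log(fromOpenAI.choices[0].message.content);
console.log(fromAnthropic.choices[0].message.content);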
Credential Management
The core library supports multiple credential sources for maximum flexibility.
Direct Configuration
// Direct API key configuration
const chat = new MajkChatBuilder()
.withOpenAI({
apiKey: 'sk-your-key',
organizationId: 'org-your-org'
})
.withAnthropic({
apiKey: 'sk-ant-your-key'
})
.withAzureOpenAI({
apiKey: 'your-azure-key',
endpoint: 'https://your-resource.openai.azure.com',
deploymentName: 'gpt-4'
})
.withBedrock({
// Bearer token (preferred)
bearerToken: 'bedrock-api-key-...',
region: 'us-east-1'
// OR traditional credentials
// accessKeyId: 'AKIA...',
// secretAccessKey: 'your-secret-key'
})
.build();
Environment Variables
// Automatically loads from standard env vars:
// - OPENAI_API_KEY, OPENAI_ORG_ID
// - ANTHROPIC_API_KEY
// - AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_DEPLOYMENT
// - AWS_BEARER_TOKEN_BEDROCK (for Bedrock API keys)
// - AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION (for traditional AWS credentials)
const chat = new MajkChatBuilder()
.withOpenAI()
.withBedrock()
.withEnvironmentCredentials()
.build();
CLI Integration with Profiles
When using the CLI, the core library automatically receives credentials from the CLI's AWS-style profile system:
# CLI handles profile resolution and passes credentials to core library
majk-chat chat --profile work -M "Hello"  # CLI resolves work profile credentials
The core library receives the resolved credentials transparently, so no code changes are needed to benefit from the profile system.
Custom Credential Sources
import { CredentialSource, Provider, ProviderConfig } from '@majkapp/majk-chat-core';
class CustomCredentialSource implements CredentialSource {
async loadCredentials(provider: Provider): Promise<ProviderConfig | null> {
// Load from your custom source
return {
apiKey: await getKeyFromVault(provider) // your own secret-lookup helper
};
}
}
const chat = new MajkChatBuilder()
.withOpenAI()
.withCredentialSource(new CustomCredentialSource())
.build();
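File-based credentials are also available through withFileCredentials(path?) (listed in the API reference below); the path and file format in this sketch are assumptions, not documented defaults:
// Hypothetical: load provider keys from a local file instead of env vars.
const chat = new MajkChatBuilder()
  .withOpenAI()
  .withFileCredentials('./majk-credentials.json') // path is illustrative
  .build();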
Custom Providers
import { BaseProvider, Provider, ChatRequest, ChatResponse, ModelsResponse, ProviderFeature, RequestContext } from '@majkapp/majk-chat-core';
class CustomProvider extends BaseProvider {
readonly id = 'custom' as Provider;
readonly name = 'Custom Provider';
async chatCompletion(request: ChatRequest, context: RequestContext): Promise<ChatResponse> {
// Implement your provider logic here
throw new Error('Not implemented');
}
async listModels(context: RequestContext): Promise<ModelsResponse> {
// Return available models
throw new Error('Not implemented');
}
estimateTokens(text: string, model: string): number {
return Math.ceil(text.length / 4);
}
supportsFeature(feature: ProviderFeature): boolean {
// Return feature support
return false;
}
}
const chat = new MajkChatBuilder()
.withProvider(new CustomProvider())
.build();
Error Handling
try {
const response = await session.send('Hello');
} catch (error) {
if (error.type === 'rate_limit_error') {
// Handle rate limiting
if (error.retryable) {
// Retry after delay
}
} else if (error.type === 'authentication_error') {
// Handle auth errors
}
}
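Because errors expose a retryable flag, a small retry wrapper can be layered on top of send. This helper is a sketch, not part of the library; it relies only on error.retryable from the example above:
type Sendable = { send(message: string, options?: unknown): Promise<any> };

// Retry retryable errors with exponential backoff (1s, 2s, 4s, ...).
async function sendWithRetry(session: Sendable, message: string, attempts = 3) {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await session.send(message);
    } catch (error: any) {
      if (!error.retryable || attempt === attempts) throw error;
      await new Promise(resolve => setTimeout(resolve, 1000 * 2 ** (attempt - 1)));
    }
  }
}

const reply = await sendWithRetry(session, 'Hello');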
Testing
The library includes comprehensive test utilities:
import { MajkChatBuilder } from '@majkapp/majk-chat-core';
describe('My Chat Integration', () => {
it('should send messages', async () => {
const chat = new MajkChatBuilder()
.withOpenAI({ apiKey: 'test-key' })
.build();
const session = chat.createSession();
// Mock the provider response here (one approach: register a stub provider, sketched below)
const response = await session.send('Test message');
expect(response.choices[0].message.content).toBeDefined();
});
});
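There are no mocking helpers shown above; one way to keep tests offline is to register a stub provider through withProvider (see the API reference below). This is a sketch: the minimal response shape is assumed from the choices[0].message.content accesses used throughout this README, and the method signatures are simplified:
import { MajkChatBuilder, BaseProvider } from '@majkapp/majk-chat-core';

// Stub provider that returns a canned response instead of calling a real API.
class StubProvider extends BaseProvider {
  readonly id = 'stub' as any;
  readonly name = 'Stub Provider';

  async chatCompletion(): Promise<any> {
    return { choices: [{ message: { role: 'assistant', content: 'canned reply' } }] };
  }
  async listModels(): Promise<any> {
    return { models: [] }; // shape assumed
  }
  estimateTokens(text: string): number {
    return Math.ceil(text.length / 4);
  }
  supportsFeature(): boolean {
    return true;
  }
}

it('answers from the stub provider', async () => {
  const chat = new MajkChatBuilder()
    .withProvider(new StubProvider())
    .setDefaultProvider('stub' as any)
    .build();

  const response = await chat.createSession().send('Test message');
  expect(response.choices[0].message.content).toBe('canned reply');
});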
API Reference
MajkChatBuilder
- withOpenAI(config?): Add OpenAI provider
- withAnthropic(config?): Add Anthropic provider
- withAzureOpenAI(config?): Add Azure OpenAI provider
- withBedrock(config?): Add AWS Bedrock provider
- withProvider(provider): Add custom provider
- setDefaultProvider(provider): Set default provider
- withTool(tool): Add a tool
- withTools(...tools): Add multiple tools
- withAutoToolExecution(maxSteps): Enable auto tool execution
- withTranscript(): Enable transcript recording
- withPermissionHandler(handler): Set permission handler
- withCredentialSource(source): Add credential source
- withEnvironmentCredentials(): Use environment variables
- withFileCredentials(path?): Use file-based credentials
- build(): Build the chat instance
MajkChat
- getProvider(id?): Get a provider by ID
- createSession(provider?): Create a chat session
- listProviders(): List available providers
- getDefaultProvider(): Get default provider
ChatSession
- send(message, options?): Send a message
- getHistory(): Get message history
- clearHistory(): Clear message history
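A rough sketch of how several of these methods compose; the return values shown in comments are illustrative, since their exact shapes are not documented above:
const chat = new MajkChatBuilder()
  .withOpenAI()
  .withAnthropic()
  .withEnvironmentCredentials()
  .setDefaultProvider('openai')
  .build();

console.log(chat.listProviders());      // registered providers, e.g. OpenAI and Anthropic
console.log(chat.getDefaultProvider()); // the provider set via setDefaultProvider

const session = chat.createSession();   // uses the default provider
await session.send('Ping');
console.log(session.getHistory());      // conversation so far
session.clearHistory();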
License
MIT
Contributing
Contributions are welcome! Please read our contributing guidelines and submit pull requests to our repository.
Support
For issues, questions, or suggestions, please file an issue on our GitHub repository.
