llmjs2
v1.1.1
Abstract layer for LLM completion supporting multiple providers
A lightweight LLM library for Node.js, for building simple, personal AI applications
Supported Providers
- Ollama - Connect to Ollama's cloud API
- OpenRouter - Access multiple LLM models through OpenRouter
Installation
```bash
npm install llmjs2
```

Usage
llmjs2 supports three calling conventions:
Simple API (Auto-Detection)
```js
import { completion } from 'llmjs2';

// Just provide a prompt - the library handles the rest
const result = await completion('Explain the use of llmjs2');

// Or provide a model and prompt
const result2 = await completion('ollama/minimax-m2.5:cloud', 'Explain the use of llmjs2');
```

How it works:
- Looks for the `OLLAMA_API_KEY` and `OPEN_ROUTER_API_KEY` environment variables
- If only one is set, uses that provider
- If both are set, randomly chooses one
- Uses `OLLAMA_DEFAULT_MODEL`, or defaults to `minimax-m2.5:cloud`, for Ollama
- Uses `OPEN_ROUTER_DEFAULT_MODEL`, or defaults to `openrouter/free`, for OpenRouter
- If a model is provided, uses that model instead of the default
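The selection rules above can be sketched as a small helper. Note that `pickProvider` and `defaultModel` are hypothetical names used for illustration here; they are not part of llmjs2's public API:

```js
// Hypothetical sketch of llmjs2's provider auto-detection rules,
// written as pure functions over an env-style object.
function pickProvider(env) {
  const hasOllama = Boolean(env.OLLAMA_API_KEY);
  const hasOpenRouter = Boolean(env.OPEN_ROUTER_API_KEY);
  if (hasOllama && hasOpenRouter) {
    // Both keys set: choose one at random
    return Math.random() < 0.5 ? 'ollama' : 'openrouter';
  }
  if (hasOllama) return 'ollama';
  if (hasOpenRouter) return 'openrouter';
  throw new Error('No API key found: set OLLAMA_API_KEY or OPEN_ROUTER_API_KEY');
}

function defaultModel(provider, env) {
  // Per-provider default model, overridable via environment variables
  return provider === 'ollama'
    ? (env.OLLAMA_DEFAULT_MODEL || 'minimax-m2.5:cloud')
    : (env.OPEN_ROUTER_DEFAULT_MODEL || 'openrouter/free');
}
```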
Function-Based API
```js
import { completion } from 'llmjs2';

// Using Ollama
const resultOllama = await completion('ollama/minimax-m2.5:cloud', 'Explain the use of llmjs2', 'your-api-key');

// Using OpenRouter
const resultOR = await completion('openrouter/openrouter/free', 'Explain the use of llmjs2', 'your-api-key');
```

Object-Based API
```js
import { completion } from 'llmjs2';

// Using Ollama with system message
const resultOllama = await completion({
  model: 'ollama/minimax-m2.5:cloud',
  messages: [
    { role: 'system', content: 'You are a helpful AI assistant.' },
    { role: 'user', content: 'Explain the use of llmjs2.' }
  ],
  apiKey: 'your-api-key' // optional
});

// Using OpenRouter with system message
const resultOR = await completion({
  model: 'openrouter/openrouter/free',
  messages: [
    { role: 'system', content: 'You are a helpful AI assistant.' },
    { role: 'user', content: 'Explain the use of llmjs2.' }
  ],
  apiKey: 'your-api-key' // optional
});
```

Tools Support
llmjs2 supports function calling (tools) through the object-based API:
```js
import { completion } from 'llmjs2';

const result = await completion({
  model: 'openrouter/openrouter/free',
  messages: [
    { role: 'user', content: 'What is the weather like in Paris?' }
  ],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get the current weather in a given location',
        parameters: {
          type: 'object',
          properties: {
            location: {
              type: 'string',
              description: 'The city and state, e.g. San Francisco, CA'
            },
            unit: {
              type: 'string',
              enum: ['celsius', 'fahrenheit'],
              description: 'The temperature unit to use'
            }
          },
          required: ['location']
        }
      }
    }
  ]
});

// Result when tools are used:
// {
//   content: '',
//   tool_calls: [
//     {
//       id: 'call_123',
//       type: 'function',
//       function: {
//         name: 'get_weather',
//         arguments: '{"location": "Paris, France"}'
//       }
//     }
//   ]
// }
```

API Key Configuration
You can provide API keys in four ways:
1. Simple API (Environment Variables)
```bash
export OLLAMA_API_KEY=your-ollama-api-key
export OPEN_ROUTER_API_KEY=your-openrouter-api-key

# Optional: Set default models
export OLLAMA_DEFAULT_MODEL=minimax-m2.5:cloud
export OPEN_ROUTER_DEFAULT_MODEL=openrouter/free
```

```js
const result = await completion('Your prompt');
```

2. Direct Parameter (Function API)
```js
const result = await completion('ollama/minimax-m2.5:cloud', 'Your prompt', 'your-api-key');
```

3. Object Property (Object API)
```js
const result = await completion({
  model: 'ollama/minimax-m2.5:cloud',
  messages: [{ role: 'user', content: 'Your prompt' }],
  apiKey: 'your-api-key'
});
```

4. Environment Variables (Function/Object API)
```bash
export OLLAMA_API_KEY=your-ollama-api-key
export OPEN_ROUTER_API_KEY=your-openrouter-api-key
```

```js
// Function API
const result = await completion('ollama/minimax-m2.5:cloud', 'Your prompt');

// Object API
const result2 = await completion({
  model: 'ollama/minimax-m2.5:cloud',
  messages: [{ role: 'user', content: 'Your prompt' }]
});
```

Model Format
Models must be specified in the format: `provider/model_name`
The provider is the text before the first `/`, and the model name is everything after it.
Examples:
- `ollama/minimax-m2.5:cloud`
- `ollama/llama2`
- `openrouter/openrouter/free`
- `openrouter/meta-llama/llama-2-70b-chat`
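The split rule can be sketched as a tiny parser. The `parseModel` helper below is hypothetical, written here only to illustrate the first-slash rule; it is not an export of llmjs2:

```js
// Split 'provider/model_name' at the FIRST slash only, so model names
// may themselves contain slashes (e.g. 'openrouter/meta-llama/...').
function parseModel(id) {
  const i = id.indexOf('/');
  if (i === -1) {
    throw new Error(`Invalid model format: ${id} (expected provider/model_name)`);
  }
  return { provider: id.slice(0, i), model: id.slice(i + 1) };
}
```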
Messages Format (Object API)
The messages parameter is an array of message objects with the following structure:
```js
[
  { role: 'system', content: 'You are a helpful AI assistant.' },
  { role: 'user', content: 'What is the capital of France?' },
  { role: 'assistant', content: 'The capital of France is Paris.' },
  { role: 'user', content: 'What is its population?' }
]
```

Supported roles:
- `system` - System instructions
- `user` - User messages
- `assistant` - Assistant responses
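Before sending a conversation, it can be useful to check it against this structure. The `validateMessages` helper below is a hypothetical sketch of such a check, not part of llmjs2:

```js
// Roles accepted in the messages array, per the list above
const SUPPORTED_ROLES = new Set(['system', 'user', 'assistant']);

// Throw a descriptive error if the messages array is malformed;
// return it unchanged otherwise.
function validateMessages(messages) {
  if (!Array.isArray(messages) || messages.length === 0) {
    throw new Error('messages must be a non-empty array');
  }
  for (const m of messages) {
    if (!SUPPORTED_ROLES.has(m.role)) {
      throw new Error(`Unsupported role: ${m.role}`);
    }
    if (typeof m.content !== 'string') {
      throw new Error('content must be a string');
    }
  }
  return messages;
}
```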
Tools Format (Object API)
The tools parameter is an array of tool definitions:
```js
[
  {
    type: 'function',
    function: {
      name: 'function_name',
      description: 'Description of what the function does',
      parameters: {
        type: 'object',
        properties: {
          param1: {
            type: 'string',
            description: 'Description of parameter'
          }
        },
        required: ['param1']
      }
    }
  }
]
```

Error Handling
The library throws descriptive errors for:
- Missing or invalid parameters
- Missing API keys
- API request failures
- Invalid response formats
- Request timeouts (60 seconds)
- Invalid tools format
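A 60-second request timeout like the one listed above is commonly implemented by racing the request against a timer. The sketch below shows that general pattern; it is an assumption for illustration, not llmjs2's actual internals:

```js
// Race a promise against a timer; reject with a descriptive
// error if the promise does not settle within `ms` milliseconds.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Request timed out after ${ms}ms`)), ms);
  });
  // Clear the timer either way so the process can exit promptly
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```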
```js
try {
  const result = await completion('Your prompt');
} catch (error) {
  console.error('Completion failed:', error.message);
}
```

Example Programs
Main Example
A real usage test program is included in `example.js`. To run it:

```bash
# Set your API keys
export OLLAMA_API_KEY=your-ollama-api-key
export OPEN_ROUTER_API_KEY=your-openrouter-api-key

# Run the example
node example.js
```

The example program will:
- Test simple API (auto-detection)
- Test simple API with model
- Test Ollama with function-based API
- Test Ollama with object-based API
- Test Ollama with tools
- Test OpenRouter with function-based API
- Test OpenRouter with object-based API
- Test OpenRouter with tools
- Display results and test summary
API Reference
completion(prompt)
Simple API (Prompt Only)
Parameters:
- `prompt` (string): The prompt to send to the LLM
Returns:
- `Promise<string>`: The completion result
Behavior:
- Auto-detects provider based on available API keys
- Uses `OLLAMA_DEFAULT_MODEL`, or defaults to `minimax-m2.5:cloud`, for Ollama
- Uses `OPEN_ROUTER_DEFAULT_MODEL`, or defaults to `openrouter/free`, for OpenRouter
- Randomly chooses a provider if both API keys are set
completion(model, prompt)
Simple API (Model and Prompt)
Parameters:
- `model` (string): Model identifier in the format `provider/model_name`
- `prompt` (string): The prompt to send to the LLM
Returns:
- `Promise<string>`: The completion result
Behavior:
- Auto-detects provider based on available API keys
- Uses the provided model instead of the default
- Randomly chooses provider if both API keys are set
completion(model, prompt, apiKey)
Function-Based API
Parameters:
- `model` (string): Model identifier in the format `provider/model_name`
- `prompt` (string): The prompt to send to the LLM
- `apiKey` (string, optional): API key (falls back to environment variables)
Returns:
- `Promise<string>`: The completion result
completion(options)
Object-Based API
Parameters:
- `options` (object): Configuration object
  - `model` (string): Model identifier in the format `provider/model_name`
  - `messages` (array): Array of message objects with `role` and `content`
  - `apiKey` (string, optional): API key (falls back to environment variables)
  - `tools` (array, optional): Array of tool definitions
Returns:
- `Promise<string|object>`: The completion result (a string, or an object containing tool calls)
Throws:
- Error if model format is invalid
- Error if prompt/messages is missing
- Error if API key is not provided
- Error if API request fails
- Error if request times out (60 seconds)
- Error if tools format is invalid
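Because the result can be either a plain string (the completion text) or an object carrying `tool_calls`, callers can branch on its shape. The `handleResult` dispatcher below is a hypothetical sketch of that branching, using the result shape documented in the Tools Support section:

```js
// Return string results as-is; for tool-call results, look up a local
// handler by function name and invoke it with the parsed JSON arguments.
function handleResult(result, handlers) {
  if (typeof result === 'string') return result;
  return result.tool_calls.map((call) => {
    const fn = handlers[call.function.name];
    if (!fn) throw new Error(`No handler for tool: ${call.function.name}`);
    return fn(JSON.parse(call.function.arguments));
  });
}
```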
License
MIT
