rola-ai
v1.0.0
# RolaAI TypeScript Client
A TypeScript client library for RolaAI that provides an OpenAI-compatible API interface.
## Installation

```bash
npm install rola-ai
```

## Usage
### Basic Usage

```javascript
const RolaAI = require('rola-ai');

// Initialize with your API key
const client = new RolaAI('your-api-key');

// Or use the default localhost endpoint without an API key
const localClient = new RolaAI(); // Uses http://localhost:20128/v1 by default
```

### Chat Completions
```javascript
// List available models
const models = await client.listModels();
console.log(models);

// Create a chat completion
const response = await client.chat().completions([
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Hello!' }
], {
  model: 'if/qwen3-coder-plus',
  temperature: 0.7
});
console.log(response.choices[0].message.content);
```

### Legacy Completions
```javascript
const response = await client.completions().create(
  'Complete this sentence: Hello',
  {
    model: 'if/qwen3-coder-plus',
    max_tokens: 100
  }
);
console.log(response.choices[0].text);
```

### Embeddings
```javascript
try {
  const response = await client.embeddings().create(
    ['Hello world', 'Embeddings are useful'],
    { model: 'if/qwen3-coder-plus' }
  );
  console.log(response.data);
} catch (error) {
  console.log('Embeddings not supported by this endpoint:', error.message);
}
```

### Configuration
```javascript
// Custom endpoint and API key
const client = new RolaAI('your-api-key', 'https://your-rolaai-endpoint.com/v1');

// With default temperature and max tokens
const configuredClient = new RolaAI('your-api-key', 'https://your-rolaai-endpoint.com/v1', {
  temperature: 0.5,
  maxTokens: 256
});
```

## Features
- TypeScript Support: Includes type definitions for better development experience
- OpenAI Compatible: Same API interface as OpenAI for easy migration
- Multiple Endpoints: Works with 9Router and similar proxy services
- Streaming Support: Supports streaming responses for real-time applications
- Promise-based: Uses async/await for modern JavaScript development
## Project Structure

```
rola-ai/
├── src/               # TypeScript source files
│   └── RolaAI.ts      # Main implementation
├── dist/              # Compiled JavaScript output
│   ├── RolaAI.js      # Compiled JavaScript
│   └── RolaAI.d.ts    # TypeScript definitions
├── tests/             # Test files
│   ├── test.js        # Main tests
│   ├── ts-test.js     # TypeScript-specific tests
│   └── build-test.js  # Build validation tests
├── examples/          # Example usage files
│   └── example.js     # Example implementation
├── docs/              # Documentation
├── package.json       # Package configuration
├── tsconfig.json      # TypeScript configuration
└── README.md          # Project documentation
```

## API Reference
### RolaAI(apiKey?, baseURL?, options?)

- `apiKey` (string, optional): Your API key. Defaults to `process.env.ROLAAI_API_KEY`
- `baseURL` (string, optional): API base URL. Defaults to `http://localhost:20128/v1`
- `options` (object, optional): Additional configuration options
  - `temperature` (number): Default temperature (0.0-2.0)
  - `maxTokens` (number): Default max tokens
### client.listModels()

Returns a Promise that resolves to a list of available models.
### client.chat(temperature?, maxTokens?)

Returns a chat interface with configurable defaults.

- `temperature` (number, optional): Default temperature for chat completions (0.0-2.0)
- `maxTokens` (number, optional): Default max tokens for chat completions
### client.chat().completions(messages, options)

- `messages` (array): Array of message objects with `role` and `content`
- `options` (object, optional): Additional options such as `model`, `temperature`, etc.
### client.completions(model?, temperature?, maxTokens?)

Returns a completions interface with configurable defaults for legacy compatibility.
### client.completions().create(prompt, options)

- `prompt` (string): The prompt to complete
- `options` (object, optional): Additional options
### client.embeddings(model?)

Returns an embeddings interface (not supported by 9Router).
### client.embeddings().create(input, options)

- `input` (string or array): Text or array of texts to embed
- `options` (object, optional): Additional options
### client.streamChatCompletions(messages, options)

Returns an asynchronous generator that yields chat completion chunks as they stream in.
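A minimal sketch of consuming this generator with `for await`. The chunk shape (`choices[0].delta.content`) is assumed to mirror OpenAI's streaming format, and `stubStream` below stands in for a live `client.streamChatCompletions(messages, options)` call so the snippet runs without a server.

```javascript
// stubStream stands in for client.streamChatCompletions(messages, options);
// each yielded chunk is assumed to follow the OpenAI streaming shape.
async function* stubStream() {
  for (const piece of ['Hel', 'lo', '!']) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}

// Accumulate streamed deltas into the full reply, printing tokens as they arrive.
async function collectReply(stream) {
  let reply = '';
  for await (const chunk of stream) {
    const delta = chunk.choices?.[0]?.delta?.content ?? '';
    process.stdout.write(delta);
    reply += delta;
  }
  return reply;
}

collectReply(stubStream()).then((reply) => console.log('\nFull reply:', reply));
```

With a real client, replace `stubStream()` with `client.streamChatCompletions(messages, { model: 'if/qwen3-coder-plus' })`; the consumption pattern is unchanged.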
## Building from Source

```bash
# Install dependencies
npm install

# Build TypeScript
npm run build

# Run tests
npm test

# Run all tests (main, TypeScript, and build tests)
npm run test-all

# Build and watch for changes
npm run dev
```

## License
MIT
