COE AI - NPM Package
A TypeScript/JavaScript client for the COE AI LLM inference API.
Note: This API is only accessible from UPES's internal network (UPESNET).
Installation

```bash
npm install coe-ai
```

Quick Start
```typescript
import { LLMinfer } from 'coe-ai';

// Initialize the client
const llm = new LLMinfer({
  apiKey: 'your-api-key', // Get from https://coeai.ddn.upes.ac.in
  host: 'http://10.9.6.165:8000' // optional, this is the default
});

// Generate text
const response = await llm.generate({
  model: 'tinyllama:latest',
  prompt: 'Explain machine learning in simple terms'
});

console.log(response.choices?.[0]?.message?.content);
```

API Reference
LLMinfer
The main client class.
Constructor
```typescript
new LLMinfer({
  apiKey: string, // Required: Your API key
  host?: string   // Optional: Server URL (default: http://10.9.6.165:8000)
})
```

Methods
generate(options: GenerateOptions): Promise<GenerateResponse>
Generate text from a prompt or messages.
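Since a prompt and messages are both accepted, a chat-style call can be sketched as well. Note that the `messages` field here is an assumption inferred from the `chat()` signature; this README only demonstrates `prompt` directly:

```typescript
// Chat-style options for generate(). The `messages` field is an
// assumption inferred from chat(); only `prompt` is documented here.
const options = {
  model: 'tinyllama:latest',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Summarize what an LLM is.' }
  ]
};
// const response = await llm.generate(options);
```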
```typescript
const response = await llm.generate({
  model: 'tinyllama:latest',
  prompt: 'Hello!',
  maxTokens: 512,      // optional
  temperature: 0.7,    // optional
  topP: 1.0,           // optional
  contextWindow: 2048, // optional
  stream: false        // optional
});
```

chat(messages: Message[], model: string, options?): Promise<GenerateResponse>
Convenience method for chat-style interactions.
```typescript
const response = await llm.chat(
  [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'What is AI?' }
  ],
  'tinyllama:latest'
);
```

list_models(): Promise<string[]>
Get the list of available models.
```typescript
const models = await llm.list_models();
console.log('Available models:', models);
```

Error Handling
```typescript
import { LLMinfer, AuthenticationError, ModelNotFoundError, InferenceError } from 'coe-ai';

try {
  const response = await llm.generate({ model: 'invalid', prompt: 'test' });
} catch (error) {
  if (error instanceof AuthenticationError) {
    console.error('Invalid API key');
  } else if (error instanceof ModelNotFoundError) {
    console.error('Model not found');
  } else if (error instanceof InferenceError) {
    console.error('Inference failed:', error.message);
  }
}
```

Available Models
Use llm.list_models() to get the current list of available models.
Common models include:
- tinyllama:latest - Fast, lightweight
- llama4:16x17b - Vision-capable
- llama3.2-vision:11b - Vision-capable
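Because the available models can change, it can help to select a model from the `list_models()` result at runtime rather than hard-coding one. A minimal sketch, where `pickModel` is a hypothetical helper (not part of the package):

```typescript
// Hypothetical helper (not part of coe-ai): choose a preferred model
// from the list returned by list_models(), falling back to a default.
function pickModel(
  available: string[],
  preferred: string,
  fallback: string = 'tinyllama:latest'
): string {
  return available.includes(preferred) ? preferred : fallback;
}

// Usage (requires UPESNET access):
// const llm = new LLMinfer({ apiKey: 'your-api-key' });
// const model = pickModel(await llm.list_models(), 'llama3.2-vision:11b');
// const response = await llm.generate({ model, prompt: 'Hello!' });
```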
Requirements
- Node.js >= 16.0.0
- Access to UPES internal network (UPESNET)
License
MIT
