@yarlisai/router
v0.1.0
Multi-model AI router with pluggable strategies for the Yarlis AI ecosystem. Route requests across OpenAI, Anthropic, Google, and Ollama with cost, speed, quality, or local-first strategies.
Install
npm install @yarlisai/router @yarlisai/contracts
Quick Start
import { YarlisRouter, createRouter } from "@yarlisai/router";
import { createYPID } from "@yarlisai/contracts";

const router = createRouter({
  providers: [
    {
      provider: "anthropic",
      apiKey: process.env.ANTHROPIC_API_KEY,
      models: [
        { id: "claude-sonnet-4-6", provider: "anthropic", capabilities: ["chat", "code", "reasoning"], tier: "powerful", contextWindow: 200000, costPer1kInput: 0.003, costPer1kOutput: 0.015, supportsTools: true, supportsVision: true },
      ],
      enabled: true,
    },
    {
      provider: "ollama",
      baseUrl: "http://localhost:11434/v1",
      models: [
        { id: "llama3", provider: "ollama", capabilities: ["chat", "code"], tier: "local", contextWindow: 8192, costPer1kInput: 0, costPer1kOutput: 0 },
      ],
      enabled: true,
    },
  ],
  defaultStrategy: "local-first",
  fallback: "claude-sonnet-4-6",
});

const response = await router.route({
  ypid: createYPID("rtm", "task"),
  messages: [{ role: "user", content: "Hello!" }],
});
Strategies
- cost — Pick the cheapest model that meets the requested capability
- speed — Pick the fastest tier (fast or local)
- quality — Pick the most powerful model
- local-first — Prefer Ollama/local models, fall back to API providers
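To make the cost strategy concrete, here is a minimal sketch of how a cost-based pass could rank candidates using the per-token prices from the Quick Start config. This is an illustration only, not the library's internals: the `ModelInfo` interface and `pickCheapest` helper are hypothetical names that mirror the model fields shown above.

```typescript
// Hypothetical model shape, mirroring the fields from the Quick Start config.
// Not the actual types exported by @yarlisai/router.
interface ModelInfo {
  id: string;
  provider: string;
  capabilities: string[];
  tier: string;
  contextWindow: number;
  costPer1kInput: number;
  costPer1kOutput: number;
}

// Pick the cheapest model that advertises the requested capability,
// scoring each model by its blended per-1k-token price (input + output).
function pickCheapest(models: ModelInfo[], capability: string): ModelInfo | undefined {
  return models
    .filter((m) => m.capabilities.includes(capability))
    .sort(
      (a, b) =>
        a.costPer1kInput + a.costPer1kOutput - (b.costPer1kInput + b.costPer1kOutput)
    )[0];
}

const models: ModelInfo[] = [
  { id: "claude-sonnet-4-6", provider: "anthropic", capabilities: ["chat", "code", "reasoning"], tier: "powerful", contextWindow: 200000, costPer1kInput: 0.003, costPer1kOutput: 0.015 },
  { id: "llama3", provider: "ollama", capabilities: ["chat", "code"], tier: "local", contextWindow: 8192, costPer1kInput: 0, costPer1kOutput: 0 },
];

// The free local model wins on cost for "code"; only the Anthropic
// model advertises "reasoning", so it is chosen there regardless of price.
console.log(pickCheapest(models, "code")?.id);      // llama3
console.log(pickCheapest(models, "reasoning")?.id); // claude-sonnet-4-6
```

Note how the zero-cost Ollama entry always wins under this scoring when it has the needed capability, which is also why a local-first strategy and a cost strategy often agree.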
License
MIT
