@nadabaydoun/llm-router-core
v0.1.1
Core LLM routing helper that picks a model and returns an answer using OpenRouter (via OpenAI SDK).
Install
```bash
npm install @nadabaydoun/llm-router-core
```

Set your OpenRouter API key (PowerShell shown here):

```powershell
$env:OPENROUTER_API_KEY = "your_key_here"
```

Quick start (TypeScript)
```typescript
import { createLLMRouter } from "@nadabaydoun/llm-router-core";

const router = createLLMRouter({
  apiKey: process.env.OPENROUTER_API_KEY!,
  // Optional: baseURL, defaultModel, etc.
});

async function main() {
  const { model } = await router.route({
    message: "Summarize this in 1 sentence: LLM routing demo.",
    fastMode: false,
  });
  console.log("Selected model:", model);
}

main().catch(console.error);
```

Environment
`OPENROUTER_API_KEY` is required (get one from https://openrouter.ai/).
Notes
- This package only returns the selected model. Use that model in your app's completion call.
- If you re-publish, bump the version in `package.json` (e.g., 0.1.1).
