# tldr

A tiny CLI that summarizes text using OpenAI, Anthropic, or OpenRouter.

## Install

Run locally:

```sh
npm install
node ./bin/tldr.js --help
```

Install globally from npm:

```sh
npm install -g @camelai/tldr
```

Install from this repo:

```sh
npm install -g github:qaml-ai/tldr
```

## Usage
### As a JavaScript library

```js
import { summarize, resolveConfig } from '@camelai/tldr';

const config = resolveConfig({ provider: 'openai' });
const summary = await summarize('Long text to summarize...', config);

console.log(config.provider, config.model);
console.log(summary);
```

You can also pass options directly:

```js
import { summarize } from '@camelai/tldr';

const summary = await summarize('Long text to summarize...', {
  provider: 'openrouter',
  model: 'google/gemini-3.1-flash-lite-preview',
  system: 'Summarize in 3 bullet points.'
});
```

The library uses the same environment variables as the CLI for API keys and default model selection.
### As a CLI

Summarize a prompt:

```sh
tldr "Summarize the key points from this meeting note..."
```

Summarize stdin:

```sh
cat notes.txt | tldr
```

Choose a provider explicitly:

```sh
tldr --provider openai "text"
tldr --provider anthropic "text"
tldr --provider openrouter "text"
```

Override the model:

```sh
tldr --provider openai --model gpt-5.4-nano "text"
tldr --provider anthropic --model claude-4-haiku-20250305 "text"
tldr --provider openrouter --model google/gemini-3.1-flash-lite-preview "text"
```

Override the system prompt:

```sh
tldr --system "Summarize in 3 bullet points." "text"
```

## Options
- `-p, --provider <name>`: `openai`, `anthropic`, or `openrouter`
- `-m, --model <model>`: override the model
- `-s, --system <prompt>`: override the summarization system prompt
- `-h, --help`: show help
## Environment variables

API keys:

- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- `OPENROUTER_API_KEY`

Defaults:

- `TLDR_SYSTEM_PROMPT`
- `TLDR_OPENAI_MODEL` (default: `gpt-5.4-nano`)
- `TLDR_ANTHROPIC_MODEL` (default: `claude-4-haiku-20250305`)
- `TLDR_OPENROUTER_MODEL` (default: `google/gemini-3.1-flash-lite-preview`)

Legacy environment variables are also supported:

- `SUMMARIZE_SYSTEM_PROMPT`
- `SUMMARIZE_OPENAI_MODEL`
- `SUMMARIZE_ANTHROPIC_MODEL`
- `SUMMARIZE_OPENROUTER_MODEL`
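The fallback from `TLDR_*` to the legacy `SUMMARIZE_*` variables can be sketched roughly like this. This is an illustrative sketch, not the package's actual source, and the precedence (`TLDR_*` wins over `SUMMARIZE_*`) is an assumption based on the wording above:

```js
// Illustrative sketch: resolve a setting from the TLDR_* variable,
// falling back to the legacy SUMMARIZE_* variable (precedence assumed).
function readEnvSetting(name, env = process.env) {
  return env[`TLDR_${name}`] ?? env[`SUMMARIZE_${name}`] ?? null;
}
```

Passing `env` explicitly makes the lookup easy to test without mutating `process.env`.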
## Default model selection

If `--provider` is omitted, `tldr` picks the first available provider in this order:

1. `OPENAI_API_KEY`
2. `ANTHROPIC_API_KEY`
3. `OPENROUTER_API_KEY`
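The selection order above can be sketched as follows. This is an illustration of the documented priority, not the package's actual implementation:

```js
// Illustrative sketch of the documented provider-selection order
// (not the package's actual source).
function pickProvider(env = process.env) {
  if (env.OPENAI_API_KEY) return 'openai';
  if (env.ANTHROPIC_API_KEY) return 'anthropic';
  if (env.OPENROUTER_API_KEY) return 'openrouter';
  return null; // no key set; how the real CLI handles this is not documented here
}
```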
## Library API

### `summarize(input, options)`

Returns a summary string.

Options:

- `provider`: `openai`, `anthropic`, or `openrouter`
- `model`: optional model override
- `system`: optional system prompt override
- `apiKey`: optional explicit API key override

### `resolveConfig(options)`

Resolves the effective provider, model, and system prompt from arguments and environment defaults.

### `detectProvider(explicitProvider)`

Returns the selected provider based on an explicit provider or env-var priority.

### `getDefaultModel(provider)`

Returns the default model for a provider.
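The per-provider defaults can be pictured as a simple lookup table, using the model names documented under "Environment variables". This is an illustrative sketch, not the package's source:

```js
// Illustrative sketch: per-provider default models, as documented
// under "Environment variables" (not the package's actual source).
const DEFAULT_MODELS = {
  openai: 'gpt-5.4-nano',
  anthropic: 'claude-4-haiku-20250305',
  openrouter: 'google/gemini-3.1-flash-lite-preview',
};

function defaultModelFor(provider) {
  return DEFAULT_MODELS[provider] ?? null;
}
```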
## Publish to npm

When you're ready:

```sh
npm publish
```

Because `publishConfig.access` is set to `public`, this package is ready for public npm publishing.
## License

MIT
