llm-catalog v1.5.0
# LLM Catalog CLI
CLI tool to download LLM model lists from various providers (OpenAI, Anthropic, Google Gemini) and save them as JSON files.
## Quick Start
```bash
npx llm-catalog
```

## Requirements
- Node.js v18+
- At least one API key (set as environment variables):
  - `OPENAI_API_KEY` - for OpenAI models
  - `ANTHROPIC_API_KEY` - for Claude models
  - `GOOGLE_API_KEY` or `GEMINI_API_KEY` - for Gemini models
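To see at a glance which of these keys are available in your current shell, a small bash helper like the following can be used (this is a convenience sketch, not part of the CLI; the function name `check_llm_keys` is made up for this example):

```shell
# Report which provider API keys are set in the current shell (bash).
check_llm_keys() {
  for key in OPENAI_API_KEY ANTHROPIC_API_KEY GOOGLE_API_KEY GEMINI_API_KEY; do
    # ${!key} is bash indirect expansion: the value of the variable named by $key
    if [ -n "${!key}" ]; then
      echo "$key: set"
    else
      echo "$key: missing"
    fi
  done
}

check_llm_keys
```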
## Usage
### Command Line Options
```text
npx llm-catalog [options]

Options:
  --output <filename>     Save to file (default: stdout)
  --provider <provider>   Single: openai, anthropic, gemini
                          Multiple: openai,anthropic,gemini
  --no-filter             Skip interactive mode, output all models
                          (automatically enables quiet mode for pipelines)
  -h, --help              Show help message
  -v, --version           Show version information
```

### Examples
```bash
# Interactive mode (default) - with progress messages and model selection
npx llm-catalog

# Save specific provider to file
npx llm-catalog --provider openai --no-filter --output models.json

# Pipeline usage - clean JSON output without progress messages
npx llm-catalog --provider gemini --no-filter | pbcopy
npx llm-catalog --provider openai --no-filter | jq '.OPENAI.data[].id'

# Compare: Interactive vs Pipeline mode
# Interactive: Shows 🚀 📥 ✓ 📊 🎉 messages + user prompts
# Pipeline: Clean JSON only, perfect for automation
```

💡 **Tip:** `--no-filter` automatically suppresses all progress messages, making the output ideal for piping to `jq`, `pbcopy`, or other tools.
## Environment Setup
Set up your API keys:
```bash
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export GOOGLE_API_KEY="your-google-key"
```

## Supported Providers
- OpenAI - GPT-4, GPT-3.5-turbo, etc.
- Anthropic - Claude 3 series
- Google Gemini - Gemini Pro, etc.
## Output
Produces JSON with model data from the selected providers (written to stdout by default, or to a file with `--output`):
```json
{
  "OPENAI": {
    "object": "list",
    "data": [
      {
        "id": "gpt-4.1",
        "object": "model",
        "created": 1744316542,
        "owned_by": "system"
      }
    ]
  },
  "ANTHROPIC": {
    "data": [
      {
        "type": "model",
        "id": "claude-sonnet-4-20250514",
        "display_name": "Claude Sonnet 4",
        "created_at": "2025-05-22T00:00:00Z"
      }
    ],
    "has_more": false,
    "first_id": "claude-opus-4-20250514",
    "last_id": "claude-2.0"
  },
  "GEMINI": {
    "models": [
      {
        "name": "models/gemini-2.5-pro",
        "version": "2.5",
        "displayName": "Gemini 2.5 Pro",
        "description": "Stable release (June 17th, 2025) of Gemini 2.5 Pro",
        "inputTokenLimit": 1048576,
        "outputTokenLimit": 65536,
        "supportedGenerationMethods": [
          "generateContent",
          "countTokens",
          "createCachedContent",
          "batchGenerateContent"
        ],
        "temperature": 1,
        "topP": 0.95,
        "topK": 64,
        "maxTemperature": 2,
        "thinking": true
      }
    ],
    "nextPageToken": "Chttb2RlbHMvZ2VtaW5pLWVtYmVkZGluZy0wMDE="
  }
}
```

## Troubleshooting
- **No API keys detected**: Check your environment variables
- **API call failed**: Verify your API keys and internet connection
## Development
This project was developed with AI assistance using VS Code + Claude Sonnet 4. Code, comments, commits, and documentation were generated through AI collaboration.
Note: English expressions in this project may contain unnatural phrasing as they were written or translated through AI assistance.
## License
ISC
