# Fake List Generator
A Node.js CLI tool that generates lists using OpenAI-compatible APIs (like OpenRouter).
Disclosure: This entire project was generated by AI using Claude Sonnet 4. All code, documentation, and project structure were created automatically based on the user's requirements.
## Installation
```bash
npm install
```

## Usage
### Basic Usage
```bash
fake-list-llm 20 "band names"
```

This will generate 20 band names using the default model (`qwen/qwen-turbo`) and the default OpenRouter endpoint.
### Command Line Options
- `-m, --model <model>`: Model to use (overrides config file)
- `-e, --endpoint <url>`: API endpoint URL (overrides config file)
- `-k, --api-key <key>`: API key (overrides config file and environment variable)
- `-p, --prompt <prompt>`: Custom prompt template (overrides config file)
- `--verbose`: Enable verbose output (overrides config file)
- `-c, --config <path>`: Path to custom config file
- `--init-config`: Create default user config file
- `--show-config-paths`: Show config file search paths
### Examples
```bash
# Generate 10 colors
fake-list-llm 10 "colors"

# Use a different model
fake-list-llm 15 "animals" --model "anthropic/claude-3-haiku"

# Use a custom endpoint
fake-list-llm 5 "fruits" --endpoint "https://api.openai.com/v1"

# Use a custom prompt
fake-list-llm 8 "programming languages" --prompt "List {count} {concept} that are popular in 2024:"

# Verbose output
fake-list-llm 12 "desserts" --verbose
```

## Configuration Files
The tool supports configuration files in TOML format, layered from lowest to highest priority (see the merge sketch after this list):

- System config (read-only, for administrators)
- User config (your personal settings)
- Override config (specified with `--config`)
- Command line options (highest priority)
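
As a rough illustration of that precedence, the merge below keeps the last value seen for each key, so later layers win. This is a TypeScript sketch, not the tool's actual implementation; the option names mirror the config keys documented under Configuration Options.

```typescript
// Sketch of the layering order described above; later layers override earlier ones.
interface Config {
  model?: string;
  endpoint?: string;
  apiKey?: string;
  prompt?: string;
  verbose?: boolean;
}

function mergeConfigs(...layers: Config[]): Config {
  // Object spread keeps the last value seen for each key, so the final
  // layer (command line options) has the highest priority.
  return layers.reduce<Config>((merged, layer) => ({ ...merged, ...layer }), {});
}

// system -> user -> override (--config) -> command line
const effective = mergeConfigs(
  { model: "qwen/qwen-turbo" },                  // system config
  { verbose: true },                             // user config
  { endpoint: "https://openrouter.ai/api/v1" },  // --config override
  { model: "anthropic/claude-3-haiku" },         // CLI flags
);
console.log(effective.model); // "anthropic/claude-3-haiku"
```
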
### Config File Locations
The tool follows the XDG Base Directory specification:
Linux:

- System: `/etc/xdg/fake-list-llm/config.toml`
- User: `~/.config/fake-list-llm/config.toml`

macOS:

- System: `/Library/Preferences/fake-list-llm/config.toml`
- User: `~/Library/Preferences/fake-list-llm/config.toml`

Windows:

- System: `C:\ProgramData\fake-list-llm\config.toml`
- User: `%APPDATA%\fake-list-llm\config.toml`
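
A rough TypeScript sketch of how the user config path could be resolved per platform is shown below. It mirrors the locations listed above but is an illustration, not the package's source, and whether an `XDG_CONFIG_HOME` override is honoured is an assumption here.

```typescript
import os from "node:os";
import path from "node:path";

// Illustrative resolution of the user config path from the locations listed above.
function userConfigPath(): string {
  const home = os.homedir();
  switch (process.platform) {
    case "win32":
      return path.join(
        process.env.APPDATA ?? path.join(home, "AppData", "Roaming"),
        "fake-list-llm",
        "config.toml",
      );
    case "darwin":
      return path.join(home, "Library", "Preferences", "fake-list-llm", "config.toml");
    default: // Linux and other XDG platforms
      return path.join(
        process.env.XDG_CONFIG_HOME ?? path.join(home, ".config"),
        "fake-list-llm",
        "config.toml",
      );
  }
}

console.log(userConfigPath());
```
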
### Creating Your Config File
```bash
# Create default user config file
fake-list-llm --init-config

# Show where config files are located
fake-list-llm --show-config-paths
```

### Configuration Options
All options can be set in config files:
```toml
# AI model to use
model = "qwen/qwen-turbo"

# API endpoint URL
endpoint = "https://openrouter.ai/api/v1"

# API key (leave empty to use environment variable)
# apiKey = "your-api-key-here"

# Default prompt template
prompt = "Generate a list of {count} {concept}. Each item should be on a new line, numbered from 1 to {count}."

# Enable verbose output by default
verbose = false
```
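
The `{count}` and `{concept}` placeholders in the prompt template are presumably substituted before the request is sent. The snippet below is a minimal illustration of that substitution; `renderPrompt` is a hypothetical helper, not a function exported by this package.

```typescript
// Illustrative substitution of the documented {count} and {concept} placeholders.
function renderPrompt(template: string, count: number, concept: string): string {
  return template
    .replaceAll("{count}", String(count))
    .replaceAll("{concept}", concept);
}

const template =
  "Generate a list of {count} {concept}. Each item should be on a new line, numbered from 1 to {count}.";

console.log(renderPrompt(template, 20, "band names"));
// -> "Generate a list of 20 band names. Each item should be on a new line, numbered from 1 to 20."
```
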
## Environment Variables

Set your OpenRouter API key:
```bash
export OPENROUTER_API_KEY="your-api-key-here"
```

## Features
- Streaming responses: Results are streamed to the terminal as they're generated (see the sketch after this list)
- Flexible configuration: Customizable model, endpoint, API key, and prompt via config files or CLI
- Configuration layering: System → User → Override → CLI options (highest priority)
- XDG-compliant paths: Follows standard configuration file locations across platforms
- OpenAI-compatible: Works with any OpenAI-compatible API endpoint
- Error handling: Clear error messages for common issues
- Verbose mode: Optional detailed output for debugging
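
For the streaming and OpenAI-compatible points above, the request presumably amounts to a standard `/chat/completions` call with `stream: true`. The sketch below illustrates that pattern with the documented defaults; it is not the package's actual source, it assumes Node 18+ for the global `fetch`, and the SSE parsing is simplified (it assumes each `data:` line arrives whole within a chunk).

```typescript
// Illustrative streaming request against an OpenAI-compatible endpoint (not the package's source).
async function streamList(count: number, concept: string): Promise<void> {
  const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    },
    body: JSON.stringify({
      model: "qwen/qwen-turbo",
      stream: true,
      messages: [
        {
          role: "user",
          content: `Generate a list of ${count} ${concept}. Each item should be on a new line, numbered from 1 to ${count}.`,
        },
      ],
    }),
  });

  // OpenAI-compatible APIs stream Server-Sent Events; print each content delta as it arrives.
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (!line.startsWith("data: ") || line.includes("[DONE]")) continue;
      const delta = JSON.parse(line.slice(6)).choices?.[0]?.delta?.content;
      if (delta) process.stdout.write(delta);
    }
  }
  process.stdout.write("\n");
}

streamList(20, "band names").catch(console.error);
```
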
## Requirements
- Node.js 16.0.0 or higher
- Valid API key for your chosen provider
