# LLM Launcher
Cross-platform CLI launcher for LLM agents (Aider) with support for Ollama (local) and OpenRouter (cloud) models.
## Features
- 🚀 Cross-platform (Windows, macOS, Linux)
- 🤖 Support for Ollama local models
- ☁️ Support for OpenRouter cloud models
- 🎨 Interactive model selection with nice CLI interface
- ⚙️ Automatic dependency checking
- 🔐 Environment variable support via `.env` files
- 📦 TypeScript-based for type safety
## Prerequisites
- Node.js >= 16
- Aider installed (`pip install aider-chat`)
- Ollama (optional, for local models)
- OpenRouter API key (optional, for cloud models)
## Installation

### Global Installation (Recommended)
```bash
npm install -g llm-launcher
```

Then run from anywhere:

```bash
llm
# or
llm-launcher
```

### Local Development
```bash
git clone <repository-url>
cd llm-launcher
npm install
npm run build
npm link
```

### Quick Start (without installing)

```bash
npx llm-launcher
```

## Usage
### Basic Usage
Simply run the command and follow the interactive prompts:
```bash
llm
```

The launcher will:
- Check if Aider is installed
- Prompt you to select a provider (Ollama or OpenRouter)
- Show available models and let you select one
- Launch Aider with your selected model
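These steps ultimately boil down to assembling an `aider` command line. A minimal sketch of that last step; the function and type names here are illustrative assumptions, not llm-launcher's actual internals:

```typescript
// Illustrative sketch, not llm-launcher's real source. Aider (via LiteLLM)
// expects provider-prefixed model names such as "ollama/llama3.3" or
// "openrouter/anthropic/claude-3.5-sonnet".
import { spawn } from "node:child_process";

type Provider = "ollama" | "openrouter";

export function buildAiderArgs(provider: Provider, model: string): string[] {
  const prefix = `${provider}/`;
  // Avoid double-prefixing if the user already typed a qualified name.
  const qualified = model.startsWith(prefix) ? model : prefix + model;
  return ["--model", qualified];
}

export function launchAider(provider: Provider, model: string): void {
  // Inherit stdio so Aider's own interactive UI takes over the terminal.
  spawn("aider", buildAiderArgs(provider, model), { stdio: "inherit" });
}
```

For example, `buildAiderArgs("ollama", "llama3.3")` yields `["--model", "ollama/llama3.3"]`.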
### Using Ollama (Local Models)
- Select "Ollama (Local models)" from the provider menu
- The launcher will automatically start Ollama if it's not running
- Select from your locally pulled models
- Aider will launch with the selected model
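Model discovery can be done through Ollama's REST API, which answers `GET /api/tags` with `{ "models": [ { "name": "llama3.3:latest", ... }, ... ] }`. A sketch with assumed helper names (not the actual `src/ollama.ts`):

```typescript
// Sketch of listing locally pulled Ollama models; names are assumptions.
interface OllamaTagsResponse {
  models: { name: string }[];
}

export function extractModelNames(body: OllamaTagsResponse): string[] {
  // Drop the default ":latest" suffix for a cleaner selection menu.
  return body.models.map((m) => m.name.replace(/:latest$/, ""));
}

export async function listLocalModels(
  base: string = process.env.OLLAMA_API_BASE ?? "http://localhost:11434"
): Promise<string[]> {
  const res = await fetch(`${base}/api/tags`); // rejects if Ollama is not running
  return extractModelNames((await res.json()) as OllamaTagsResponse);
}
```

A failed fetch here is also a cheap way to detect that Ollama is not running before attempting to start it.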
First-time Ollama users:
```bash
# Pull a model first
ollama pull llama3.3
# or
ollama pull qwen2.5-coder
# or
ollama pull deepseek-coder
```

### Using OpenRouter (Cloud Models)
- Select "OpenRouter (Cloud API)" from the provider menu
- Enter your OpenRouter API key (or set via environment variable)
- Select from popular models or enter a custom model name
- Aider will launch with the selected model
Get an OpenRouter API key: https://openrouter.ai
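The key-resolution order described above (a key typed at the prompt wins, otherwise `OPENROUTER_API_KEY` from the environment or a loaded `.env` file) could be sketched like this; the function name is illustrative, not the actual implementation:

```typescript
// Hedged sketch of API-key resolution; not llm-launcher's real source.
export function resolveOpenRouterKey(
  promptInput: string | undefined,
  env: Record<string, string | undefined> = process.env
): string {
  // A non-empty interactive entry takes precedence over the environment.
  const key = promptInput?.trim() || env.OPENROUTER_API_KEY;
  if (!key) {
    throw new Error(
      "No OpenRouter API key: enter one at the prompt or set OPENROUTER_API_KEY"
    );
  }
  return key;
}
```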
## Configuration

### Environment Variables
Create a `.env` file in your project directory or set environment variables:

```
# OpenRouter API key
OPENROUTER_API_KEY=your_api_key_here

# Ollama API base (optional, defaults to http://localhost:11434)
OLLAMA_API_BASE=http://localhost:11434
```

## Supported Models
### Ollama

Any model you've pulled locally via `ollama pull <model-name>`

### OpenRouter
- Anthropic Claude (3.5 Sonnet, 3 Opus, 3 Sonnet, 3 Haiku)
- OpenAI GPT (GPT-4 Turbo, GPT-4, GPT-3.5 Turbo)
- Google Gemini (Pro 1.5, Pro)
- Meta Llama (3.1 70B, 3.1 8B)
- Mistral AI (Large, Medium)
- Qwen (2.5 72B)
- Custom models (enter any OpenRouter model identifier)
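For the custom-model path, OpenRouter identifiers follow a `vendor/model-name` shape (e.g. `anthropic/claude-3.5-sonnet`), so a cheap sanity check is possible before launching. This is an illustrative assumption, not llm-launcher's actual validation:

```typescript
// Hypothetical shape check for OpenRouter model identifiers.
// Accepts "vendor/model-name" forms like "anthropic/claude-3.5-sonnet".
export function looksLikeOpenRouterModelId(id: string): boolean {
  return /^[\w.-]+\/[\w.:-]+$/.test(id);
}
```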
## Development

### Build

```bash
npm run build
```

### Watch mode

```bash
npm run watch
```

### Run in development

```bash
npm run dev
```

## Publishing to npm
### First-time Setup

- Create an npm account at https://www.npmjs.com/signup
- Login to npm from the command line:

```bash
npm login
```

### Publishing a New Version
- Update the version in `package.json` (or use `npm version`):
```bash
# Patch release (0.1.0 -> 0.1.1)
npm version patch

# Minor release (0.1.0 -> 0.2.0)
npm version minor

# Major release (0.1.0 -> 1.0.0)
npm version major
```

- Publish to npm:

```bash
npm publish
```

The `prepare` script automatically builds the project before publishing.
### Publishing a Beta/Pre-release

```bash
npm version prerelease --preid=beta
npm publish --tag beta
```

Users can install beta versions with:

```bash
npm install -g llm-launcher@beta
```

## Architecture
```
src/
├── index.ts       # Main entry point and provider selection
├── types.ts       # TypeScript type definitions
├── utils.ts       # Utility functions (process checking, command execution)
├── ollama.ts      # Ollama provider implementation
└── openrouter.ts  # OpenRouter provider implementation
```

## Why TypeScript?
- Type safety catches errors at compile time
- Better IDE support and autocomplete
- Self-documenting code with interfaces
- Easier refactoring and maintenance
- Still compiles to standard JavaScript
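As an illustration of the interface-driven style these bullets describe, a provider contract might look like the following; the names are illustrative, not the actual contents of `src/types.ts`:

```typescript
// Sketch of a provider interface; names are assumptions, not real source.
export interface ModelProvider {
  /** Display name shown in the provider menu. */
  name: string;
  /** List the models the user can choose from. */
  listModels(): Promise<string[]>;
  /** Turn a chosen model into an Aider-compatible identifier. */
  qualify(model: string): string;
}

// Example conforming implementation for the Ollama side (placeholder data).
export const ollamaProvider: ModelProvider = {
  name: "Ollama (Local models)",
  listModels: async () => ["llama3.3", "qwen2.5-coder"],
  qualify: (model) => `ollama/${model}`,
};
```

With both providers behind one interface, the main entry point only needs to know which provider the user picked.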
## Troubleshooting

### "Aider is not installed"

Install Aider:

```bash
pip install aider-chat
```

### "Ollama is not installed"
Download and install Ollama from: https://ollama.ai
### "No models found in Ollama"

Pull a model first:

```bash
ollama pull llama3.3
```

### Windows-specific issues
Make sure you're running in PowerShell or Command Prompt with proper permissions.
## License

MIT

## Author

Thomas Powell

## Contributing
Contributions welcome! Please feel free to submit issues or pull requests.
