AIX
Run Claude Code with local AI models. No API keys. No cloud. Complete privacy.
Getting Started · Documentation · Contributing · Changelog
What is AIX?
AIX CLI is a bridge between local model servers and AI coding assistants. It connects LM Studio or Ollama to Claude Code — letting you use locally-running language models as the backend for your favorite AI dev tools.
No API keys. No cloud calls. No data leaving your machine.
```
┌──────────────────────────────────────────────────┐
│ $ aix-cli run                                    │
│                                                  │
│ ? Select model backend: Ollama                   │
│ ✔ Connected to Ollama                            │
│ ✔ Model selected: qwen2.5-coder:14b              │
│ ✔ Launching Claude Code...                       │
│                                                  │
│         Your code stays local. Always.           │
└──────────────────────────────────────────────────┘
```

Why AIX?
- 🔒 Privacy-first — All inference runs locally on your hardware. Your code never leaves your machine.
- 🔑 No API keys — No subscriptions, no usage limits, no cloud dependencies.
- 🚀 GPU-accelerated — Take advantage of your local GPU for fast inference.
- 🔀 Single provider — Claude Code is the only supported AI coding assistant.
- ⚡ Zero config — Just run `aix-cli run` and start coding.
Compatibility notes
- LM Studio: Works natively with Claude Code via an Anthropic-compatible API at `/v1`.
- Ollama: Use the `--native` flag to leverage Ollama's built-in `ollama launch claude` integration, which handles all API configuration automatically.

```
# Recommended for Ollama
aix-cli run --ollama --native -m qwen2.5-coder:14b
```

Getting Started
Prerequisites
| Requirement         | Description                                |
| ------------------- | ------------------------------------------ |
| Node.js ≥ 18        | JavaScript runtime                         |
| LM Studio or Ollama | Local model server (at least one required) |
| Claude Code         | AI coding assistant                        |
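As a quick sanity check before installing, you can confirm the prerequisites are on your `PATH`. This is a sketch: `claude` is assumed here as the Claude Code binary name, and only one of the two model servers is actually required.

```shell
# Report which prerequisites are present (only one model server is required).
for tool in node ollama claude; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
  fi
done
```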
Install
```
# npm
npm install -g @iamharshil/aix-cli

# Yarn
yarn global add @iamharshil/aix-cli

# pnpm
pnpm add -g @iamharshil/aix-cli
```

Or install from source:

```
git clone https://github.com/iamharshil/aix-cli.git
cd aix-cli
npm install
npm run build
npm link
```

Verify

```
aix-cli doctor
```

This checks that LM Studio / Ollama, Claude Code, and your environment are properly configured.
Usage
aix-cli run — Start a coding session
The primary command. Launches Claude Code backed by a local model.
```
# Interactive — prompts for backend and model
aix-cli run

# Specify backend and model
aix-cli run -b ollama -m qwen2.5-coder:14b
aix-cli run -b lmstudio -m llama-3-8b

# Use Ollama's native Claude Code integration (recommended for Ollama)
aix-cli run -b ollama --native -m qwen2.5-coder:14b

# Global shortcuts
aix-cli run --ollama --native -m gemma4

# Pass a prompt directly
aix-cli run -b ollama -m qwen2.5-coder:14b -- "Refactor auth middleware"
```

aix-cli init — Set up backend and model
Configure your preferred backend and load/select a model.
```
aix-cli init                                       # Interactive setup
aix-cli init -b ollama -m qwen2.5-coder:14b        # Ollama with specific model
aix-cli init -b lmstudio -m llama-3-8b -p claude   # Full config in one command
```

aix-cli status — Check what's running
Shows status for both LM Studio and Ollama, including available and running models.
```
aix-cli status
```

aix-cli doctor — Infrastructure check
Checks that the provider is running, the port is accessible, the model is available, and the Claude config is correct.

```
aix-cli doctor
```

aix-cli setup — One-command setup
Quick setup for first-time users. Detects installed providers and configures defaults.
```
aix-cli setup                     # Interactive setup
aix-cli setup --provider ollama   # Use specific provider
aix-cli setup --force             # Overwrite existing config
```

aix-cli providers — Manage providers
List available providers or set a default.
```
aix-cli providers list         # Show providers with status
aix-cli providers set ollama   # Set default provider
```

aix-cli models — List models
Fetch and display available models from a provider.
```
aix-cli models list --provider ollama
aix-cli models list --provider lmstudio
```

aix-cli switch — Switch provider
Instantly switch providers without breaking your Claude setup.

```
aix-cli switch ollama
aix-cli switch lmstudio
```

aix-cli disconnect — Disconnect
Cleanly remove the connection.

```
aix-cli disconnect claude
```

aix-cli fix — Fix issues
Fixes infrastructure issues: suggests starting backends, corrects ports, resets config, and fixes the model selection.

```
aix-cli fix
```

Command Reference
| Command | Aliases | Description |
| ------------ | --------------- | ------------------------------------------------ |
| run | r | Run Claude Code with a local model |
| init | i, load | Set up backend, select model, configure provider |
| status | s, stats | Show active provider, tool, endpoint, model |
| doctor | d, check | Check infrastructure status |
| setup | | One-command default setup |
| providers | | List or set default provider |
| models | | List available models |
| switch | | Switch to a different provider |
| disconnect | | Disconnect from provider |
| fix | | Fix infrastructure issues |
| update | upgrade, u | Update AIX CLI to the latest version |
| config | c, settings | View, set, or reset CLI configurations |
Global Options
| Flag | Description |
| ---------------------- | ------------------------------------------- |
| -v, --version | Show version number |
| -h, --help | Display help |
| -b, --backend <name> | Model backend: lmstudio or ollama |
| -m, --model <name> | Model name or ID to use |
| -n, --native | Use Ollama's native Claude Code integration |
| -V, --verbose | Show verbose output |
Configuration
AIX stores its configuration in the OS-appropriate config directory:
| Platform | Path |
| -------- | ---------------------------------------- |
| macOS | ~/Library/Application Support/aix-cli/ |
| Linux | ~/.config/aix-cli/ |
| Windows | %APPDATA%\aix-cli\ |
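The table above can be resolved at the shell, e.g. with this sketch (it only prints the expected path for the current platform, it does not create or validate it):

```shell
# Print the expected AIX config directory for the current platform.
case "$(uname -s)" in
  Darwin) echo "$HOME/Library/Application Support/aix-cli/" ;;
  Linux)  echo "$HOME/.config/aix-cli/" ;;
  *)      echo '%APPDATA%\aix-cli\ (Windows)' ;;
esac
```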
Config File
```json
{
  "lmStudioUrl": "http://localhost",
  "lmStudioPort": 1234,
  "ollamaUrl": "http://localhost",
  "ollamaPort": 11434,
  "defaultTimeout": 30000,
  "defaultBackend": "ollama",
  "defaultProvider": "claude",
  "model": "qwen2.5-coder:14b"
}
```

Environment Variables
| Variable | Description | Default |
| ---------------- | ---------------------------------- | ------- |
| LM_STUDIO_PORT | Override the LM Studio server port | 1234 |
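For example, to point AIX at an LM Studio server on a non-default port for a single invocation. The port resolution below mirrors the documented default; `4321` is just an illustrative value:

```shell
# The env var wins over the default of 1234; unset, the default applies.
port="${LM_STUDIO_PORT:-1234}"
echo "LM Studio port: $port"

# Typical usage with the override applied:
# LM_STUDIO_PORT=4321 aix-cli run -b lmstudio -m llama-3-8b
```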
How It Works
```
┌───────────────────┐      ┌───────────────────┐
│     LM Studio     │      │      Ollama       │
│    (port 1234)    │      │   (port 11434)    │
└────────┬──────────┘      └────────┬──────────┘
         │                          │
      REST API                   REST API
         │                          │
         └───────────┬──────────────┘
                     │
          ┌──────────┴────────┐
          │      AIX CLI      │
          │  backend routing  │
          │  model selection  │
          │    config mgmt    │
          └──────────┬────────┘
                     │
                     ▼
            ┌──────────────┐
            │  Claude Code │
            │  --model X   │
            └──────────────┘
```

- LM Studio or Ollama runs a local inference server with an OpenAI-compatible API.
- AIX CLI discovers available models, manages configuration, and orchestrates the connection.
- Claude Code receives the model endpoint and runs as it normally would — except fully local.
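You can check the first point above directly: both servers expose an HTTP API on `localhost`, so a quick probe shows which backend is up. A sketch, using the servers' own well-known model-listing endpoints (not part of AIX itself); only one needs to respond:

```shell
# Probe each backend's default model-listing endpoint.
for url in "http://localhost:11434/api/tags" "http://localhost:1234/v1/models"; do
  if curl -fsS --max-time 2 "$url" >/dev/null 2>&1; then
    echo "up:   $url"
  else
    echo "down: $url"
  fi
done
```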
Troubleshooting
LM Studio server not running

- Open LM Studio
- Navigate to the Server tab (left sidebar)
- Click Start Server
- Confirm with `aix-cli status`

Ollama not running

- Install Ollama from ollama.com
- Start the server: `ollama serve`
- Pull a model: `ollama pull qwen2.5-coder:14b`
- Confirm with `aix-cli status`

No models available

- LM Studio: Open LM Studio → Search tab → download a model.
- Ollama: Run `ollama pull <model>` to download a model (e.g., `ollama pull llama3.2`).

Then run `aix-cli init` to select and configure.

Wrong port

Check that the correct port is being used:

- LM Studio defaults to port `1234`
- Ollama defaults to port `11434`

You can configure custom ports in your AIX config file (path shown by `aix-cli doctor`).

Claude Code not installed

Install Claude Code globally:

```
npm install -g @anthropic-ai/claude-code
```

Then re-run `aix-cli doctor` to confirm.
Security & Privacy
AIX is designed around a simple principle: your code never leaves your machine.
- ✅ All AI inference runs locally via LM Studio or Ollama
- ✅ No telemetry, analytics, or tracking of any kind
- ✅ No outbound network calls (except to `localhost`)
- ✅ No API keys or accounts required
- ✅ Fully open-source — audit the code yourself
Found a vulnerability? Please report it responsibly via our Security Policy.
Contributing
Contributions are welcome! See CONTRIBUTING.md for guidelines on how to get started.
```
git clone https://github.com/iamharshil/aix-cli.git
cd aix-cli
npm install
npm run dev    # Run in development mode
npm test       # Run tests
npm run lint   # Lint
```

Related Projects
- LM Studio — Run local AI models with a visual interface
- Ollama — Run large language models locally
- Claude Code — Anthropic's AI coding assistant
