
@iamharshil/aix-cli

v4.0.3

Published

Run Claude Code with local AI models via LM Studio or Ollama — no API keys, no cloud, complete privacy

Downloads

1,908

Readme

AIX

Run Claude Code with local AI models. No API keys. No cloud. Complete privacy.


Getting Started · Documentation · Contributing · Changelog


What is AIX?

AIX CLI is a bridge between local model servers and AI coding assistants. It connects LM Studio or Ollama to Claude Code — letting you use locally running language models as the backend for your favorite AI dev tools.

No API keys. No cloud calls. No data leaving your machine.

┌──────────────────────────────────────────────────┐
│  $ aix-cli run                                   │
│                                                  │
│  ? Select model backend: Ollama                  │
│  ✔ Connected to Ollama                           │
│  ✔ Model selected: qwen2.5-coder:14b             │
│  ✔ Launching Claude Code...                      │
│                                                  │
│  Your code stays local. Always.                  │
└──────────────────────────────────────────────────┘

Why AIX?

  • 🔒 Privacy-first — All inference runs locally on your hardware. Your code never leaves your machine.
  • 🔑 No API keys — No subscriptions, no usage limits, no cloud dependencies.
  • 🚀 GPU-accelerated — Take advantage of your local GPU for fast inference.
  • 🔀 Focused — Built for Claude Code, currently the only supported AI coding assistant.
  • ⚡ Zero config — Just run aix-cli run and start coding.

Compatibility notes

  • LM Studio: Works natively with Claude Code via Anthropic-compatible API at /v1.
  • Ollama: Use the --native flag to leverage Ollama's built-in ollama launch claude integration, which handles all API configuration automatically.
# Recommended for Ollama
aix-cli run --ollama --native -m qwen2.5-coder:14b

Getting Started

Prerequisites

| Requirement         | Description                                |
| ------------------- | ------------------------------------------ |
| Node.js ≥ 18        | JavaScript runtime                         |
| LM Studio or Ollama | Local model server (at least one required) |
| Claude Code         | AI coding assistant                        |
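A quick way to sanity-check these prerequisites from a terminal (a sketch; the tool names `node`, `ollama`, and `claude` are assumptions based on the table above — adjust to your setup):

```shell
# Sanity-check the prerequisites (tool names assumed; adjust to your setup)
check() { command -v "$1" >/dev/null 2>&1 && echo "found" || echo "missing"; }

echo "node:   $(check node)"     # Node.js >= 18
echo "ollama: $(check ollama)"   # local model server (or LM Studio)
echo "claude: $(check claude)"   # Claude Code CLI
```

aix-cli doctor (below) performs a more thorough version of this check for you.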

Install

# npm
npm install -g @iamharshil/aix-cli

# Yarn
yarn global add @iamharshil/aix-cli

# pnpm
pnpm add -g @iamharshil/aix-cli

# From source
git clone https://github.com/iamharshil/aix-cli.git
cd aix-cli
npm install
npm run build
npm link

Verify

aix-cli doctor

This checks that LM Studio / Ollama, Claude Code, and your environment are properly configured.


Usage

aix-cli run — Start a coding session

The primary command. Launches Claude Code backed by a local model.

# Interactive — prompts for backend and model
aix-cli run

# Specify backend and model
aix-cli run -b ollama -m qwen2.5-coder:14b
aix-cli run -b lmstudio -m llama-3-8b

# Use Ollama's native Claude Code integration (recommended for Ollama)
aix-cli run -b ollama --native -m qwen2.5-coder:14b

# Global shortcuts
aix-cli run --ollama --native -m gemma4

# Pass a prompt directly
aix-cli run -b ollama -m qwen2.5-coder:14b -- "Refactor auth middleware"

aix-cli init — Set up backend and model

Configure your preferred backend and load/select a model.

aix-cli init                                    # Interactive setup
aix-cli init -b ollama -m qwen2.5-coder:14b     # Ollama with specific model
aix-cli init -b lmstudio -m llama-3-8b -p claude # Full config in one command

aix-cli status — Check what's running

Shows status for both LM Studio and Ollama, including available and running models.

aix-cli status

aix-cli doctor — Infrastructure check

Checks that provider is running, port is accessible, model is available, and Claude config is correct.

aix-cli doctor

aix-cli setup — One-command setup

Quick setup for first-time users. Detects installed providers and configures defaults.

aix-cli setup                     # Interactive setup
aix-cli setup --provider ollama   # Use a specific provider
aix-cli setup --force             # Overwrite existing config

aix-cli providers — Manage providers

List available providers or set a default.

aix-cli providers list              # Show providers with status
aix-cli providers set ollama        # Set default provider

aix-cli models — List models

Fetch and display available models from a provider.

aix-cli models list --provider ollama
aix-cli models list --provider lmstudio

aix-cli switch — Switch provider

Instantly switch providers without breaking Claude setup.

aix-cli switch ollama
aix-cli switch lmstudio

aix-cli disconnect — Disconnect

Cleanly remove a provider connection.

aix-cli disconnect claude

aix-cli fix — Fix issues

Diagnoses infrastructure issues and suggests fixes: starting backends, correcting ports, resetting config, and fixing model selection.

aix-cli fix

Command Reference

| Command      | Aliases         | Description                                      |
| ------------ | --------------- | ------------------------------------------------ |
| run          | r               | Run Claude Code with a local model               |
| init         | i, load         | Set up backend, select model, configure provider |
| status       | s, stats        | Show active provider, tool, endpoint, model      |
| doctor       | d, check        | Check infrastructure status                      |
| setup        |                 | One-command default setup                        |
| providers    |                 | List or set default provider                     |
| models       |                 | List available models                            |
| switch       |                 | Switch to a different provider                   |
| disconnect   |                 | Disconnect from provider                         |
| fix          |                 | Fix infrastructure issues                        |
| update       | upgrade, u      | Update AIX CLI to the latest version             |
| config       | c, settings     | View, set, or reset CLI configurations           |

Global Options

| Flag                   | Description                                 |
| ---------------------- | ------------------------------------------- |
| -v, --version          | Show version number                         |
| -h, --help             | Display help                                |
| -b, --backend <name>   | Model backend: lmstudio or ollama           |
| -m, --model <name>     | Model name or ID to use                     |
| -n, --native           | Use Ollama's native Claude Code integration |
| -V, --verbose          | Show verbose output                         |


Configuration

AIX stores its configuration in the OS-appropriate config directory:

| Platform | Path                                     |
| -------- | ---------------------------------------- |
| macOS    | ~/Library/Application Support/aix-cli/   |
| Linux    | ~/.config/aix-cli/                       |
| Windows  | %APPDATA%\aix-cli\                       |
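As a rough sketch, the table above corresponds to the following resolution logic (shown in shell for illustration; the actual CLI computes this internally, and honoring XDG_CONFIG_HOME on Linux is an assumption, not confirmed behavior):

```shell
# Sketch: resolve the per-platform config directory from the table above
case "$(uname -s)" in
  Darwin) AIX_CONFIG_DIR="$HOME/Library/Application Support/aix-cli" ;;
  Linux)  AIX_CONFIG_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/aix-cli" ;;
  *)      AIX_CONFIG_DIR="$APPDATA/aix-cli" ;;   # Windows shells expose %APPDATA%
esac
echo "$AIX_CONFIG_DIR"
```

If in doubt, aix-cli doctor prints the config path it is actually using.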

Config File

{
  "lmStudioUrl": "http://localhost",
  "lmStudioPort": 1234,
  "ollamaUrl": "http://localhost",
  "ollamaPort": 11434,
  "defaultTimeout": 30000,
  "defaultBackend": "ollama",
  "defaultProvider": "claude",
  "model": "qwen2.5-coder:14b"
}

Environment Variables

| Variable         | Description                        | Default |
| ---------------- | ---------------------------------- | ------- |
| LM_STUDIO_PORT   | Override the LM Studio server port | 1234    |


How It Works

        ┌───────────────────┐    ┌────────────────────┐
        │    LM Studio      │    │      Ollama        │
        │  (port 1234)      │    │  (port 11434)      │
        └────────┬──────────┘    └────────┬───────────┘
                 │                        │
            REST API                 REST API
                 │                        │
                 └───────────┬────────────┘
                             │
                    ┌────────┴──────────┐
                    │     AIX CLI       │
                    │  backend routing  │
                    │  model selection  │
                    │  config mgmt      │
                    └────────┬──────────┘
                             │
                             ▼
                    ┌──────────────┐
                    │  Claude Code │
                    │  --model X   │
                    └──────────────┘
  1. LM Studio or Ollama runs a local inference server with an OpenAI-compatible API.
  2. AIX CLI discovers available models, manages configuration, and orchestrates the connection.
  3. Claude Code receives the model endpoint and runs as it normally would — except fully local.
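You can probe the two default backend ports from the diagram by hand (a manual check only — aix-cli status does this for you; assumes curl is available):

```shell
# Probe the default backend ports (1234 = LM Studio, 11434 = Ollama)
probe() { curl -s --max-time 1 -o /dev/null "http://localhost:$1/" && echo "up" || echo "down"; }

echo "LM Studio (1234):  $(probe 1234)"
echo "Ollama   (11434):  $(probe 11434)"
```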

Troubleshooting

LM Studio server not running

  1. Open LM Studio
  2. Navigate to the Server tab (left sidebar)
  3. Click Start Server
  4. Confirm with aix-cli status

Ollama not running

  1. Install Ollama from ollama.com
  2. Start the server: ollama serve
  3. Pull a model: ollama pull qwen2.5-coder:14b
  4. Confirm with aix-cli status

No models available

LM Studio: Open LM Studio → Search tab → download a model.

Ollama: Run ollama pull <model> to download a model (e.g., ollama pull llama3.2).

Then run aix-cli init to select and configure.

Wrong port in use

Check that the correct port is being used:

  • LM Studio defaults to port 1234
  • Ollama defaults to port 11434

You can configure custom ports in your AIX config file (path shown by aix-cli doctor).

Claude Code not installed

Install Claude Code globally:

npm install -g @anthropic-ai/claude-code

Then re-run aix-cli doctor to confirm.


Security & Privacy

AIX is designed around a simple principle: your code never leaves your machine.

  • ✅ All AI inference runs locally via LM Studio or Ollama
  • ✅ No telemetry, analytics, or tracking of any kind
  • ✅ No outbound network calls (except to localhost)
  • ✅ No API keys or accounts required
  • ✅ Fully open-source — audit the code yourself

Found a vulnerability? Please report it responsibly via our Security Policy.


Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines on how to get started.

git clone https://github.com/iamharshil/aix-cli-cli.git
cd aix-cli
npm install
npm run dev    # Run in development mode
npm test       # Run tests
npm run lint   # Lint

Related Projects

  • LM Studio — Run local AI models with a visual interface
  • Ollama — Run large language models locally
  • Claude Code — Anthropic's AI coding assistant

License

MIT © Harshil