
cclm

v1.0.1 · Published

Claude Code with LM Studio - Use official Claude Code CLI with local LM Studio models

Downloads: 55

Readme

CCLM - Claude Code with LM Studio

Use the official Claude Code CLI with local LM Studio models from anywhere in your terminal

Features

  • ✅ Official Claude Code Interface - Full compatibility with the Claude Code CLI
  • ✅ Local LM Studio Models - Use any model loaded in LM Studio
  • ✅ Global Command - Run cclm from any directory
  • ✅ Auto Proxy Management - Automatic start/stop of the translation proxy
  • ✅ Zero Configuration - Works out of the box with sensible defaults
  • ✅ Configurable - Easy configuration via CLI commands

Prerequisites

  1. LM Studio - Download from https://lmstudio.ai/
  2. Node.js 18+ - Download from https://nodejs.org/

Installation

Global Installation (Recommended)

npm install -g cclm

Local Installation

git clone https://github.com/yourusername/cclm
cd cclm
npm install
npm link

Quick Start

  1. Start LM Studio

    • Open LM Studio
    • Load a model (e.g., Granite 4.0 H 1B, Llama 3.1 8B)
    • Go to "Local Server" tab
    • Click "Start Server"
  2. Run CCLM

    cclm

That's it! You're now using Claude Code with your local LM Studio model.

Usage

Basic Commands

# Start interactive session
cclm

# Show help
cclm --help

# Show version
cclm --version

# Pass arguments to Claude Code
cclm --model my-model

Configuration

# Show current configuration
cclm config show

# Set LM Studio model
cclm config set lmStudioModel granite-4.0-h-1b

# Set LM Studio URL (if using custom port)
cclm config set lmStudioUrl http://localhost:1234/v1

# Enable debug mode
cclm config set debug true

# Change proxy port
cclm config set proxyPort 3001

# Disable auto-start proxy
cclm config set autoStartProxy false

Configuration Options

| Key | Default | Description |
|-----|---------|-------------|
| lmStudioUrl | http://localhost:1234/v1 | LM Studio API URL |
| lmStudioModel | local-model | Model name in LM Studio |
| proxyPort | 3000 | Port for the proxy server |
| debug | false | Enable debug logging |
| autoStartProxy | true | Auto-start proxy if not running |

How It Works

┌─────────────┐       ┌──────────┐       ┌───────────┐
│  cclm CLI   │ ─────>│  Proxy   │ ─────>│ LM Studio │
│  (Official) │ API   │  Server  │ API   │  Server   │
└─────────────┘       └──────────┘       └───────────┘
   Anthropic           Translator        OpenAI format
   format

CCLM consists of three parts:

  1. CCLM Wrapper (cclm command) - Manages setup and launches components
  2. Proxy Server - Translates between Anthropic API and LM Studio (OpenAI) API
  3. Claude Code CLI - Official Anthropic CLI (installed as dependency)
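To make the translation step concrete, here is a minimal sketch of converting an Anthropic-style Messages request into the OpenAI chat-completions shape that LM Studio serves. This is not the actual CCLM proxy source: the field names follow the two public API formats, and the model substitution is an illustrative assumption.

```javascript
// Sketch of the request translation a proxy like CCLM's performs
// (illustrative, not the actual CCLM source).
function anthropicToOpenAI(body, lmStudioModel) {
  const messages = [];
  // Anthropic carries the system prompt as a top-level field;
  // OpenAI expects it as the first chat message.
  if (body.system) {
    messages.push({ role: 'system', content: body.system });
  }
  for (const msg of body.messages) {
    // Anthropic content may be an array of blocks; flatten the text blocks.
    const content = Array.isArray(msg.content)
      ? msg.content.filter((b) => b.type === 'text').map((b) => b.text).join('\n')
      : msg.content;
    messages.push({ role: msg.role, content });
  }
  return {
    model: lmStudioModel, // substitute the locally loaded model name
    messages,
    max_tokens: body.max_tokens,
    stream: Boolean(body.stream),
  };
}
```

A response translator does the same mapping in reverse, so the official CLI never notices it is talking to a local model.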

Features in Detail

Full Claude Code Compatibility

All official Claude Code features work:

  • ✅ File operations (Read, Write, Edit)
  • ✅ Search (Glob, Grep)
  • ✅ Terminal commands (Bash)
  • ✅ Task management
  • ✅ Agents and plugins
  • ✅ Streaming responses
  • ✅ Function/tool calling
  • ✅ MCP servers
  • ✅ Slash commands

Auto Proxy Management

The proxy server starts automatically when needed:

# First run - proxy starts automatically
$ cclm

[1/3] Checking LM Studio... ✓
[2/3] Checking proxy server... starting...
      ✓ Proxy server started
[3/3] Starting Claude Code... ✓

On subsequent runs, if the proxy is still running:

$ cclm

[1/3] Checking LM Studio... ✓
[2/3] Checking proxy server... ✓
[3/3] Starting Claude Code... ✓

Configuration Persistence

Your configuration is saved to ~/.cclm/config.json and persists across package updates.

Examples

Use Specific Model

# Configure once
cclm config set lmStudioModel llama-3.1-70b

# Then just run
cclm

Debug Mode

# Enable debug logging
cclm config set debug true

# Run with debug output
cclm

Custom Port

# If LM Studio uses a different port
cclm config set lmStudioUrl http://localhost:8080/v1

# Or if you want proxy on different port
cclm config set proxyPort 8000

Working Directory

CCLM works from any directory:

# Navigate to your project
cd ~/projects/my-app

# Run cclm
cclm

# Claude Code operates in current directory

Troubleshooting

LM Studio Not Detected

Error: LM Studio is not running or not accessible

Solution:

  1. Verify LM Studio is running
  2. Check that local server is enabled
  3. Test manually: curl http://localhost:1234/v1/models
  4. If using different port: cclm config set lmStudioUrl http://localhost:YOUR_PORT/v1

Proxy Won't Start

Error: Failed to start proxy server

Solution:

  1. Check if port 3000 is available: netstat -ano | findstr :3000 (Windows) or lsof -ti:3000 (Mac/Linux)
  2. Use different port: cclm config set proxyPort 3001
  3. Enable debug: cclm config set debug true and check logs

Command Not Found

Error: cclm: command not found

Solution:

# Verify installation
npm list -g cclm

# Reinstall globally
npm install -g cclm

# Or link locally
cd cclm
npm link

Model Not Responding

Solution:

  1. Verify the model name matches LM Studio: cclm config set lmStudioModel your-exact-model-name
  2. Check the LM Studio logs for errors
  3. Try a smaller model or a longer timeout
  4. Enable debug mode to see requests: cclm config set debug true

Advanced Usage

Use with Different LLM Providers

CCLM works with any OpenAI-compatible API:

# Ollama
cclm config set lmStudioUrl http://localhost:11434/v1

# LocalAI
cclm config set lmStudioUrl http://localhost:8080/v1

# Text Generation WebUI
cclm config set lmStudioUrl http://localhost:5000/v1

Programmatic Usage

import { spawn } from 'child_process';

// Launch cclm as a child process, passing arguments through to Claude Code.
// stdio: 'inherit' wires the interactive session to the current terminal.
const cclm = spawn('cclm', ['--model', 'my-model'], {
  env: process.env,
  stdio: 'inherit'
});

// Propagate the child's exit code to the parent process.
cclm.on('exit', (code) => process.exit(code ?? 0));

Multiple Configurations

# Save current config
cp ~/.cclm/config.json ~/.cclm/config-granite.json

# Switch configs
cp ~/.cclm/config-llama.json ~/.cclm/config.json
cclm

Comparison with Alternatives

| Feature | CCLM | Custom CLI | API Direct |
|---------|------|------------|------------|
| Official CLI | ✅ | ❌ | ❌ |
| Full Features | ✅ | ⚠️ Limited | ❌ |
| Auto Updates | ✅ | ❌ | ❌ |
| Easy Setup | ✅ | ⚠️ Medium | ❌ |
| LM Studio | ✅ | ✅ | ✅ |
| Global Command | ✅ | ⚠️ Manual | N/A |

Performance

Typical performance with Granite 4.0 H 1B (1M context):

  • Startup: ~3-5 seconds
  • First response: ~2-5 seconds
  • Streaming: Real-time, word-by-word
  • Tool execution: <100ms proxy overhead
  • Memory: ~50MB (proxy) + ~2GB (LM Studio model)

Security & Privacy

  • ✅ Runs 100% locally - no data leaves your machine
  • ✅ No telemetry or tracking
  • ✅ Proxy only accessible on localhost
  • ✅ Dummy API key (not used externally)

Development

Build from Source

git clone <repository>
cd cclm-package
npm install
npm link

Run Tests

npm test

Debug

# Enable debug mode
cclm config set debug true

# Or set environment variable
DEBUG=true cclm

Changelog

1.0.0 (Initial Release)

  • Official Claude Code CLI integration
  • LM Studio proxy server
  • Auto proxy management
  • Configuration system
  • Global command installation

License

MIT

Credits

  • Claude Code - Official CLI by Anthropic
  • LM Studio - Local LLM hosting platform
  • CCLM - Integration layer and proxy server

Support

For issues, questions, or contributions:

  1. Check this README
  2. Enable debug mode: cclm config set debug true
  3. Review proxy logs
  4. Check LM Studio server status

Enjoy using Claude Code with LM Studio! 🚀