cckit
v0.3.6
Code Kit for Claude Model Switching - Support 智谱LLM, MiniMax, Kimi, Kuaishou StreamLake, ZenMux.ai
cckit (Node.js Edition)
Code Kit for Claude Model Switching - Support 智谱 Coding Plan, MiniMax Coding Plan, Kimi Coding Plan, 快手 StreamLake Coding Plan, 火山引擎 Coding Plan, 京东云 Coding Plan, 百度云 Coding Plan, 小米 Token Plan, ZenMux.ai, and official Claude
A CLI tool for managing and switching between different Claude model providers and their configurations. Features support for model gateways like ZenMux.ai and Kuaishou StreamLake that provide unified access to multiple AI providers.
Features
- Multiple Provider Support: Configure and switch between 智谱 Coding Plan, MiniMax Coding Plan, Kimi Coding Plan, 快手 StreamLake Coding Plan, 火山引擎 Coding Plan, 阿里云 Coding Plan, 腾讯云 Coding Plan, 讯飞 Coding Plan, 京东云 Coding Plan, 百度云 Coding Plan, 小米 Token Plan, ZenMux.ai, official Claude, and custom providers
- Multi-Tool Support: Manage providers for Claude Code, OpenCode, and Codex CLI
- Easy Configuration: Simple command-line interface for managing API keys and settings
- Claude Integration: Automatically updates Claude Code configuration files
- Export/Import: Backup and restore your provider configurations
- Connection Testing: Verify provider connectivity before switching
- Multi-Language Support: English and Chinese localization
Installation
# npm
npm install -g cckit
# pnpm
pnpm add -g cckit
# yarn
yarn global add cckit
Quick Start (Interactive Mode)
# Start interactive configuration wizard (recommended for first-time users)
cckit interactive
Interactive Switch Mode
cckit switch supports interactive mode with search when no provider argument is given:
# Interactive mode - search and select from list
cckit switch
# Direct mode - with provider argument
cckit switch zhipu
Interactive switch supports keyword search to quickly filter providers:
? 搜索或选择 Provider >> mini
MiniMax Coding Plan - MiniMax-M2.5
阿里云 Coding Plan - MiniMax-M2.5
Interactive Mode Details
The interactive wizard (cckit interactive) provides a user-friendly, step-by-step configuration experience with search and arrow key selection:
Step-by-Step Guide
▸ Step 1: Select the AI provider (选择要配置的 AI 提供商)
Type to search or use ↑/↓ arrow keys to select from:
- Preset Providers: 智谱 Coding Plan, MiniMax Coding Plan, Kimi Coding Plan, ZenMux, 快手 StreamLake Coding Plan, 火山引擎 Coding Plan, 阿里云 Coding Plan, 腾讯云 Coding Plan, 讯飞 Coding Plan, 京东云 Coding Plan, 百度云 Coding Plan, 小米 Token Plan, Claude
- Custom Provider: For local LLMs (Ollama, LocalAI, LM Studio) or any OpenAI-compatible API
If existing providers of the same type are found, you'll be asked whether to edit an existing one or create a new one. For custom providers, existing custom providers are listed for selection.
▸ Step 2: Configure / edit the provider (配置 / 编辑)
- Enter your API key
- Editing existing: the current key is shown masked; press Enter to keep it unchanged
- For custom providers: enter the base URL (e.g., http://localhost:11434/v1)
- Editing existing: the current value is pre-filled; press Enter to keep it
▸ Step 3: Set the model (设置模型)
- Enter the model name (a default is suggested based on the provider)
- Multiple models can be entered, separated by commas
- Editing existing: the current model list is pre-filled; press Enter to keep it
▸ Step 4: Configure model tiers, optional (配置模型类型)
- Optionally configure different models for the Sonnet/Haiku/Opus model tiers
- Any tier can be skipped to use the default model
- This sets ANTHROPIC_DEFAULT_SONNET_MODEL, ANTHROPIC_DEFAULT_HAIKU_MODEL, and ANTHROPIC_DEFAULT_OPUS_MODEL
▸ Step 5: Save the configuration (保存配置)
- The configuration is saved to ~/.cckit/config.json
▸ Step 6: Test the connection (测试连接)
- Optional: test the connection to verify the API key and endpoint
▸ Step 7: Switch provider (切换 Provider)
- Optional: switch to the newly configured provider immediately
Features
- Search & Filter: Type keywords to quickly find providers in both interactive and switch modes
- Arrow Key Navigation: Easy selection with keyboard
- Smart Defaults: Pre-filled model names based on provider
- Edit Existing Providers: Detects existing configs early, pre-fills all fields with current values, press Enter to skip any field
- Custom Provider Selection: Lists existing custom providers for quick editing
- Connection Testing: Verify setup before switching
- Immediate Switch: Option to activate the provider right away
Example Session (Create New Provider)
$ cckit interactive
___ ___ / ___ ( ) __ ___
// ) ) // ) ) //\ \ / / / /
// // // \ \ / / / /
((____ ((____ // \ \ / / / /
交互式配置向导
▸ Step 1: 选择要配置的 AI 提供商
? 搜索或选择提供商 >> zhipu
智谱 Coding Plan (GLM)
▸ Step 2: 配置 智谱 Coding Plan
? 请输入 API Key: [your-api-key]
▸ Step 3: 设置模型
? 请输入模型名称 (多个模型用逗号分隔): glm-5, kimi-k2.5, MiniMax-M2.5
▸ Step 4: 配置模型类型 (可选)
提示: 可为 Sonnet/Haiku/Opus 模型类型配置不同的模型,不配置则使用默认模型
? 是否为不同模型类型 (Sonnet/Haiku/Opus) 配置不同的默认模型? Yes
? 选择 Sonnet 模型
❯ 跳过 (使用默认模型)
glm-5
kimi-k2.5
MiniMax-M2.5
? 选择 Haiku 模型
❯ 跳过 (使用默认模型)
glm-5
kimi-k2.5
MiniMax-M2.5
? 选择 Opus 模型
❯ 跳过 (使用默认模型)
glm-5
kimi-k2.5
MiniMax-M2.5
模型类型配置:
Sonnet: glm-5
Haiku: kimi-k2.5
Opus: MiniMax-M2.5
▸ Step 5: 保存配置
✓ 配置已保存 (id: zhipu_xxx)
Provider: 智谱 Coding Plan
Model: glm-5, kimi-k2.5, MiniMax-M2.5
▸ Step 6: 测试连接
? 是否测试连接? (Y/n)
✓ 连接成功!
响应时间: 234ms
▸ Step 7: 切换 Provider
? 是否立即切换到该 Provider? (Y/n)
✓ 已切换到 智谱 Coding Plan
请重启终端或新开会话以使配置生效
___ ___ / ___ ( ) __ ___
// ) ) // ) ) //\ \ / / / /
// // // \ \ / / / /
((____ ((____ // \ \ / / / /
配置完成! ✓
Example Session (Edit Existing Provider)
$ cckit interactive
▸ Step 1: 选择要配置的 AI 提供商
? 搜索或选择提供商 >> zhipu
智谱 Coding Plan (GLM)
# Detects existing configuration, asks to edit or create new
? 发现已有的 智谱 Coding Plan 配置
智谱 Coding Plan (模型: glm-5)
❯ 创建新的 智谱 Coding Plan 配置
▸ Step 2: 编辑 智谱 Coding Plan
提示: 按 Enter 可保持当前值不变
? 请输入 API Key (按 Enter 保持不变,当前: sk-1234***): ← press Enter to skip
▸ Step 3: 设置模型
? 请输入模型名称 (多个模型用逗号分隔): glm-5, kimi-k2.5, MiniMax-M2.5 ← pre-filled, press Enter to keep
▸ Step 5: 保存配置
✓ 配置已保存 (id: zhipu_xxx)
The interactive wizard will guide you through:
- Searching/selecting an AI provider (supports keyword filtering)
- If existing provider found: choose to edit or create new
- Entering your API key (pre-filled when editing, press Enter to skip)
- Setting up the model name(s) (pre-filled when editing)
- (Optional) Configuring different models for Sonnet/Haiku/Opus tiers
- Testing the connection
- Switching to the configured provider
Usage
Basic Commands
# Show help
cckit --help
# List all configured providers
cckit list
# Show current active provider
cckit current
cckit current --tool opencode # for specific tool
cckit current --tool codex
# Configure a provider
cckit configure <provider> --api-key <key> [--base-url <url>] [--model <model>] ...
# Configure a custom provider (local LLM, OpenAI-compatible API, etc.)
cckit configure my-ollama --api-key "ollama-key" --base-url "http://localhost:11434/v1" --model "llama3"
# Switch to a provider (direct mode with argument)
cckit switch <provider>
cckit switch <provider> --tool opencode # for OpenCode
cckit switch <provider> --tool codex # for Codex
# Switch to a provider (interactive mode without argument)
cckit switch
# Show provider configuration
cckit show <provider>
# Test provider connection
cckit test <provider>
# List models for a provider
cckit models <provider>
# Set active model for a provider
cckit set-model <provider> <model>
# Set model for a specific model tier (Sonnet/Haiku/Opus)
cckit set-model <provider> <model> --type sonnet
cckit set-model <provider> <model> --type haiku
cckit set-model <provider> <model> --type opus
# Remove a model from a provider
cckit remove-model <provider> <model>
Supported Providers
智谱 Coding Plan - https://www.bigmodel.cn/claude-code?ic=AFDPNDPWIF
# Configure 智谱 Coding Plan with a single model
cckit configure zhipu --api-key "your-api-key" --model "GLM-4.7"
# Configure with multiple models
cckit configure zhipu --api-key "your-api-key" --model "GLM-4.7" --model "GLM-4-Plus"
# List models
cckit models zhipu
# Set active model
cckit set-model zhipu "GLM-4-Plus"
# Set model for specific tier
cckit set-model zhipu "GLM-4-Plus" --type sonnet
cckit set-model zhipu "GLM-4.7" --type haiku
# Remove a model
cckit remove-model zhipu "GLM-4.7"
# Switch to 智谱 Coding Plan
cckit switch zhipu
MiniMax Coding Plan - https://platform.minimaxi.com/subscribe/coding-plan
# Configure MiniMax Coding Plan with multiple models
cckit configure minimax --api-key "your-api-key" --model "MiniMax-M2" --model "MiniMax-M4"
# List models
cckit models minimax
# Set active model
cckit set-model minimax "MiniMax-M4"
# Set model for specific tier
cckit set-model minimax "MiniMax-M4" --type sonnet
# Remove a model
cckit remove-model minimax "MiniMax-M2"
# Switch to MiniMax Coding Plan
cckit switch minimax
Kimi Coding Plan - https://www.kimi.com/coding/docs/
# Configure Kimi Coding Plan with multiple models
cckit configure kimi --api-key "your-api-key" --model "kimi-for-coding" --model "kimi-plus"
# List models
cckit models kimi
# Set active model
cckit set-model kimi "kimi-plus"
# Switch to Kimi Coding Plan
cckit switch kimi
ZenMux.ai - https://zenmux.ai/invite/ZAEJCE
# Configure ZenMux.ai (Anthropic Compatible API Gateway) with multiple models
cckit configure zenmux --api-key "your-zenmux-api-key" --model "claude-3-5-sonnet-20241022" --model "claude-3-opus-20250219"
# List models
cckit models zenmux
# Set active model
cckit set-model zenmux "claude-3-opus-20250219"
# Switch to ZenMux.ai
cckit switch zenmux
Claude (Official) - https://console.anthropic.com
# Configure Claude (official) with multiple models
cckit configure claude --api-key "sk-ant-api03-your-key" --model "claude-3-5-sonnet-20241022" --model "claude-3-opus-20250219"
# List models
cckit models claude
# Set active model
cckit set-model claude "claude-3-opus-20250219"
# Switch to Claude
cckit switch claude
快手 StreamLake Coding Plan - 快手万擎引擎
# Configure 快手 StreamLake Coding Plan with multiple models
cckit configure streamlake --api-key "your-streamlake-api-key" --model "kat-coder-pro-v1" --model "claude-3-opus-20250219"
# List models
cckit models streamlake
# Set active model
cckit set-model streamlake "claude-3-opus-20250219"
# Switch to 快手 StreamLake Coding Plan
cckit switch streamlake
火山引擎 Coding Plan - 火山方舟
# Configure 火山引擎 Coding Plan with multiple models
cckit configure volcengine --api-key "your-ark-api-key" --model "ark-code-latest" --model "ark-code-pro"
# List models
cckit models volcengine
# Set active model
cckit set-model volcengine "ark-code-pro"
# Switch to 火山引擎 Coding Plan
cckit switch volcengine
Aliyun 阿里云 Coding Plan - https://www.aliyun.com/benefit/scene/codingplan
# Configure Aliyun Coding Plan with multiple models
cckit configure aliyun --api-key "your-api-key" --model "qwen3.5-plus" --model "kimi-k2.5" --model "glm-5" --model "MiniMax-M2.5"
# List models
cckit models aliyun
# Set active model
cckit set-model aliyun "glm-5"
# Set model for specific tier
cckit set-model aliyun "glm-5" --type sonnet
cckit set-model aliyun "kimi-k2.5" --type haiku
cckit set-model aliyun "MiniMax-M2.5" --type opus
# Switch to Aliyun
cckit switch aliyun
Tencent [腾讯云 Coding Plan]
# Configure Tencent Coding Plan with multiple models
cckit configure tencent --api-key "your-api-key" --model "GLM-5" --model "MiniMax-M2.5" --model "Kimi-K2.5"
# List models
cckit models tencent
# Set active model
cckit set-model tencent "MiniMax-M2.5"
# Switch to Tencent
cckit switch tencent
XFYun [讯飞 Coding Plan]
# Configure XFYun Coding Plan with multiple models
cckit configure xfyun --api-key "your-api-key" --model "GLM-5" --model "MiniMax-M2.5" --model "Kimi-K2.5"
# List models
cckit models xfyun
# Set active model
cckit set-model xfyun "MiniMax-M2.5"
# Switch to XFYun
cckit switch xfyun
JDCloud [京东云 Coding Plan]
京东云 Coding Plan supports a range of mainstream large models, including GLM-5, GLM-4.7, Kimi-K2.5, and Qwen3-Coder.
# Configure JDCloud Coding Plan with multiple models
cckit configure jdcloud --api-key "your-api-key" --model "GLM-5" --model "GLM-4.7" --model "Kimi-K2.5" --model "Qwen3-Coder"
# Or use Chinese alias
cckit configure 京东云 --api-key "your-api-key" --model "GLM-5"
# List models
cckit models jdcloud
# Set active model
cckit set-model jdcloud "Kimi-K2.5"
# Set model for specific tier
cckit set-model jdcloud "GLM-5" --type sonnet
cckit set-model jdcloud "Qwen3-Coder" --type haiku
cckit set-model jdcloud "GLM-4.7" --type opus
# Switch to JDCloud
cckit switch jdcloud
BDCloud [百度云 Coding Plan]
百度云 Coding Plan supports a range of mainstream large models, including GLM-5, Kimi-K2.5, and MiniMax-M2.5.
# Configure BDCloud Coding Plan with multiple models
cckit configure bdcloud --api-key "your-api-key" --model "GLM-5" --model "Kimi-K2.5" --model "MiniMax-M2.5"
# Or use Chinese alias
cckit configure 百度云 --api-key "your-api-key" --model "GLM-5"
# List models
cckit models bdcloud
# Set active model
cckit set-model bdcloud "Kimi-K2.5"
# Set model for specific tier
cckit set-model bdcloud "GLM-5" --type sonnet
cckit set-model bdcloud "MiniMax-M2.5" --type haiku
# Switch to BDCloud
cckit switch bdcloud
Xiaomi [小米 Token Plan]
小米 Token Plan provides the mimo model family, including mimo-v2-pro, mimo-v2-omni, and mimo-v2-flash.
# Configure Xiaomi Token Plan with multiple models
cckit configure xiaomi --api-key "your-api-key" --model "mimo-v2-pro" --model "mimo-v2-omni" --model "mimo-v2-flash"
# Or use Chinese alias
cckit configure 小米 --api-key "your-api-key" --model "mimo-v2-pro"
# List models
cckit models xiaomi
# Set active model
cckit set-model xiaomi "mimo-v2-omni"
# Set model for specific tier
cckit set-model xiaomi "mimo-v2-pro" --type sonnet
cckit set-model xiaomi "mimo-v2-flash" --type haiku
# Switch to Xiaomi
cckit switch xiaomi
Custom Provider - Local / Opaque / OpenAI-Compatible APIs
Configure custom providers for local LLMs (Ollama, LocalAI, LM Studio), opaque API endpoints, or any Anthropic-compatible API:
# Configure a local Ollama instance
cckit configure ollama --api-key "ollama-key" --base-url "http://localhost:11434/v1" --model "llama3"
# Configure using the 'custom:' prefix (explicit custom provider)
cckit configure custom:my-ollama --api-key "ollama-key" --base-url "http://localhost:11434/v1" --model "llama3"
# Configure for OpenAI compatible API (e.g., via proxy)
cckit configure openwebui --api-key "your-token" --base-url "http://localhost:8080/v1" --model "anthropic/claude-3-haiku"
# List models
cckit models ollama
# Set active model
cckit set-model ollama "llama3"
# Switch to custom provider
cckit switch ollama
Custom providers support:
- Local LLM servers (Ollama, LocalAI, LM Studio, llama.cpp)
- OpenAI-compatible API endpoints
- Any Anthropic-compatible gateway or proxy
- Opaque/enterprise API endpoints
The custom provider uses provider_name as the CC_PROVIDER environment variable value, allowing you to distinguish between multiple custom providers.
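As an illustration, a wrapper script could branch on this variable to tell custom providers apart (a minimal sketch; `my-ollama` is a hypothetical provider_name, and in practice CC_PROVIDER is set by cckit in the tool's settings rather than exported by hand):

```shell
# CC_PROVIDER is written into the tool's settings by cckit;
# set it here only to simulate that environment.
CC_PROVIDER="my-ollama"

# Branch on the active provider name.
case "$CC_PROVIDER" in
  my-ollama) msg="Using local Ollama instance" ;;
  *)         msg="Active provider: $CC_PROVIDER" ;;
esac

echo "$msg"
```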
Multi-Tool Support
cckit can manage provider configurations for several AI coding tools:
Supported Tools
| Tool | Config Path | Description |
|------|-------------|-------------|
| claude-code | ~/.claude/settings.json | Claude Code CLI (default) |
| opencode | ~/.opencode.json | OpenCode CLI |
| codex | ~/.codex/config.toml | OpenAI Codex CLI |
Switching Providers for Different Tools
# Switch the provider for Claude Code (default)
cckit switch zhipu
cckit switch zhipu --tool claude-code
# Switch the provider for OpenCode
cckit switch zhipu --tool opencode
# Switch the provider for Codex
cckit switch zhipu --tool codex
# Interactively choose the tool
cckit switch
Viewing Current Provider
# Show the current provider for Claude Code
cckit current
cckit current --tool claude-code
# Show the current provider for OpenCode
cckit current --tool opencode
# Show the current provider for Codex
cckit current --tool codex
Tool-Specific Configuration
Each tool maintains its own provider state, so different tools can be pointed at different providers:
# Claude Code uses 智谱
cckit switch zhipu --tool claude-code
# OpenCode uses 阿里云
cckit switch aliyun --tool opencode
# Codex uses MiniMax
cckit switch minimax --tool codex
Codex Notes
Codex CLI uses the OpenAI Responses API format. Keep the following in mind when switching:
Model support: make sure the selected model supports the OpenAI Responses API.
Configuration file format (~/.codex/config.toml):
model = "glm-5"
model_provider = "cckit-zhipu"

[model_providers.cckit-zhipu]
name = "智谱 Coding Plan"
base_url = "https://open.bigmodel.cn/api/paas/v4"
wire_api = "responses"
requires_openai_auth = false
experimental_bearer_token = "your-api-key"
OpenCode Notes
OpenCode 配置存储在 ~/.opencode.json:
{
"providers": {
"cckit-zhipu": {
"apiKey": "your-api-key",
"disabled": false,
"baseUrl": "https://open.bigmodel.cn/api/anthropic"
}
},
"agents": {
"coder": { "model": "glm-5" },
"task": { "model": "glm-5" }
}
}
Advanced Commands
# Export configuration to file
cckit export --output backup.json
# Import configuration from file
cckit import backup.json
# Reset to default Claude configuration
cckit reset
# Interactive configuration wizard (new!)
cckit interactive
Configuration
The CLI stores configurations in ~/.cckit/config.json and updates Claude's settings in ~/.claude/settings.json.
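For orientation, a provider entry in ~/.cckit/config.json might look roughly like the sketch below. This is illustrative only; the field names shown are assumptions, not the tool's documented schema:

```json
{
  "providers": [
    {
      "id": "zhipu_xxx",
      "type": "zhipu",
      "name": "智谱 Coding Plan",
      "apiKey": "your-api-key",
      "baseUrl": "https://open.bigmodel.cn/api/anthropic",
      "models": ["glm-5", "kimi-k2.5"],
      "activeModel": "glm-5"
    }
  ],
  "activeProvider": "zhipu_xxx"
}
```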
Multi-Model Support
Each provider can be configured with multiple models. You can:
- Configure multiple models during setup: cckit configure <provider> --model <model1> --model <model2>
- Add models to existing providers: cckit configure <provider> --model <new-model>
- List available models: cckit models <provider>
- Switch the active model: cckit set-model <provider> <model-name>
The active model is used when Claude Code switches to that provider. If no explicit active model is set, the first configured model is used.
Model Tier Configuration
Claude Code supports different model tiers (Sonnet, Haiku, Opus) for different use cases. You can configure different models for each tier:
Command Line
# Set default model (ANTHROPIC_MODEL)
cckit set-model aliyun "glm-5"
# Set Sonnet model (ANTHROPIC_DEFAULT_SONNET_MODEL)
cckit set-model aliyun "glm-5" --type sonnet
# Set Haiku model (ANTHROPIC_DEFAULT_HAIKU_MODEL)
cckit set-model aliyun "kimi-k2.5" --type haiku
# Set Opus model (ANTHROPIC_DEFAULT_OPUS_MODEL)
cckit set-model aliyun "MiniMax-M2.5" --type opus
Interactive Mode
In interactive mode (cckit interactive), after entering model names, you'll be asked:
? 是否为不同模型类型 (Sonnet/Haiku/Opus) 配置不同的默认模型?
Select "Yes" to configure each model tier. You can choose "跳过 (使用默认模型)" for any tier you don't want to customize.
View Configuration
# Show provider configuration including model tier settings
cckit show aliyun
Output:
Configuration: 阿里云 Coding Plan
ID: bf76bc20-87f1-4bd4-b091-20b9d1eabdb9
Name: Aliyun Coding Plan
Type: aliyun
...
Models: qwen3.5-plus, kimi-k2.5, glm-5, MiniMax-M2.5
Active: glm-5
Sonnet: glm-5
Haiku: kimi-k2.5
Opus: MiniMax-M2.5
Validation
The model must be in the provider's model list. If you try to set a model that doesn't exist:
$ cckit set-model aliyun "non-existent-model" --type sonnet
Error: Model 'non-existent-model' not found in provider. Available models: qwen3.5-plus, kimi-k2.5, glm-5, MiniMax-M2.5
Provider Details
| Provider | Default Model | Auth Method | Capabilities | Default Base URL |
|----------|---------------|-------------|--------------|------------------|
| 智谱 Coding Plan | GLM-4.7 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support | https://open.bigmodel.cn/api/anthropic |
| MiniMax Coding Plan | MiniMax-M2 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Multi-language | https://api.minimaxi.com/anthropic |
| Kimi Coding Plan | kimi-for-coding | ANTHROPIC_API_KEY | Chat, Long Context, Code Generation | https://api.kimi.com/coding/ |
| ZenMux.ai | claude-3-5-sonnet-20241022 | ANTHROPIC_API_KEY | Model Gateway, Multi-provider, Claude Compatible | https://zenmux.ai/api/anthropic |
| 快手 StreamLake Coding Plan | kat-coder-pro-v1 | ANTHROPIC_API_KEY | Chat, Code Generation, Chinese Support, Video Understanding | https://wanqing.streamlakeapi.com/api/gateway/v1/endpoints/kat-coder-pro-v1/claude-code-proxy |
| 火山引擎 Coding Plan | ark-code-latest | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support | https://ark.cn-beijing.volces.com/api/coding |
| 阿里云 Coding Plan | claude-sonnet-4-20250514 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://coding.dashscope.aliyuncs.com/apps/anthropic |
| 腾讯云 Coding Plan | GLM-5 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://api.lkeap.cloud.tencent.com/coding/anthropic |
| 讯飞 Coding Plan | GLM-5 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://maas-coding-api.cn-huabei-1.xf-yun.com/anthropic |
| 京东云 Coding Plan | claude-sonnet-4-20250514 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://modelservice.jdcloud.com/coding/anthropic |
| 百度云 Coding Plan | claude-sonnet-4-20250514 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://qianfan.baidubce.com/anthropic/coding |
| 小米 Token Plan | mimo-v2-pro | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://token-plan-cn.xiaomimimo.com/anthropic |
| Claude | claude-3-5-sonnet-20241022 | ANTHROPIC_API_KEY | Chat, Code Generation, Analysis, Multimodal | Official |
| Custom | - | ANTHROPIC_API_KEY | Custom, User-defined, Local LLM, Anthropic Compatible API | User-defined |
How It Works
- Configuration Management: Stores provider settings in ~/.cckit/config.json
- Claude Integration: Updates Claude's settings.json with environment variables
- Switching: Changes the active provider and updates Claude's configuration
- Testing: Sends test messages to verify provider connectivity
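To make the integration step concrete: switching a provider results in an env block along these lines in ~/.claude/settings.json. The variable names are the Anthropic ones referenced elsewhere in this README; the exact set of keys cckit writes may differ:

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "https://open.bigmodel.cn/api/anthropic",
    "ANTHROPIC_AUTH_TOKEN": "your-api-key",
    "ANTHROPIC_MODEL": "glm-5",
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "glm-5",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "kimi-k2.5",
    "CC_PROVIDER": "zhipu"
  }
}
```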
Development
# Install dependencies
pnpm install
# Run in development mode (uses ts-node, no compilation needed)
pnpm run dev list
# Build for development (no minification)
pnpm run build
# Build for production (with minification, ~68% smaller)
pnpm run build:prod
# Test the CLI
node dist/index.js list
For more build options and details, see BUILD.md.
Requirements
- Node.js 20 or higher
- pnpm (or npm/yarn)
- Claude Code installed
- Valid API keys for the providers you want to use
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Build and test: pnpm run build
- Submit a pull request
Support
If you encounter any issues or have questions, please open an issue on the GitHub repository.
Differences from Rust Version
The Node.js version maintains feature parity with the Rust version while leveraging Node.js/TypeScript advantages:
- TypeScript: Full type safety and better IDE support
- Cross-platform: Better Windows compatibility out of the box
- Easier distribution: No compilation required for end users
- Same CLI interface: Identical command structure and behavior
- Localization: Same i18n system with English and Chinese support
