@visioncraft3r/claude-code-router-custom
Custom version of Claude Code Router with BMAD-specific features: automatic agent detection, project management UI, and per-agent model assignment

Customised by VisionCraft3r (powered by AI)
🎯 Custom Version Features
This is a custom version of Claude Code Router, tailored specifically for BMAD, with the following additional features:
CLI Model Management
Manage models and providers directly from the terminal with ccr model:
ccr model
This command provides an interactive interface to:
- View current configuration
- See all configured models (default, background, think, longContext, webSearch, image)
- Switch models: Quickly change which model is used for each router type
- Add new models: Add models to existing providers
- Create new providers: Set up complete provider configurations including:
  - Provider name and API endpoint
  - API key
  - Available models
  - Transformer configuration with support for:
    - Multiple transformers (openrouter, deepseek, gemini, etc.)
    - Transformer options (e.g., maxtoken with custom limits)
    - Provider-specific routing (e.g., OpenRouter provider preferences)
The CLI tool validates all inputs and provides helpful prompts to guide you through the configuration process, making it easy to manage complex setups without editing JSON files manually.
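For orientation, the kind of provider entry the wizard ends up writing to `~/.claude-code-router/config.json` looks roughly like the sketch below (it mirrors the OpenRouter example under Transformers later in this README; the key and model list are placeholders, not recommendations). Transformer options such as `maxtoken` use the nested-array form shown in that section.

```json
{
  "Providers": [
    {
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
      "api_key": "sk-xxx",
      "models": ["anthropic/claude-3.5-sonnet"],
      "transformer": { "use": ["openrouter"] }
    }
  ]
}
```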
Project Management (BMAD)
The UI includes a powerful Project Management feature that allows you to:
- Add Projects: Scan directories for Claude Code agent files (`.md` files in `.claude/agents/` or `.bmad/bmm/agents/` directories)
- List Projects: View all registered projects with their detected agents
- Assign Models to Agents: Set specific models for each agent through the UI
- Delete Projects: Remove projects and automatically clean up CCR tags from agent files
How it works:
Adding a Project:
- Click "Add Project" in the UI
- Enter the project directory path (e.g., `/Users/username/my-project`)
- CCR automatically scans for agent files and registers them
Agent Detection:
- CCR looks for `.md` files in the `.claude/agents/` and `.bmad/bmm/agents/` directories
- Each agent file is assigned a unique ID and tracked in `~/.claude-code-router/projects.json`
Model Assignment:
- Select an agent from the project list
- Choose a model from the dropdown (or set to "default" to use router defaults)
- The assignment is saved both in `projects.json` and embedded in the agent's `.md` file as `<!-- CCR-AGENT-MODEL: provider,model -->`
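CCR keeps this state in `~/.claude-code-router/projects.json`. The exact schema isn't documented here, so the following is only a hypothetical sketch of what an entry could contain (illustrative field names, not the actual format):

```json
{
  "projects": [
    {
      "path": "/Users/username/my-project",
      "agents": [
        {
          "id": "550e8400-e29b-41d4-a716-446655440000",
          "file": ".bmad/bmm/agents/my-agent.md",
          "model": "deepseek,deepseek-reasoner"
        }
      ]
    }
  ]
}
```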
Automatic Routing:
- When you activate an agent in Claude Code, CCR detects the agent ID from the system prompt
- The router automatically selects the assigned model for that agent
- Model assignments are cached per session for performance
Cleanup:
- When you delete a project, CCR automatically removes all `CCR-AGENT-ID` and `CCR-AGENT-MODEL` tags from the agent files
Example Agent File Structure:
After adding a project and assigning a model, your agent file will look like:
# My Agent
This is my agent persona...
<!-- CCR-AGENT-ID: 550e8400-e29b-41d4-a716-446655440000 -->
<!-- CCR-AGENT-MODEL: deepseek,deepseek-reasoner -->

Model Identity in Responses:
When an agent with an assigned model is active, the model will automatically report its identity at the start of responses:
[CCR: Active Agent: 550e8400-e29b-41d4-a716-446655440000 (deepseek,deepseek-reasoner)]
Your response here...

This makes it easy to verify that the correct model is being used for each agent.
Activate Command (Environment Variables Setup)
The activate command allows you to set up environment variables globally in your shell, enabling you to use the claude command directly or integrate Claude Code Router Custom with applications built using the Agent SDK.
To activate the environment variables, run:
eval "$(ccr activate)"This command outputs the necessary environment variables in shell-friendly format, which are then set in your current shell session. After activation, you can:
- Use the `claude` command directly: Run `claude` commands without needing to use `ccr code`. The `claude` command will automatically route requests through Claude Code Router Custom.
- Integrate with Agent SDK applications: Applications built with the Anthropic Agent SDK will automatically use the configured router and models.
The activate command sets the following environment variables:
- `ANTHROPIC_AUTH_TOKEN`: API key from your configuration
- `ANTHROPIC_BASE_URL`: The local router endpoint (default: `http://127.0.0.1:3456`)
- `NO_PROXY`: Set to `127.0.0.1` to prevent proxy interference
- `DISABLE_TELEMETRY`: Disables telemetry
- `DISABLE_COST_WARNINGS`: Disables cost warnings
- `API_TIMEOUT_MS`: API timeout from your configuration
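These values are read from `~/.claude-code-router/config.json`. As a rough sketch — assuming they correspond to the `APIKEY` and `API_TIMEOUT_MS` options described under Configuration File Structure below, which is an assumption rather than documented behaviour — the relevant part of the config would look like this (placeholder values):

```json
{
  "APIKEY": "your-secret-key",
  "API_TIMEOUT_MS": 600000
}
```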
Note: Make sure the Claude Code Router Custom service is running (`ccr start`) before using the activated environment variables. The environment variables are only valid for the current shell session. To make them persistent, you can add `eval "$(ccr activate)"` to your shell configuration file (e.g., `~/.zshrc` or `~/.bashrc`).
✨ Features
- Model Routing: Route requests to different models based on your needs (e.g., background tasks, thinking, long context).
- Multi-Provider Support: Supports various model providers like OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, and SiliconFlow.
- Request/Response Transformation: Customize requests and responses for different providers using transformers.
- Dynamic Model Switching: Switch models on-the-fly within Claude Code using the `/model` command.
- GitHub Actions Integration: Trigger Claude Code tasks in your GitHub workflows.
- Plugin System: Extend functionality with custom transformers.
🚀 Getting Started
1. Installation
First, ensure you have Claude Code installed:
npm install -g @anthropic-ai/claude-code

Then, install Claude Code Router Custom:
npm install -g @visioncraft3r/claude-code-router-custom

2. Start the Server
Start the Claude Code Router Custom server:
ccr start

3. Set Up Agents Using the UI
The easiest way to configure Claude Code Router Custom is through the web UI, which makes it simple to set up agents and assign models.
Launch the UI
ccr ui

This opens a web-based interface in your browser where you can manage everything visually.

Set Up Your Projects and Agents
Add Your Project:
- Click "Add Project" in the UI
- Enter your project directory path (e.g., `/Users/username/my-project`)
- The UI automatically scans for agent files in `.claude/agents/` or `.bmad/bmm/agents/` directories
Configure Providers and Models:
- Go to the "Providers" section
- Add your API providers (OpenRouter, DeepSeek, etc.) with their API keys
- Add models for each provider
Assign Models to Agents:
- Select a project from your project list
- Choose an agent
- Assign a specific model from the dropdown
- The model assignment is automatically saved and will be used when that agent is active
Configure Router Settings:
- Set default models for different scenarios (background, thinking, long context, etc.)
- All changes are saved automatically
Start Using Claude Code
Once your agents are set up, start Claude Code with the router:
ccr code

When you activate an agent in Claude Code, it will automatically use the model you assigned in the UI. The active agent and model are displayed at the start of each response for verification.
Note: After making changes in the UI, you may need to restart the service:
ccr restart
4. Advanced Configuration
For advanced users who prefer manual configuration, you can edit the ~/.claude-code-router/config.json file directly. For more details, you can refer to config.example.json.
Configuration File Structure
The config.json file has several key sections:
- `PROXY_URL` (optional): You can set a proxy for API requests, for example: `"PROXY_URL": "http://127.0.0.1:7890"`.
- `LOG` (optional): You can enable logging by setting it to `true`. When set to `false`, no log files will be created. Default is `true`.
- `LOG_LEVEL` (optional): Set the logging level. Available options are: `"fatal"`, `"error"`, `"warn"`, `"info"`, `"debug"`, `"trace"`. Default is `"debug"`.
- Logging Systems: Claude Code Router Custom uses two separate logging systems:
  - Server-level logs: HTTP requests, API calls, and server events are logged using pino in the `~/.claude-code-router/logs/` directory with filenames like `ccr-*.log`
  - Application-level logs: Routing decisions and business logic events are logged in `~/.claude-code-router/claude-code-router.log`
- `APIKEY` (optional): You can set a secret key to authenticate requests. When set, clients must provide this key in the `Authorization` header (e.g., `Bearer your-secret-key`) or the `x-api-key` header. Example: `"APIKEY": "your-secret-key"`.
- `HOST` (optional): You can set the host address for the server. If `APIKEY` is not set, the host will be forced to `127.0.0.1` for security reasons to prevent unauthorized access. Example: `"HOST": "0.0.0.0"`.
- `NON_INTERACTIVE_MODE` (optional): When set to `true`, enables compatibility with non-interactive environments like GitHub Actions, Docker containers, or other CI/CD systems. This sets appropriate environment variables (`CI=true`, `FORCE_COLOR=0`, etc.) and configures stdin handling to prevent the process from hanging in automated environments. Example: `"NON_INTERACTIVE_MODE": true`.
- `Providers`: Used to configure different model providers.
- `Router`: Used to set up routing rules. `default` specifies the default model, which will be used for all requests if no other route is configured.
- `API_TIMEOUT_MS`: Specifies the timeout for API calls in milliseconds.
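Putting these options together, a minimal `config.json` might look like the sketch below; the values are placeholders, and `config.example.json` remains the authoritative reference.

```json
{
  "LOG": true,
  "LOG_LEVEL": "debug",
  "APIKEY": "your-secret-key",
  "HOST": "127.0.0.1",
  "API_TIMEOUT_MS": 600000,
  "NON_INTERACTIVE_MODE": false,
  "Providers": [
    {
      "name": "deepseek",
      "api_base_url": "https://api.deepseek.com/chat/completions",
      "api_key": "sk-xxx",
      "models": ["deepseek-chat", "deepseek-reasoner"],
      "transformer": { "use": ["deepseek"] }
    }
  ],
  "Router": {
    "default": "deepseek,deepseek-chat"
  }
}
```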
Environment Variable Interpolation
Claude Code Router Custom supports environment variable interpolation for secure API key management. You can reference environment variables in your config.json using either $VAR_NAME or ${VAR_NAME} syntax:
{
"OPENAI_API_KEY": "$OPENAI_API_KEY",
"GEMINI_API_KEY": "${GEMINI_API_KEY}",
"Providers": [
{
"name": "openai",
"api_base_url": "https://api.openai.com/v1/chat/completions",
"api_key": "$OPENAI_API_KEY",
"models": ["gpt-5", "gpt-5-mini"]
}
]
}

This allows you to keep sensitive API keys in environment variables instead of hardcoding them in configuration files. The interpolation works recursively through nested objects and arrays.
Claude Code 2.0+ Compatibility
Claude Code Router Custom is fully compatible with Claude Code 2.0.49 and later versions. Starting from Claude Code 2.0, the --settings flag expects a file path instead of inline JSON. CCR automatically handles this change.
Automatic Version Detection:
By default, CCR detects your Claude Code version and uses the appropriate method:
- Claude Code 2.0+: Creates temporary settings files
- Claude Code 1.x: Uses inline JSON (legacy mode)
- Unknown version: Defaults to file-based approach
Manual Configuration (Optional):
You can override automatic detection in your config.json:
{
"SETTINGS_MODE": "auto",
"CLEANUP_TEMP_SETTINGS": true
}

SETTINGS_MODE Options:
| Value | Description | Use Case |
|-------|-------------|----------|
| "auto" | Automatically detect version (default) | Recommended for all users |
| "file" | Force file-based settings | Claude Code 2.0+, or if auto-detection fails |
| "inline" | Force inline JSON settings | Claude Code 1.x, or testing backward compatibility |
| "env-only" | Skip --settings flag entirely | Troubleshooting, minimal setup |
CLEANUP_TEMP_SETTINGS Options:
| Value | Description |
|-------|-------------|
| true (default) | Automatically delete temp files after execution |
| false | Keep temp files for debugging (in /tmp or %TEMP%) |
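For example, to force file-based settings and keep the temporary settings files around for debugging, you would combine the two options above — a minimal sketch using values straight from the tables:

```json
{
  "SETTINGS_MODE": "file",
  "CLEANUP_TEMP_SETTINGS": false
}
```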
Troubleshooting:
Status line not appearing?
- Check that `StatusLine.enabled: true` in your config
- Verify Claude Code version: `claude --version`
- Try forcing file mode: `"SETTINGS_MODE": "file"`
- Check CCR logs: `tail -f ~/.claude-code-router/claude-code-router.log`
"Failed to create temp settings file" error?
- Ensure `/tmp` (or `%TEMP%` on Windows) is writable
- Check disk space: `df -h /tmp`
- Try env-only mode: `"SETTINGS_MODE": "env-only"`
5. CLI Model Management
Note: This is a custom feature. See the Custom Version Features section above for detailed documentation.
For users who prefer terminal-based workflows, you can use the interactive CLI model selector with ccr model. See the custom features section for complete documentation.
6. Activate Command (Environment Variables Setup)
Note: This is a custom feature. See the Custom Version Features section above for detailed documentation.
The activate command allows you to set up environment variables globally in your shell. See the custom features section for complete documentation.
Providers
The Providers array is where you define the different model providers you want to use. Each provider object requires:
- `name`: A unique name for the provider.
- `api_base_url`: The full API endpoint for chat completions.
- `api_key`: Your API key for the provider.
- `models`: A list of model names available from this provider.
- `transformer` (optional): Specifies transformers to process requests and responses.
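As a further illustration, a local Ollama provider (Ollama is listed under Multi-Provider Support above) might look like the following sketch. The endpoint assumes Ollama's OpenAI-compatible API on its default port, and the model name and key are placeholders:

```json
{
  "name": "ollama",
  "api_base_url": "http://localhost:11434/v1/chat/completions",
  "api_key": "ollama",
  "models": ["qwen2.5-coder:latest"]
}
```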
Transformers
Transformers allow you to modify the request and response payloads to ensure compatibility with different provider APIs.
Global Transformer: Apply a transformer to all models from a provider. In this example, the `openrouter` transformer is applied to all models under the `openrouter` provider.

{
  "name": "openrouter",
  "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
  "api_key": "sk-xxx",
  "models": [
    "google/gemini-2.5-pro-preview",
    "anthropic/claude-sonnet-4",
    "anthropic/claude-3.5-sonnet"
  ],
  "transformer": { "use": ["openrouter"] }
}

Model-Specific Transformer: Apply a transformer to a specific model. In this example, the `deepseek` transformer is applied to all models, and an additional `tooluse` transformer is applied only to the `deepseek-chat` model.

{
  "name": "deepseek",
  "api_base_url": "https://api.deepseek.com/chat/completions",
  "api_key": "sk-xxx",
  "models": ["deepseek-chat", "deepseek-reasoner"],
  "transformer": {
    "use": ["deepseek"],
    "deepseek-chat": { "use": ["tooluse"] }
  }
}

Passing Options to a Transformer: Some transformers, like `maxtoken`, accept options. To pass options, use a nested array where the first element is the transformer name and the second is an options object.

{
  "name": "siliconflow",
  "api_base_url": "https://api.siliconflow.cn/v1/chat/completions",
  "api_key": "sk-xxx",
  "models": ["moonshotai/Kimi-K2-Instruct"],
  "transformer": {
    "use": [
      ["maxtoken", { "max_tokens": 16384 }]
    ]
  }
}
Available Built-in Transformers:
- `Anthropic`: If you use only the `Anthropic` transformer, it will preserve the original request and response parameters (you can use it to connect directly to an Anthropic endpoint).
- `deepseek`: Adapts requests/responses for DeepSeek API.
- `gemini`: Adapts requests/responses for Gemini API.
- `openrouter`: Adapts requests/responses for OpenRouter API. It can also accept a `provider` routing parameter to specify which underlying providers OpenRouter should use. For more details, refer to the OpenRouter documentation. See an example below:

  "transformer": {
    "use": ["openrouter"],
    "moonshotai/kimi-k2": {
      "use": [
        ["openrouter", { "provider": { "only": ["moonshotai/fp8"] } }]
      ]
    }
  }

- `groq`: Adapts requests/responses for groq API.
- `maxtoken`: Sets a specific `max_tokens` value.
- `tooluse`: Optimizes tool usage for certain models via `tool_choice`.
- `gemini-cli` (experimental): Unofficial support for Gemini via Gemini CLI gemini-cli.js.
- `reasoning`: Used to process the `reasoning_content` field.
- `sampling`: Used to process sampling information fields such as `temperature`, `top_p`, `top_k`, and `repetition_penalty`.
- `enhancetool`: Adds a layer of error tolerance to the tool call parameters returned by the LLM (this will cause the tool call information to no longer be streamed).
- `cleancache`: Clears the `cache_control` field from requests.
- `vertex-gemini`: Handles the Gemini API using Vertex authentication.
- `chutes-glm`: Unofficial support for GLM 4.5 model via Chutes chutes-glm-transformer.js.
- `qwen-cli` (experimental): Unofficial support for qwen3-coder-plus model via Qwen CLI qwen-cli.js.
- `rovo-cli` (experimental): Unofficial support for gpt-5 via Atlassian Rovo Dev CLI rovo-cli.js.
Custom Transformers:
You can also create your own transformers and load them via the transformers field in config.json.
{
"transformers": [
{
"path": "/User/xxx/.claude-code-router/plugins/gemini-cli.js",
"options": {
"project": "xxx"
}
}
]
}

Router
The Router object defines which model to use for different scenarios:
- `default`: The default model for general tasks.
- `background`: A model for background tasks. This can be a smaller, local model to save costs.
- `think`: A model for reasoning-heavy tasks, like Plan Mode.
- `longContext`: A model for handling long contexts (e.g., > 60K tokens).
- `longContextThreshold` (optional): The token count threshold for triggering the long context model. Defaults to 60000 if not specified.
- `webSearch`: Used for handling web search tasks; this requires the model itself to support the feature. If you're using openrouter, you need to add the `:online` suffix after the model name.
- `image` (beta): Used for handling image-related tasks (supported by CCR's built-in agent). If the model does not support tool calling, you need to set the `config.forceUseImageAgent` property to `true`.

A sample Router block is sketched below. You can also switch models dynamically in Claude Code with the `/model` command:

/model provider_name,model_name

Example: /model openrouter,anthropic/claude-3.5-sonnet
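For reference, a complete Router block using the fields above might look like this sketch; the provider,model pairs are placeholders built from the provider examples elsewhere in this README, not recommendations:

```json
{
  "Router": {
    "default": "deepseek,deepseek-chat",
    "background": "ollama,qwen2.5-coder:latest",
    "think": "deepseek,deepseek-reasoner",
    "longContext": "openrouter,google/gemini-2.5-pro-preview",
    "longContextThreshold": 60000,
    "webSearch": "openrouter,anthropic/claude-3.5-sonnet:online",
    "image": "openrouter,anthropic/claude-sonnet-4"
  }
}
```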
Custom Router
For more advanced routing logic, you can specify a custom router script via the CUSTOM_ROUTER_PATH in your config.json. This allows you to implement complex routing rules beyond the default scenarios.
In your config.json:
{
"CUSTOM_ROUTER_PATH": "/User/xxx/.claude-code-router/custom-router.js"
}

The custom router file must be a JavaScript module that exports an async function. This function receives the request object and the config object as arguments and should return the provider and model name as a string (e.g., "provider_name,model_name"), or null to fall back to the default router.
Here is an example of a custom-router.js based on custom-router.example.js:
// /User/xxx/.claude-code-router/custom-router.js
/**
* A custom router function to determine which model to use based on the request.
*
* @param {object} req - The request object from Claude Code, containing the request body.
* @param {object} config - The application's config object.
* @returns {Promise<string|null>} - A promise that resolves to the "provider,model_name" string, or null to use the default router.
*/
module.exports = async function router(req, config) {
const userMessage = req.body.messages.find((m) => m.role === "user")?.content;
if (userMessage && userMessage.includes("explain this code")) {
// Use a powerful model for code explanation
return "openrouter,anthropic/claude-3.5-sonnet";
}
// Fallback to the default router configuration
return null;
};

Subagent Routing
For routing within subagents, you must specify a particular provider and model by including <CCR-SUBAGENT-MODEL>provider,model</CCR-SUBAGENT-MODEL> at the beginning of the subagent's prompt. This allows you to direct specific subagent tasks to designated models.
Example:
<CCR-SUBAGENT-MODEL>openrouter,anthropic/claude-3.5-sonnet</CCR-SUBAGENT-MODEL>
Please help me analyze this code snippet for potential optimizations...

Status Line (Beta)
To better monitor the status of claude-code-router at runtime, version v1.0.40 includes a built-in statusline tool, which you can enable in the UI.

The effect is as follows:

🤖 GitHub Actions
Integrate Claude Code Router Custom into your CI/CD pipeline. After setting up Claude Code Actions, modify your .github/workflows/claude.yaml to use the router:
name: Claude Code

on:
  issue_comment:
    types: [created]
  # ... other triggers

jobs:
  claude:
    if: |
      (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
      # ... other conditions
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
      issues: read
      id-token: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1

      - name: Prepare Environment
        run: |
          curl -fsSL https://bun.sh/install | bash
          mkdir -p $HOME/.claude-code-router
          cat << 'EOF' > $HOME/.claude-code-router/config.json
          {
            "log": true,
            "NON_INTERACTIVE_MODE": true,
            "OPENAI_API_KEY": "${{ secrets.OPENAI_API_KEY }}",
            "OPENAI_BASE_URL": "https://api.deepseek.com",
            "OPENAI_MODEL": "deepseek-chat"
          }
          EOF
        shell: bash

      - name: Start Claude Code Router Custom
        run: |
          nohup ~/.bun/bin/bunx @musistudio/[email protected] start &
        shell: bash

      - name: Run Claude Code
        id: claude
        uses: anthropics/claude-code-action@beta
        env:
          ANTHROPIC_BASE_URL: http://localhost:3456
        with:
          anthropic_api_key: "any-string-is-ok"

Note: When running in GitHub Actions or other automation environments, make sure to set `"NON_INTERACTIVE_MODE": true` in your configuration to prevent the process from hanging due to stdin handling issues.
This setup allows for interesting automations, like running tasks during off-peak hours to reduce API costs.
📝 Further Reading
- Project Motivation and How It Works
- Maybe We Can Do More with the Router
- GLM-4.6 Supports Reasoning and Interleaved Thinking
Credits to the original creator of Claude Code Router.
Original Claude Code Router: https://github.com/musistudio/claude-code-router
