A powerful tool to route Claude Code requests to different models and customize any request.
- Model Routing: Route requests to different models based on your needs (e.g., background tasks, thinking, long context).
- Multi-Provider Support: Supports various model providers like OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, and SiliconFlow.
- Request/Response Transformation: Customize requests and responses for different providers using transformers.
- Dynamic Model Switching: Switch models on-the-fly within Claude Code using the /model command.
- GitHub Actions Integration: Trigger Claude Code tasks in your GitHub workflows.
- Plugin System: Extend functionality with custom transformers.
First, ensure you have Claude Code installed:
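If you haven't already, Claude Code can be installed globally via npm (assuming the package name published by Anthropic, `@anthropic-ai/claude-code`):

```shell
npm install -g @anthropic-ai/claude-code
```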
Then, install Claude Code Router:
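Assuming the router is published on npm as `@musistudio/claude-code-router`, install it the same way:

```shell
npm install -g @musistudio/claude-code-router
```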
Create and configure your ~/.claude-code-router/config.json file. For more details, you can refer to config.example.json.
The config.json file has several key sections:
- PROXY_URL (optional): You can set a proxy for API requests, for example: "PROXY_URL": "http://127.0.0.1:7890".
- LOG (optional): You can enable logging by setting it to true. The log file will be located at $HOME/.claude-code-router.log.
- APIKEY (optional): You can set a secret key to authenticate requests. When set, clients must provide this key in the Authorization header (e.g., Bearer your-secret-key) or the x-api-key header. Example: "APIKEY": "your-secret-key".
- HOST (optional): You can set the host address for the server. If APIKEY is not set, the host will be forced to 127.0.0.1 for security reasons to prevent unauthorized access. Example: "HOST": "0.0.0.0".
- Providers: Used to configure different model providers.
- Router: Used to set up routing rules. default specifies the default model, which will be used for all requests if no other route is configured.
Here is a comprehensive example:
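The sketch below illustrates the overall shape of config.json using the options described above; the API keys are placeholders, and the provider entries and model names are only examples:

```json
{
  "APIKEY": "your-secret-key",
  "LOG": true,
  "Providers": [
    {
      "name": "deepseek",
      "api_base_url": "https://api.deepseek.com/chat/completions",
      "api_key": "sk-xxx",
      "models": ["deepseek-chat", "deepseek-reasoner"],
      "transformer": { "use": ["deepseek"] }
    },
    {
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
      "api_key": "sk-xxx",
      "models": ["anthropic/claude-3.5-sonnet"],
      "transformer": { "use": ["openrouter"] }
    }
  ],
  "Router": {
    "default": "deepseek,deepseek-chat",
    "background": "deepseek,deepseek-chat",
    "think": "deepseek,deepseek-reasoner",
    "longContext": "openrouter,anthropic/claude-3.5-sonnet"
  }
}
```

See config.example.json for the authoritative format.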
Start Claude Code using the router:
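Assuming the router exposes a `ccr` command after installation, launch Claude Code through it with:

```shell
ccr code
```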
The Providers array is where you define the different model providers you want to use. Each provider object requires:
- name: A unique name for the provider.
- api_base_url: The full API endpoint for chat completions.
- api_key: Your API key for the provider.
- models: A list of model names available from this provider.
- transformer (optional): Specifies transformers to process requests and responses.
Transformers allow you to modify the request and response payloads to ensure compatibility with different provider APIs.
- Global Transformer: Apply a transformer to all models from a provider. In this example, the openrouter transformer is applied to all models under the openrouter provider.

  ```json
  {
    "name": "openrouter",
    "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
    "api_key": "sk-xxx",
    "models": [
      "google/gemini-2.5-pro-preview",
      "anthropic/claude-sonnet-4",
      "anthropic/claude-3.5-sonnet"
    ],
    "transformer": { "use": ["openrouter"] }
  }
  ```

- Model-Specific Transformer: Apply a transformer to a specific model. In this example, the deepseek transformer is applied to all models, and an additional tooluse transformer is applied only to the deepseek-chat model.

  ```json
  {
    "name": "deepseek",
    "api_base_url": "https://api.deepseek.com/chat/completions",
    "api_key": "sk-xxx",
    "models": ["deepseek-chat", "deepseek-reasoner"],
    "transformer": {
      "use": ["deepseek"],
      "deepseek-chat": { "use": ["tooluse"] }
    }
  }
  ```

- Passing Options to a Transformer: Some transformers, like maxtoken, accept options. To pass options, use a nested array where the first element is the transformer name and the second is an options object.

  ```json
  {
    "name": "siliconflow",
    "api_base_url": "https://api.siliconflow.cn/v1/chat/completions",
    "api_key": "sk-xxx",
    "models": ["moonshotai/Kimi-K2-Instruct"],
    "transformer": {
      "use": [["maxtoken", { "max_tokens": 16384 }]]
    }
  }
  ```
Available Built-in Transformers:
- deepseek: Adapts requests/responses for DeepSeek API.
- gemini: Adapts requests/responses for Gemini API.
- openrouter: Adapts requests/responses for OpenRouter API.
- groq: Adapts requests/responses for the Groq API.
- maxtoken: Sets a specific max_tokens value.
- tooluse: Optimizes tool usage for certain models via tool_choice.
- gemini-cli (experimental): Unofficial support for Gemini via the Gemini CLI (gemini-cli.js).
Custom Transformers:
You can also create your own transformers and load them via the transformers field in config.json.
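For example, a custom transformer might be registered like this; the exact field names here are an assumption, so check config.example.json for the authoritative format:

```json
{
  "transformers": [
    {
      "path": "$HOME/.claude-code-router/plugins/my-transformer.js",
      "options": {}
    }
  ]
}
```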
The Router object defines which model to use for different scenarios:
- default: The default model for general tasks.
- background: A model for background tasks. This can be a smaller, local model to save costs.
- think: A model for reasoning-heavy tasks, like Plan Mode.
- longContext: A model for handling long contexts (e.g., > 60K tokens).
- webSearch: A model for web search tasks. The model itself must support web search; if you're using OpenRouter, add the :online suffix to the model name.
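Putting these scenarios together, a Router block might look like this (the provider and model names are placeholders):

```json
"Router": {
  "default": "deepseek,deepseek-chat",
  "background": "ollama,qwen2.5-coder:latest",
  "think": "deepseek,deepseek-reasoner",
  "longContext": "openrouter,google/gemini-2.5-pro-preview",
  "webSearch": "openrouter,anthropic/claude-3.5-sonnet:online"
}
```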
You can also switch models dynamically in Claude Code with the /model command:

/model provider_name,model_name

Example: /model openrouter,anthropic/claude-3.5-sonnet
For more advanced routing logic, you can specify a custom router script via the CUSTOM_ROUTER_PATH in your config.json. This allows you to implement complex routing rules beyond the default scenarios.
In your config.json:
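Point CUSTOM_ROUTER_PATH at your script; the path below is only an example:

```json
{
  "CUSTOM_ROUTER_PATH": "$HOME/.claude-code-router/custom-router.js"
}
```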
The custom router file must be a JavaScript module that exports an async function. This function receives the request object and the config object as arguments and should return the provider and model name as a string (e.g., "provider_name,model_name"), or null to fall back to the default router.
Here is an example of a custom-router.js based on custom-router.example.js:
Integrate Claude Code Router into your CI/CD pipeline. After setting up Claude Code Actions, modify your .github/workflows/claude.yaml to use the router:
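A sketch of the relevant workflow steps, assuming the router is installed from npm as @musistudio/claude-code-router, started with ccr start, and reached by Claude Code through ANTHROPIC_BASE_URL on its default port (3456 is an assumption here); adapt this to your existing workflow:

```yaml
- name: Install Claude Code Router
  run: npm install -g @musistudio/claude-code-router

- name: Start the router
  run: nohup ccr start &

- name: Run Claude Code task
  env:
    ANTHROPIC_BASE_URL: http://localhost:3456
  run: ccr code "your task description"
```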
This setup allows for interesting automations, like running tasks during off-peak hours to reduce API costs.
If you find this project helpful, please consider sponsoring its development. Your support is greatly appreciated!
A huge thank you to all our sponsors for their generous support!
- @Simon Leischnig
- @duanshuaimin
- @vrgitadmin
- @*o
- @ceilwoo
- @*说
- @*更
- @K*g
- @R*R
- @bobleer
- @*苗
- @*划
- @Clarence-pan
- @carter003
- @S*r
- @*晖
- @*敏
- @Z*z
- @*然
- @cluic
- @*苗
- @PromptExpert
- @*应
- @yusnake
- @*飞
- @董*
- @*汀
- @*涯
- @*:-)
- @**磊
- @*琢
- @*成
- @Z*o
- @*琨
(If your name is masked, please contact me via my homepage email to update it with your GitHub username.)



