update qwen3-coder example

musistudio
2025-07-25 17:19:52 +08:00
parent 179bab605e
commit 6883fff352
3 changed files with 124 additions and 119 deletions

README.md

@@ -8,12 +8,12 @@
## ✨ Features
- **Model Routing**: Route requests to different models based on your needs (e.g., background tasks, thinking, long context).
- **Multi-Provider Support**: Supports various model providers like OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, and SiliconFlow.
- **Request/Response Transformation**: Customize requests and responses for different providers using transformers.
- **Dynamic Model Switching**: Switch models on-the-fly within Claude Code using the `/model` command.
- **GitHub Actions Integration**: Trigger Claude Code tasks in your GitHub workflows.
- **Plugin System**: Extend functionality with custom transformers.
## 🚀 Getting Started
@@ -36,6 +36,7 @@ npm install -g @musistudio/claude-code-router
Create and configure your `~/.claude-code-router/config.json` file. For more details, you can refer to `config.example.json`.
The `config.json` file has several key sections:
- **`PROXY_URL`** (optional): You can set a proxy for API requests, for example: `"PROXY_URL": "http://127.0.0.1:7890"`.
- **`LOG`** (optional): You can enable logging by setting it to `true`. The log file will be located at `$HOME/.claude-code-router.log`.
- **`APIKEY`** (optional): You can set a secret key to authenticate requests. When set, clients must provide this key in the `Authorization` header (e.g., `Bearer your-secret-key`) or the `x-api-key` header. Example: `"APIKEY": "your-secret-key"`.
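Taken together, a minimal `config.json` using just these top-level options might look like the sketch below (values are illustrative; the `Providers` and `Router` sections are covered in detail later):

```json
{
  "PROXY_URL": "http://127.0.0.1:7890",
  "LOG": true,
  "APIKEY": "your-secret-key",
  "Providers": [],
  "Router": {}
}
```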
@@ -112,9 +113,10 @@ Here is a comprehensive example:
        [
          "maxtoken",
          {
-           "max_tokens": 8192
+           "max_tokens": 65536
          }
-       ]
+       ],
+       "enhancetool"
      ]
    }
  },
@@ -128,9 +130,10 @@ Here is a comprehensive example:
        [
          "maxtoken",
          {
-           "max_tokens": 8192
+           "max_tokens": 65536
          }
-       ]
+       ],
+       "enhancetool"
      ]
    }
  }
@@ -145,7 +148,6 @@ Here is a comprehensive example:
}
```
### 3. Running Claude Code with the Router
Start Claude Code using the router:
@@ -155,6 +157,7 @@ ccr code
```
> **Note**: After modifying the configuration file, you need to restart the service for the changes to take effect:
>
> ```shell
> ccr restart
> ```
@@ -163,73 +166,74 @@ ccr code
The `Providers` array is where you define the different model providers you want to use. Each provider object requires:
- `name`: A unique name for the provider.
- `api_base_url`: The full API endpoint for chat completions.
- `api_key`: Your API key for the provider.
- `models`: A list of model names available from this provider.
- `transformer` (optional): Specifies transformers to process requests and responses.
#### Transformers
Transformers allow you to modify the request and response payloads to ensure compatibility with different provider APIs.
- **Global Transformer**: Apply a transformer to all models from a provider. In this example, the `openrouter` transformer is applied to all models under the `openrouter` provider.
```json
{
  "name": "openrouter",
  "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
  "api_key": "sk-xxx",
  "models": [
    "google/gemini-2.5-pro-preview",
    "anthropic/claude-sonnet-4",
    "anthropic/claude-3.5-sonnet"
  ],
  "transformer": { "use": ["openrouter"] }
}
```
- **Model-Specific Transformer**: Apply a transformer to a specific model. In this example, the `deepseek` transformer is applied to all models, and an additional `tooluse` transformer is applied only to the `deepseek-chat` model.
```json
{
  "name": "deepseek",
  "api_base_url": "https://api.deepseek.com/chat/completions",
  "api_key": "sk-xxx",
  "models": ["deepseek-chat", "deepseek-reasoner"],
  "transformer": {
    "use": ["deepseek"],
    "deepseek-chat": { "use": ["tooluse"] }
  }
}
```
- **Passing Options to a Transformer**: Some transformers, like `maxtoken`, accept options. To pass options, use a nested array where the first element is the transformer name and the second is an options object.
```json
{
  "name": "siliconflow",
  "api_base_url": "https://api.siliconflow.cn/v1/chat/completions",
  "api_key": "sk-xxx",
  "models": ["moonshotai/Kimi-K2-Instruct"],
  "transformer": {
    "use": [
      [
        "maxtoken",
        {
          "max_tokens": 16384
        }
      ]
    ]
  }
}
```
**Available Built-in Transformers:**
- `deepseek`: Adapts requests/responses for DeepSeek API.
- `gemini`: Adapts requests/responses for Gemini API.
- `openrouter`: Adapts requests/responses for OpenRouter API.
- `groq`: Adapts requests/responses for Groq API.
- `maxtoken`: Sets a specific `max_tokens` value.
- `tooluse`: Optimizes tool usage for certain models via `tool_choice`.
- `gemini-cli` (experimental): Unofficial support for Gemini via Gemini CLI [gemini-cli.js](https://gist.github.com/musistudio/1c13a65f35916a7ab690649d3df8d1cd).
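A `use` array can also mix plain transformer names with `[name, options]` pairs, as in the comprehensive example earlier where `maxtoken` is combined with `enhancetool`. A minimal sketch of such a combination (the specific pairing is illustrative, not a recommendation):

```json
{
  "transformer": {
    "use": [
      "deepseek",
      ["maxtoken", { "max_tokens": 16384 }],
      "tooluse"
    ]
  }
}
```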
**Custom Transformers:**
@@ -238,12 +242,12 @@ You can also create your own transformers and load them via the `transformers` f
```json
{
  "transformers": [
    {
      "path": "$HOME/.claude-code-router/plugins/gemini-cli.js",
      "options": {
        "project": "xxx"
      }
    }
  ]
}
```
@@ -252,12 +256,11 @@ You can also create your own transformers and load them via the `transformers` f
The `Router` object defines which model to use for different scenarios:
- `default`: The default model for general tasks.
- `background`: A model for background tasks. This can be a smaller, local model to save costs.
- `think`: A model for reasoning-heavy tasks, like Plan Mode.
- `longContext`: A model for handling long contexts (e.g., > 60K tokens).
- `webSearch`: A model for web search tasks; the model itself must support this feature. If you're using OpenRouter, add the `:online` suffix to the model name.
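As a sketch, a `Router` section that wires these scenarios to providers from the examples in this README might look like the following (the pairings are illustrative, not recommendations):

```json
{
  "Router": {
    "default": "deepseek,deepseek-chat",
    "background": "siliconflow,moonshotai/Kimi-K2-Instruct",
    "think": "deepseek,deepseek-reasoner",
    "longContext": "openrouter,google/gemini-2.5-pro-preview",
    "webSearch": "openrouter,anthropic/claude-3.5-sonnet:online"
  }
}
```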
You can also switch models dynamically in Claude Code with the `/model` command:
`/model provider_name,model_name`
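For example, `/model openrouter,anthropic/claude-3.5-sonnet` switches the current session to Claude 3.5 Sonnet served through the `openrouter` provider, assuming that provider is configured as shown above.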
@@ -290,11 +293,11 @@ Here is an example of a `custom-router.js` based on `custom-router.example.js`:
* @returns {Promise<string|null>} - A promise that resolves to the "provider,model_name" string, or null to use the default router.
*/
module.exports = async function router(req, config) {
  const userMessage = req.body.messages.find((m) => m.role === "user")?.content;

  if (userMessage && userMessage.includes("explain this code")) {
    // Use a powerful model for code explanation
    return "openrouter,anthropic/claude-3.5-sonnet";
  }

  // Fallback to the default router configuration
@@ -302,7 +305,6 @@ module.exports = async function router(req, config) {
};
```
## 🤖 GitHub Actions
Integrate Claude Code Router into your CI/CD pipeline. After setting up [Claude Code Actions](https://docs.anthropic.com/en/docs/claude-code/github-actions), modify your `.github/workflows/claude.yaml` to use the router:
@@ -364,8 +366,8 @@ This setup allows for interesting automations, like running tasks during off-pea
## 📝 Further Reading
- [Project Motivation and How It Works](blog/en/project-motivation-and-how-it-works.md)
- [Maybe We Can Do More with the Router](blog/en/maybe-we-can-do-more-with-the-route.md)
## ❤️ Support & Sponsoring
@@ -387,39 +389,38 @@ A huge thank you to all our sponsors for their generous support!
- @Simon Leischnig
- [@duanshuaimin](https://github.com/duanshuaimin)
- [@vrgitadmin](https://github.com/vrgitadmin)
- @\*o
- [@ceilwoo](https://github.com/ceilwoo)
- @\*说
- @\*更
- @K\*g
- @R\*R
- [@bobleer](https://github.com/bobleer)
- @\*苗
- @\*划
- [@Clarence-pan](https://github.com/Clarence-pan)
- [@carter003](https://github.com/carter003)
- @S\*r
- @\*晖
- @\*敏
- @Z\*z
- @\*然
- [@cluic](https://github.com/cluic)
- @\*苗
- [@PromptExpert](https://github.com/PromptExpert)
- @\*应
- [@yusnake](https://github.com/yusnake)
- @\*飞
- @董\*
- @\*汀
- @\*涯
- @\*:-
- @\*\*磊
- @\*琢
- @\*成
- @Z\*o
- @\*琨
- [@congzhangzh](https://github.com/congzhangzh)
- @\*\_
(If your name is masked, please contact me via my homepage email to update it with your GitHub username.)


@@ -109,9 +109,10 @@ npm install -g @musistudio/claude-code-router
        [
          "maxtoken",
          {
-           "max_tokens": 8192
+           "max_tokens": 65536
          }
-       ]
+       ],
+       "enhancetool"
      ]
    }
  },
@@ -125,9 +126,10 @@ npm install -g @musistudio/claude-code-router
        [
          "maxtoken",
          {
-           "max_tokens": 8192
+           "max_tokens": 65536
          }
-       ]
+       ],
+       "enhancetool"
      ]
    }
  }


@@ -76,9 +76,10 @@
        [
          "maxtoken",
          {
-           "max_tokens": 8192
+           "max_tokens": 65536
          }
-       ]
+       ],
+       "enhancetool"
      ]
    }
  },
@@ -92,9 +93,10 @@
        [
          "maxtoken",
          {
-           "max_tokens": 8192
+           "max_tokens": 65536
          }
-       ]
+       ],
+       "enhancetool"
      ]
    }
  }