diff --git a/README.md b/README.md
index b657c6a..d9b296f 100644
--- a/README.md
+++ b/README.md
@@ -1,94 +1,145 @@
# Claude Code Router
-> This is a tool for routing Claude Code requests to different models, and you can customize any request.
+[中文版](README_zh.md)
+
+> A powerful tool to route Claude Code requests to different models and customize any request.

-## Usage
+## ✨ Features
-1. Install Claude Code
+- **Model Routing**: Route requests to different models based on your needs (e.g., background tasks, thinking, long context).
+- **Multi-Provider Support**: Supports various model providers like OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, and SiliconFlow.
+- **Request/Response Transformation**: Customize requests and responses for different providers using transformers.
+- **Dynamic Model Switching**: Switch models on-the-fly within Claude Code using the `/model` command.
+- **GitHub Actions Integration**: Trigger Claude Code tasks in your GitHub workflows.
+- **Plugin System**: Extend functionality with custom transformers.
+
+## 🚀 Getting Started
+
+### 1. Installation
+
+First, ensure you have [Claude Code](https://docs.anthropic.com/en/docs/claude-code/quickstart) installed:
```shell
npm install -g @anthropic-ai/claude-code
```
-2. Install Claude Code Router
+Then, install Claude Code Router:
```shell
npm install -g @musistudio/claude-code-router
```
-3. Start Claude Code by claude-code-router
+### 2. Configuration
-```shell
-ccr code
-```
+Create and configure your `~/.claude-code-router/config.json` file. For more details, you can refer to `config.example.json`.
-4. Configure routing
- Set up your `~/.claude-code-router/config.json` file like this:
+The `config.json` file has several key sections:
+- **`PROXY_URL`** (optional): You can set a proxy for API requests, for example: `"PROXY_URL": "http://127.0.0.1:7890"`.
+- **`LOG`** (optional): You can enable logging by setting it to `true`. The log file will be located at `$HOME/.claude-code-router.log`.
+- **`Providers`**: Used to configure different model providers.
+- **`Router`**: Used to set up routing rules. `default` specifies the default model, which will be used for all requests if no other route is configured.
+
+Here is a comprehensive example:
```json
{
+ "PROXY_URL": "http://127.0.0.1:7890",
+ "LOG": true,
"Providers": [
{
"name": "openrouter",
- // IMPORTANT: api_base_url must be a complete (full) URL.
"api_base_url": "https://openrouter.ai/api/v1/chat/completions",
"api_key": "sk-xxx",
"models": [
"google/gemini-2.5-pro-preview",
"anthropic/claude-sonnet-4",
- "anthropic/claude-3.5-sonnet",
- "anthropic/claude-3.7-sonnet:thinking"
+ "anthropic/claude-3.5-sonnet"
],
- "transformer": {
- "use": ["openrouter"]
- }
+ "transformer": { "use": ["openrouter"] }
},
{
"name": "deepseek",
- // IMPORTANT: api_base_url must be a complete (full) URL.
"api_base_url": "https://api.deepseek.com/chat/completions",
"api_key": "sk-xxx",
"models": ["deepseek-chat", "deepseek-reasoner"],
"transformer": {
"use": ["deepseek"],
- "deepseek-chat": {
- // Enhance tool usage for the deepseek-chat model using the ToolUse transformer.
- "use": ["tooluse"]
- }
+ "deepseek-chat": { "use": ["tooluse"] }
}
},
{
"name": "ollama",
- // IMPORTANT: api_base_url must be a complete (full) URL.
"api_base_url": "http://localhost:11434/v1/chat/completions",
"api_key": "ollama",
"models": ["qwen2.5-coder:latest"]
- },
- {
- "name": "gemini",
- // IMPORTANT: api_base_url must be a complete (full) URL.
- "api_base_url": "https://generativelanguage.googleapis.com/v1beta/models/",
- "api_key": "sk-xxx",
- "models": ["gemini-2.5-flash", "gemini-2.5-pro"],
- "transformer": {
- "use": ["gemini"]
- }
- },
- {
- "name": "volcengine",
- // IMPORTANT: api_base_url must be a complete (full) URL.
- "api_base_url": "https://ark.cn-beijing.volces.com/api/v3/chat/completions",
- "api_key": "sk-xxx",
- "models": ["deepseek-v3-250324", "deepseek-r1-250528"],
- "transformer": {
- "use": ["deepseek"]
- }
- },
+ }
+ ],
+ "Router": {
+ "default": "deepseek,deepseek-chat",
+ "background": "ollama,qwen2.5-coder:latest",
+ "think": "deepseek,deepseek-reasoner",
+ "longContext": "openrouter,google/gemini-2.5-pro-preview"
+ }
+}
+```
+
+
+### 3. Running Claude Code with the Router
+
+Start Claude Code using the router:
+
+```shell
+ccr code
+```
+
+#### Providers
+
+The `Providers` array is where you define the different model providers you want to use. Each provider object requires:
+
+- `name`: A unique name for the provider.
+- `api_base_url`: The full API endpoint for chat completions.
+- `api_key`: Your API key for the provider.
+- `models`: A list of model names available from this provider.
+- `transformer` (optional): Specifies transformers to process requests and responses.
+
+#### Transformers
+
+Transformers allow you to modify the request and response payloads to ensure compatibility with different provider APIs.
+
+- **Global Transformer**: Apply a transformer to all models from a provider. In this example, the `openrouter` transformer is applied to all models under the `openrouter` provider.
+ ```json
+ {
+ "name": "openrouter",
+ "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
+ "api_key": "sk-xxx",
+ "models": [
+ "google/gemini-2.5-pro-preview",
+ "anthropic/claude-sonnet-4",
+ "anthropic/claude-3.5-sonnet"
+ ],
+ "transformer": { "use": ["openrouter"] }
+ }
+ ```
+- **Model-Specific Transformer**: Apply a transformer to a specific model. In this example, the `deepseek` transformer is applied to all of the provider's models, and an additional `tooluse` transformer is applied only to the `deepseek-chat` model.
+ ```json
+ {
+ "name": "deepseek",
+ "api_base_url": "https://api.deepseek.com/chat/completions",
+ "api_key": "sk-xxx",
+ "models": ["deepseek-chat", "deepseek-reasoner"],
+ "transformer": {
+ "use": ["deepseek"],
+ "deepseek-chat": { "use": ["tooluse"] }
+ }
+ }
+ ```
+
+- **Passing Options to a Transformer**: Some transformers, like `maxtoken`, accept options. To pass options, use a nested array where the first element is the transformer name and the second is an options object.
+ ```json
{
"name": "siliconflow",
- // IMPORTANT: api_base_url must be a complete (full) URL.
"api_base_url": "https://api.siliconflow.cn/v1/chat/completions",
"api_key": "sk-xxx",
"models": ["moonshotai/Kimi-K2-Instruct"],
@@ -97,99 +148,27 @@ ccr code
[
"maxtoken",
{
- "max_tokens": 16384 // for siliconflow max_tokens
+ "max_tokens": 16384
}
]
]
}
}
- ],
- "Router": {
- "default": "deepseek,deepseek-chat", // IMPORTANT OPENAI_MODEL has been deprecated
- "background": "ollama,qwen2.5-coder:latest",
- "think": "deepseek,deepseek-reasoner",
- "longContext": "openrouter,google/gemini-2.5-pro-preview"
- }
-}
-```
+ ```
-- `background`
- This model will be used to handle some background tasks([background-token-usage](https://docs.anthropic.com/en/docs/claude-code/costs#background-token-usage)). Based on my tests, it doesn’t require high intelligence. I’m using the qwen-coder-2.5:7b model running locally on my MacBook Pro M1 (32GB) via Ollama.
- If your computer can’t run Ollama, you can also use some free models, such as qwen-coder-2.5:3b.
+**Available Built-in Transformers:**
-- `think`
- This model will be used when enabling Claude Code to perform reasoning. However, reasoning budget control has not yet been implemented (since the DeepSeek-R1 model does not support it), so there is currently no difference between using UltraThink and Think modes.
- It is worth noting that Plan Mode also use this model to achieve better planning results.
- Note: The reasoning process via the official DeepSeek API may be very slow, so you may need to wait for an extended period of time.
+- `deepseek`: Adapts requests/responses for the DeepSeek API.
+- `gemini`: Adapts requests/responses for the Gemini API.
+- `maxtoken`: Sets a specific `max_tokens` value.
+- `openrouter`: Adapts requests/responses for the OpenRouter API.
+- `tooluse`: Optimizes tool usage for certain models.
+- `gemini-cli` (experimental): Unofficial support for Gemini via the Gemini CLI; see [gemini-cli.js](https://gist.github.com/musistudio/1c13a65f35916a7ab690649d3df8d1cd).
-- `longContext`
- This model will be used when the context length exceeds 32K (this value may be modified in the future). You can route the request to a model that performs well with long contexts (I’ve chosen google/gemini-2.5-pro-preview). This scenario has not been thoroughly tested yet, so if you encounter any issues, please submit an issue.
+**Custom Transformers:**
-- model command
- You can also switch models within Claude Code by using the `/model` command. The format is: `provider,model`, like this:
- `/model openrouter,anthropic/claude-3.5-sonnet`
- This will use the anthropic/claude-3.5-sonnet model provided by OpenRouter to handle all subsequent tasks.
+You can also create your own transformers and load them via the `transformers` field in `config.json`.
-5. About transformer
-`transformer` is used to convert requests and responses for different vendors. For different vendors, we can configure different transformers.
-
-For example, in the following case, we use the `openrouter` transformer for the OpenRouter vendor. This transformer removes the `cache_control` parameter (mainly used to adapt Claude's prompt cache) from the request for models other than Claude. In the response, it adapts the reasoning field.
-```json
-{
- "name": "openrouter",
- "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
- "api_key": "",
- "models": [
- "google/gemini-2.5-pro-preview",
- "anthropic/claude-sonnet-4",
- "anthropic/claude-3.5-sonnet",
- "anthropic/claude-3.7-sonnet:thinking",
- "deepseek/deepseek-chat-v3-0324"
- ],
- "transformer": {
- "use": [
- "openrouter"
- ]
- }
-}
-```
-You can also configure transformers for different models of the same vendor. For instance, in the following example, we use the `deepseek` transformer for the DeepSeek vendor. This transformer sets the maximum value of `max_tokens` to `8192` in the request, and in the response, it adapts the `reasoning_content` field. Additionally, for the `deepseek-chat` model, we use the `tooluse` transformer, which optimizes the tool call for the `deepseek-v3` model using the `tool_choice` parameter (mainly because deepseek-r1 does not support the tool_choice parameter).
-```json
-{
- "name": "deepseek",
- "api_base_url": "https://api.deepseek.com/chat/completions",
- "api_key": "",
- "models": [
- "deepseek-chat",
- "deepseek-reasoner"
- ],
- "transformer": {
- "use": [
- "deepseek"
- ],
- "deepseek-chat": {
- "use": [
- "tooluse"
- ]
- }
- }
-}
-```
-Currently, the following transformers are available:
-
-- deepseek
-
-- gemini
-
-- maxtoken
-
-- openrouter
-
-- tooluse
-
-- gemini-cli (experimental, unofficial support: https://gist.github.com/musistudio/1c13a65f35916a7ab690649d3df8d1cd)
-
-You can configure custom transformers in the `config.json` file using the `transformers` field, for example:
```json
{
"transformers": [
@@ -203,17 +182,23 @@ You can configure custom transformers in the `config.json` file using the `trans
}
```
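The plugin interface expected by claude-code-router is not spelled out in this README, so the sketch below is purely illustrative: the hook names (`transformRequest`/`transformResponse`), the `cap-tokens` name, and the module shape are assumptions for the sake of example, not the project's documented API.

```javascript
// Hypothetical sketch of a custom transformer module -- the actual hook
// names and signatures expected by claude-code-router may differ.
const capTokens = {
  name: "cap-tokens",
  // Adjust the outgoing request before it reaches the provider.
  transformRequest(request) {
    const limit = 16384; // e.g. a provider-specific max_tokens ceiling
    return { ...request, max_tokens: Math.min(request.max_tokens ?? limit, limit) };
  },
  // Pass the provider's response through unchanged.
  transformResponse(response) {
    return response;
  },
};

module.exports = capTokens;
```

Conceptually, a transformer is just a pair of functions that rewrite the outgoing request and the incoming response; the `path` entry in the `transformers` field points at such a module.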
-## Features
+#### Router
-- [x] Support change models
-- [x] Github Actions
-- [ ] More detailed logs
-- [ ] Support image
-- [ ] Support web search
+The `Router` object defines which model to use for different scenarios:
-## Github Actions
+- `default`: The default model for general tasks.
+- `background`: A model for background tasks. This can be a smaller, local model to save costs.
+- `think`: A model for reasoning-heavy tasks, like Plan Mode.
+- `longContext`: A model for handling long contexts (e.g., > 60K tokens).
-You just need to install `Claude Code Actions` in your repository according to the [official documentation](https://docs.anthropic.com/en/docs/claude-code/github-actions). For `ANTHROPIC_API_KEY`, you can use any string. Then, modify your `.github/workflows/claude.yaml` file to include claude-code-router, like this:
+You can also switch models dynamically in Claude Code with the `/model` command:
+`/model provider_name,model_name`
+Example: `/model openrouter,anthropic/claude-3.5-sonnet`
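As a rough mental model (an illustrative simplification, not claude-code-router's actual implementation; the 60K-token threshold mirrors the `longContext` note above), route selection boils down to checking the scenario in order and falling back to `default`:

```javascript
// Illustrative sketch of how a Router config could be resolved.
// The check order and the 60000-token threshold are assumptions.
const router = {
  default: "deepseek,deepseek-chat",
  background: "ollama,qwen2.5-coder:latest",
  think: "deepseek,deepseek-reasoner",
  longContext: "openrouter,google/gemini-2.5-pro-preview",
};

function pickRoute(router, { tokenCount = 0, isBackground = false, isThinking = false } = {}) {
  if (isBackground && router.background) return router.background;
  if (tokenCount > 60000 && router.longContext) return router.longContext;
  if (isThinking && router.think) return router.think;
  return router.default;
}

// The resulting "provider,model" pair is split to locate the target provider.
const [provider, model] = pickRoute(router, { tokenCount: 80000 }).split(",");
```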
+
+
+## 🤖 GitHub Actions
+
+Integrate Claude Code Router into your CI/CD pipeline. After setting up [Claude Code Actions](https://docs.anthropic.com/en/docs/claude-code/github-actions), modify your `.github/workflows/claude.yaml` to use the router:
```yaml
name: Claude Code
@@ -221,20 +206,13 @@ name: Claude Code
on:
issue_comment:
types: [created]
- pull_request_review_comment:
- types: [created]
- issues:
- types: [opened, assigned]
- pull_request_review:
- types: [submitted]
+ # ... other triggers
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
- (github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
- (github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
- (github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
+ # ... other conditions
runs-on: ubuntu-latest
permissions:
contents: read
@@ -272,75 +250,58 @@ jobs:
env:
ANTHROPIC_BASE_URL: http://localhost:3456
with:
- anthropic_api_key: "test"
+ anthropic_api_key: "any-string-is-ok"
```
-You can modify the contents of `$HOME/.claude-code-router/config.json` as needed.
-GitHub Actions support allows you to trigger Claude Code at specific times, which opens up some interesting possibilities.
+This setup allows for interesting automations, like running tasks during off-peak hours to reduce API costs.
-For example, between 00:30 and 08:30 Beijing Time, using the official DeepSeek API:
+## 📝 Further Reading
-- The cost of the `deepseek-v3` model is only 50% of the normal time.
+- [Project Motivation and How It Works](blog/en/project-motivation-and-how-it-works.md)
+- [Maybe We Can Do More with the Router](blog/en/maybe-we-can-do-more-with-the-route.md)
-- The `deepseek-r1` model is just 25% of the normal time.
+## ❤️ Support & Sponsoring
-So maybe in the future, I’ll describe detailed tasks for Claude Code ahead of time and let it run during these discounted hours to reduce costs?
-
-## Some tips:
-
-Now you can use deepseek-v3 models directly without using any plugins.
-
-If you’re using the DeepSeek API provided by the official website, you might encounter an “exceeding context” error after several rounds of conversation (since the official API only supports a 64K context window). In this case, you’ll need to discard the previous context and start fresh. Alternatively, you can use ByteDance’s DeepSeek API, which offers a 128K context window and supports KV cache.
-
-
-
-Note: claude code consumes a huge amount of tokens, but thanks to DeepSeek’s low cost, you can use claude code at a fraction of Claude’s price, and you don’t need to subscribe to the Claude Max plan.
-
-Some interesting points: Based on my testing, including a lot of context information can help narrow the performance gap between these LLM models. For instance, when I used Claude-4 in VSCode Copilot to handle a Flutter issue, it messed up the files in three rounds of conversation, and I had to roll everything back. However, when I used claude code with DeepSeek, after three or four rounds of conversation, I finally managed to complete my task—and the cost was less than 1 RMB!
-
-## Some articles:
-
-1. [Project Motivation and Principles](blog/en/project-motivation-and-how-it-works.md) ([项目初衷及原理](blog/zh/项目初衷及原理.md))
-2. [Maybe We Can Do More with the Router](blog/en/maybe-we-can-do-more-with-the-route.md) ([或许我们能在 Router 中做更多事情](blog/zh/或许我们能在Router中做更多事情.md))
-
-## Buy me a coffee
-
-If you find this project helpful, you can choose to sponsor the author with a cup of coffee. Please provide your GitHub information so I can add you to the sponsor list below.
+If you find this project helpful, please consider sponsoring its development. Your support is greatly appreciated!
[](https://ko-fi.com/F1F31GN2GM)
-## Sponsors
+### Our Sponsors
-Thanks to the following sponsors for supporting the continued development of this project:
+A huge thank you to all our sponsors for their generous support!
-@Simon Leischnig (If you see this, feel free to contact me and I can update it with your GitHub information)
-[@duanshuaimin](https://github.com/duanshuaimin)
-[@vrgitadmin](https://github.com/vrgitadmin)
-@\*o (可通过主页邮箱联系我修改 github 用户名)
-[@ceilwoo](https://github.com/ceilwoo)
-@\*说 (可通过主页邮箱联系我修改 github 用户名)
-@\*更 (可通过主页邮箱联系我修改 github 用户名)
-@K\*g (可通过主页邮箱联系我修改 github 用户名)
-@R\*R (可通过主页邮箱联系我修改 github 用户名)
-[@bobleer](https://github.com/bobleer)
-@\*苗 (可通过主页邮箱联系我修改 github 用户名)
-@\*划 (可通过主页邮箱联系我修改 github 用户名)
-[@Clarence-pan](https://github.com/Clarence-pan)
-[@carter003](https://github.com/carter003)
-@S\*r (可通过主页邮箱联系我修改 github 用户名)
-@\*晖 (可通过主页邮箱联系我修改 github 用户名)
-@\*敏 (可通过主页邮箱联系我修改 github 用户名)
-@Z\*z (可通过主页邮箱联系我修改 github 用户名)
-@\*然 (可通过主页邮箱联系我修改 github 用户名)
-[@cluic](https://github.com/cluic)
-@\*苗 (可通过主页邮箱联系我修改 github 用户名)
-[@PromptExpert](https://github.com/PromptExpert)
-@\*应 (可通过主页邮箱联系我修改 github 用户名)
-[@yusnake](https://github.com/yusnake)
\ No newline at end of file
+- @Simon Leischnig
+- [@duanshuaimin](https://github.com/duanshuaimin)
+- [@vrgitadmin](https://github.com/vrgitadmin)
+- @*o
+- [@ceilwoo](https://github.com/ceilwoo)
+- @*说
+- @*更
+- @K*g
+- @R*R
+- [@bobleer](https://github.com/bobleer)
+- @*苗
+- @*划
+- [@Clarence-pan](https://github.com/Clarence-pan)
+- [@carter003](https://github.com/carter003)
+- @S*r
+- @*晖
+- @*敏
+- @Z*z
+- @*然
+- [@cluic](https://github.com/cluic)
+- @*苗
+- [@PromptExpert](https://github.com/PromptExpert)
+- @*应
+- [@yusnake](https://github.com/yusnake)
+- @*飞
+- @董*
+
+(If your name is masked, please contact me via my homepage email to update it with your GitHub username.)
diff --git a/README_zh.md b/README_zh.md
new file mode 100644
index 0000000..e066ee1
--- /dev/null
+++ b/README_zh.md
@@ -0,0 +1,305 @@
+# Claude Code Router
+
+> 一款强大的工具,可将 Claude Code 请求路由到不同的模型,并自定义任何请求。
+
+
+
+## ✨ 功能
+
+- **模型路由**: 根据您的需求将请求路由到不同的模型(例如,后台任务、思考、长上下文)。
+- **多提供商支持**: 支持 OpenRouter、DeepSeek、Ollama、Gemini、Volcengine 和 SiliconFlow 等各种模型提供商。
+- **请求/响应转换**: 使用转换器为不同的提供商自定义请求和响应。
+- **动态模型切换**: 在 Claude Code 中使用 `/model` 命令动态切换模型。
+- **GitHub Actions 集成**: 在您的 GitHub 工作流程中触发 Claude Code 任务。
+- **插件系统**: 使用自定义转换器扩展功能。
+
+## 🚀 快速入门
+
+### 1. 安装
+
+首先,请确保您已安装 [Claude Code](https://docs.anthropic.com/en/docs/claude-code/quickstart):
+
+```shell
+npm install -g @anthropic-ai/claude-code
+```
+
+然后,安装 Claude Code Router:
+
+```shell
+npm install -g @musistudio/claude-code-router
+```
+
+### 2. 配置
+
+创建并配置您的 `~/.claude-code-router/config.json` 文件。有关更多详细信息,您可以参考 `config.example.json`。
+
+`config.json` 文件有几个关键部分:
+- **`PROXY_URL`** (可选): 您可以为 API 请求设置代理,例如:`"PROXY_URL": "http://127.0.0.1:7890"`。
+- **`LOG`** (可选): 您可以通过将其设置为 `true` 来启用日志记录。日志文件将位于 `$HOME/.claude-code-router.log`。
+- **`Providers`**: 用于配置不同的模型提供商。
+- **`Router`**: 用于设置路由规则。`default` 指定默认模型,如果未配置其他路由,则该模型将用于所有请求。
+
+这是一个综合示例:
+
+```json
+{
+ "PROXY_URL": "http://127.0.0.1:7890",
+ "LOG": true,
+ "Providers": [
+ {
+ "name": "openrouter",
+ "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
+ "api_key": "sk-xxx",
+ "models": [
+ "google/gemini-2.5-pro-preview",
+ "anthropic/claude-sonnet-4",
+ "anthropic/claude-3.5-sonnet"
+ ],
+ "transformer": { "use": ["openrouter"] }
+ },
+ {
+ "name": "deepseek",
+ "api_base_url": "https://api.deepseek.com/chat/completions",
+ "api_key": "sk-xxx",
+ "models": ["deepseek-chat", "deepseek-reasoner"],
+ "transformer": {
+ "use": ["deepseek"],
+ "deepseek-chat": { "use": ["tooluse"] }
+ }
+ },
+ {
+ "name": "ollama",
+ "api_base_url": "http://localhost:11434/v1/chat/completions",
+ "api_key": "ollama",
+ "models": ["qwen2.5-coder:latest"]
+ }
+ ],
+ "Router": {
+ "default": "deepseek,deepseek-chat",
+ "background": "ollama,qwen2.5-coder:latest",
+ "think": "deepseek,deepseek-reasoner",
+ "longContext": "openrouter,google/gemini-2.5-pro-preview"
+ }
+}
+```
+
+
+### 3. 使用 Router 运行 Claude Code
+
+使用 router 启动 Claude Code:
+
+```shell
+ccr code
+```
+
+#### Providers
+
+`Providers` 数组是您定义要使用的不同模型提供商的地方。每个提供商对象都需要:
+
+- `name`: 提供商的唯一名称。
+- `api_base_url`: 聊天补全的完整 API 端点。
+- `api_key`: 您提供商的 API 密钥。
+- `models`: 此提供商可用的模型名称列表。
+- `transformer` (可选): 指定用于处理请求和响应的转换器。
+
+#### Transformers
+
+Transformers 允许您修改请求和响应负载,以确保与不同提供商 API 的兼容性。
+
+- **全局 Transformer**: 将转换器应用于提供商的所有模型。在此示例中,`openrouter` 转换器将应用于 `openrouter` 提供商下的所有模型。
+ ```json
+ {
+ "name": "openrouter",
+ "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
+ "api_key": "sk-xxx",
+ "models": [
+ "google/gemini-2.5-pro-preview",
+ "anthropic/claude-sonnet-4",
+ "anthropic/claude-3.5-sonnet"
+ ],
+ "transformer": { "use": ["openrouter"] }
+ }
+ ```
+- **特定于模型的 Transformer**: 将转换器应用于特定模型。在此示例中,`deepseek` 转换器应用于所有模型,而额外的 `tooluse` 转换器仅应用于 `deepseek-chat` 模型。
+ ```json
+ {
+ "name": "deepseek",
+ "api_base_url": "https://api.deepseek.com/chat/completions",
+ "api_key": "sk-xxx",
+ "models": ["deepseek-chat", "deepseek-reasoner"],
+ "transformer": {
+ "use": ["deepseek"],
+ "deepseek-chat": { "use": ["tooluse"] }
+ }
+ }
+ ```
+
+- **向 Transformer 传递选项**: 某些转换器(如 `maxtoken`)接受选项。要传递选项,请使用嵌套数组,其中第一个元素是转换器名称,第二个元素是选项对象。
+ ```json
+ {
+ "name": "siliconflow",
+ "api_base_url": "https://api.siliconflow.cn/v1/chat/completions",
+ "api_key": "sk-xxx",
+ "models": ["moonshotai/Kimi-K2-Instruct"],
+ "transformer": {
+ "use": [
+ [
+ "maxtoken",
+ {
+ "max_tokens": 16384
+ }
+ ]
+ ]
+ }
+ }
+ ```
+
+**可用的内置 Transformer:**
+
+- `deepseek`: 适配 DeepSeek API 的请求/响应。
+- `gemini`: 适配 Gemini API 的请求/响应。
+- `maxtoken`: 设置特定的 `max_tokens` 值。
+- `openrouter`: 适配 OpenRouter API 的请求/响应。
+- `tooluse`: 优化某些模型的工具使用。
+- `gemini-cli` (实验性): 通过 Gemini CLI [gemini-cli.js](https://gist.github.com/musistudio/1c13a65f35916a7ab690649d3df8d1cd) 对 Gemini 的非官方支持。
+
+**自定义 Transformer:**
+
+您还可以创建自己的转换器,并通过 `config.json` 中的 `transformers` 字段加载它们。
+
+```json
+{
+ "transformers": [
+ {
+ "path": "$HOME/.claude-code-router/plugins/gemini-cli.js",
+ "options": {
+ "project": "xxx"
+ }
+ }
+ ]
+}
+```
+
+#### Router
+
+`Router` 对象定义了在不同场景下使用哪个模型:
+
+- `default`: 用于常规任务的默认模型。
+- `background`: 用于后台任务的模型。这可以是一个较小的本地模型以节省成本。
+- `think`: 用于推理密集型任务(如计划模式)的模型。
+- `longContext`: 用于处理长上下文(例如,> 60K 令牌)的模型。
+
+您还可以使用 `/model` 命令在 Claude Code 中动态切换模型:
+`/model provider_name,model_name`
+示例: `/model openrouter,anthropic/claude-3.5-sonnet`
+
+
+## 🤖 GitHub Actions
+
+将 Claude Code Router 集成到您的 CI/CD 管道中。在设置 [Claude Code Actions](https://docs.anthropic.com/en/docs/claude-code/github-actions) 后,修改您的 `.github/workflows/claude.yaml` 以使用路由器:
+
+```yaml
+name: Claude Code
+
+on:
+ issue_comment:
+ types: [created]
+ # ... other triggers
+
+jobs:
+ claude:
+ if: |
+ (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
+ # ... other conditions
+ runs-on: ubuntu-latest
+ permissions:
+ contents: read
+ pull-requests: read
+ issues: read
+ id-token: write
+ steps:
+ - name: Checkout repository
+ uses: actions/checkout@v4
+ with:
+ fetch-depth: 1
+
+ - name: Prepare Environment
+ run: |
+ curl -fsSL https://bun.sh/install | bash
+ mkdir -p $HOME/.claude-code-router
+ cat << 'EOF' > $HOME/.claude-code-router/config.json
+ {
+ "log": true,
+ "OPENAI_API_KEY": "${{ secrets.OPENAI_API_KEY }}",
+ "OPENAI_BASE_URL": "https://api.deepseek.com",
+ "OPENAI_MODEL": "deepseek-chat"
+ }
+ EOF
+ shell: bash
+
+ - name: Start Claude Code Router
+ run: |
+ nohup ~/.bun/bin/bunx @musistudio/claude-code-router@1.0.8 start &
+ shell: bash
+
+ - name: Run Claude Code
+ id: claude
+ uses: anthropics/claude-code-action@beta
+ env:
+ ANTHROPIC_BASE_URL: http://localhost:3456
+ with:
+ anthropic_api_key: "any-string-is-ok"
+```
+
+这种设置可以实现有趣的自动化,例如在非高峰时段运行任务以降低 API 成本。
+
+## 📝 深入阅读
+
+- [项目动机和工作原理](blog/zh/项目初衷及原理.md)
+- [也许我们可以用路由器做更多事情](blog/zh/或许我们能在Router中做更多事情.md)
+
+## ❤️ 支持与赞助
+
+如果您觉得这个项目有帮助,请考虑赞助它的开发。非常感谢您的支持!
+
+[](https://ko-fi.com/F1F31GN2GM)
+
+
+
+  |
+  |
+
+
+
+### 我们的赞助商
+
+非常感谢所有赞助商的慷慨支持!
+
+- @Simon Leischnig
+- [@duanshuaimin](https://github.com/duanshuaimin)
+- [@vrgitadmin](https://github.com/vrgitadmin)
+- @*o
+- [@ceilwoo](https://github.com/ceilwoo)
+- @*说
+- @*更
+- @K*g
+- @R*R
+- [@bobleer](https://github.com/bobleer)
+- @*苗
+- @*划
+- [@Clarence-pan](https://github.com/Clarence-pan)
+- [@carter003](https://github.com/carter003)
+- @S*r
+- @*晖
+- @*敏
+- @Z*z
+- @*然
+- [@cluic](https://github.com/cluic)
+- @*苗
+- [@PromptExpert](https://github.com/PromptExpert)
+- @*应
+- [@yusnake](https://github.com/yusnake)
+- @*飞
+- @董*
+
+(如果您的名字被屏蔽,请通过我的主页电子邮件与我联系,以便使用您的 GitHub 用户名进行更新。)
diff --git a/config.json b/config.json
deleted file mode 100644
index e01f33f..0000000
--- a/config.json
+++ /dev/null
@@ -1,8 +0,0 @@
-{
- "usePlugin": "",
- "LOG": true,
- "OPENAI_API_KEY": "",
- "OPENAI_BASE_URL": "",
- "OPENAI_MODEL": "",
- "modelProviders": {}
-}