From 4d81734cebffaf0025e84ea5f4cb7afc050d2eba Mon Sep 17 00:00:00 2001
From: musi
Date: Tue, 15 Jul 2025 22:37:29 +0800
Subject: [PATCH] update doc

---
 README.md | 80 +++++++++++++++++++++++++++++++++++++++++++++++++++++--
 1 file changed, 78 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 1862831..843fd46 100644
--- a/README.md
+++ b/README.md
@@ -24,7 +24,7 @@ npm install -g @musistudio/claude-code-router
 ccr code
 ```
 
-4. Configure routing[optional]
+4. Configure routing
 Set up your `~/.claude-code-router/config.json` file like this:
 
 ```json
@@ -113,6 +113,79 @@ ccr code
 `/model openrouter,anthropic/claude-3.5-sonnet`
 This will use the anthropic/claude-3.5-sonnet model provided by OpenRouter to handle all subsequent tasks.
 
+5. About transformers
+A `transformer` adapts requests and responses to the format a given vendor expects, so each vendor (and even each model) can be configured with its own transformers.
+
+For example, the following configuration uses the `openrouter` transformer for the OpenRouter vendor. For models other than Claude, it removes the `cache_control` parameter (used mainly for Claude's prompt caching) from the request, and it adapts the `reasoning` field in the response.
+```json
+{
+  "name": "openrouter",
+  "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
+  "api_key": "",
+  "models": [
+    "google/gemini-2.5-pro-preview",
+    "anthropic/claude-sonnet-4",
+    "anthropic/claude-3.5-sonnet",
+    "anthropic/claude-3.7-sonnet:thinking",
+    "deepseek/deepseek-chat-v3-0324"
+  ],
+  "transformer": {
+    "use": [
+      "openrouter"
+    ]
+  }
+}
+```
+You can also configure transformers for individual models of the same vendor. In the following example, the `deepseek` transformer is applied to the DeepSeek vendor: it caps `max_tokens` at `8192` in the request and adapts the `reasoning_content` field in the response. Additionally, the `deepseek-chat` model uses the `tooluse` transformer, which improves tool calling for the `deepseek-v3` model via the `tool_choice` parameter (`deepseek-r1` does not support `tool_choice`).
+```json
+{
+  "name": "deepseek",
+  "api_base_url": "https://api.deepseek.com/chat/completions",
+  "api_key": "",
+  "models": [
+    "deepseek-chat",
+    "deepseek-reasoner"
+  ],
+  "transformer": {
+    "use": [
+      "deepseek"
+    ],
+    "deepseek-chat": {
+      "use": [
+        "tooluse"
+      ]
+    }
+  }
+}
+```
+Currently, the following transformers are available:
+
+- deepseek
+
+- gemini
+
+- maxtoken
+
+- openrouter
+
+- tooluse
+
+- gemini-cli (experimental, unofficial; see https://gist.github.com/musistudio/1c13a65f35916a7ab690649d3df8d1cd)
+
+You can also register custom transformers through the `transformers` field in `config.json`, for example:
+```json
+{
+  "transformers": [
+    {
+      "path": "$HOME/.claude-code-router/plugins/gemini-cli.js",
+      "options": {
+        "project": "xxx"
+      }
+    }
+  ]
+}
+```
+
 ## Features
 
 - [x] Support change models
@@ -249,4 +322,7 @@ Thanks to the following sponsors for supporting the continued development of thi
 @\*敏 (可通过主页邮箱联系我修改 github 用户名)
 @Z\*z (可通过主页邮箱联系我修改 github 用户名)
 @\*然 (可通过主页邮箱联系我修改 github 用户名)
-@\*浩 (可通过主页邮箱联系我修改 github 用户名)
\ No newline at end of file
+[@cluic](https://github.com/cluic)
+@\*苗 (可通过主页邮箱联系我修改 github 用户名)
+[@PromptExpert](https://github.com/PromptExpert)
+@\*应 (可通过主页邮箱联系我修改 github 用户名)
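
Note on the custom-transformer section added above: the `transformers` entry points at a JavaScript plugin file (`$HOME/.claude-code-router/plugins/gemini-cli.js`). The sketch below is only a rough illustration of what such a plugin might look like; the class shape and hook names (`transformRequestIn`, `transformResponseOut`) are assumptions, not the project's documented interface, and the gemini-cli gist linked in the transformer list is the authoritative example.

```js
// Hypothetical skeleton of a custom transformer plugin.
// The hook names below are illustrative assumptions; consult the
// gemini-cli.js gist referenced in the README for a real implementation.
class ExampleTransformer {
  constructor(options) {
    // "options" would receive the "options" object from config.json,
    // e.g. { "project": "xxx" }.
    this.options = options || {};
    // Name referenced from a provider's "transformer": { "use": [...] } list.
    this.name = "example";
  }

  // Adjust an outgoing request before it is sent to the vendor,
  // e.g. strip unsupported parameters or clamp max_tokens.
  async transformRequestIn(request) {
    return request;
  }

  // Adjust the vendor's response before it is handed back to Claude Code,
  // e.g. map a vendor-specific reasoning field to the expected shape.
  async transformResponseOut(response) {
    return response;
  }
}

module.exports = ExampleTransformer;
```

With a file like this saved under `~/.claude-code-router/plugins/`, the `path` and `options` fields shown in the config example above would load it.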