From ca1b9a5fbaf9469c284c0ab69c72ece4c24821cd Mon Sep 17 00:00:00 2001
From: "jinhui.li"
Date: Mon, 16 Jun 2025 13:04:43 +0800
Subject: [PATCH] update README

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 798a254..cdd7cd0 100644
--- a/README.md
+++ b/README.md
@@ -102,7 +102,7 @@ Note: claude code consumes a huge amount of tokens, but thanks to DeepSeek’s l
 Some interesting points:
 Based on my testing, including a lot of context information can help narrow the performance gap between these LLM models. For instance, when I used Claude-4 in VSCode Copilot to handle a Flutter issue, it messed up the files in three rounds of conversation, and I had to roll everything back. However, when I used claude code with DeepSeek, after three or four rounds of conversation, I finally managed to complete my task—and the cost was less than 1 RMB!
 ## Some articles:
-1. [Project Motivation and Principles](blog/en/project-motivation-and-how-it-works.md) ([中文版看这里](blog/zh/project-motivation-and-how-it-works.md))
+1. [Project Motivation and Principles](blog/en/project-motivation-and-how-it-works.md) ([中文版看这里](blog/zh/项目初衷及原理.md))
 
 ## Buy me a coffee
 If you find this project helpful, you can choose to sponsor the author with a cup of coffee.