Merge pull request #172 from eyaltoledano/adjust-context-window

chore(ai): Reduces context window back from 128k to 64k

We'll bump it back up when better AI model management is implemented.
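
In configuration terms, the rollback amounts to lowering the MAX_TOKENS setting named in the changelog diff below. A minimal sketch, assuming the value is read from the environment (the env-based mechanism and fallback handling are assumptions, not the project's actual code):

```typescript
// Minimal sketch of the rollback; MAX_TOKENS is named in the changelog diff
// below, but reading it from the environment with this fallback is an
// assumption about the config mechanism, not the project's actual code.
const MAX_TOKENS: number = Number(process.env.MAX_TOKENS ?? 64000); // previously 128000
```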
Eyal Toledano
2025-04-11 14:42:25 -04:00
committed by GitHub
8 changed files with 22 additions and 23 deletions


@@ -37,7 +37,6 @@
- Add additional binary alias: `task-master-mcp-server` pointing to the same MCP server script
- **Significant improvements to model configuration:**
- Increase context window from 64k to 128k tokens (MAX_TOKENS=128000) for handling larger codebases
- Reduce temperature from 0.4 to 0.2 for more consistent, deterministic outputs
- Set default model to "claude-3-7-sonnet-20250219" in configuration
- Update Perplexity model to "sonar-pro" for research operations
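
Taken together, these changelog entries describe a small set of model defaults. A rough sketch of how they might be gathered in one place, assuming an environment-driven setup (the `getModelDefaults` helper, the env variable names, and the object shape are hypothetical; only the literal values come from the entries above, and MAX_TOKENS is sketched earlier):

```typescript
// Illustrative consolidation of the defaults listed in the changelog above.
// The helper name, env variable names, and object shape are assumptions;
// only the literal values come from the changelog entries.
interface ModelDefaults {
  model: string;           // default Claude model
  perplexityModel: string; // model used for research operations
  temperature: number;     // lower value for more consistent, deterministic output
}

function getModelDefaults(
  env: Record<string, string | undefined> = process.env
): ModelDefaults {
  return {
    model: env.MODEL ?? "claude-3-7-sonnet-20250219",
    perplexityModel: env.PERPLEXITY_MODEL ?? "sonar-pro",
    temperature: Number(env.TEMPERATURE ?? 0.2),
  };
}
```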