chore(ai): Reduces context window back from 128k to 64k until we decouple context windows between main and research models.

Eyal Toledano
2025-04-11 13:33:02 -04:00
parent 678262df22
commit db5ebe93c3
4 changed files with 3 additions and 4 deletions

@@ -37,7 +37,6 @@
 - Add additional binary alias: `task-master-mcp-server` pointing to the same MCP server script
 - **Significant improvements to model configuration:**
-  - Increase context window from 64k to 128k tokens (MAX_TOKENS=128000) for handling larger codebases
   - Reduce temperature from 0.4 to 0.2 for more consistent, deterministic outputs
   - Set default model to "claude-3-7-sonnet-20250219" in configuration
   - Update Perplexity model to "sonar-pro" for research operations

@@ -5,7 +5,7 @@ PERPLEXITY_API_KEY=your_perplexity_api_key_here # Format: pplx-...
 # Model Configuration
 MODEL=claude-3-7-sonnet-20250219 # Recommended models: claude-3-7-sonnet-20250219, claude-3-opus-20240229
 PERPLEXITY_MODEL=sonar-pro # Perplexity model for research-backed subtasks
-MAX_TOKENS=128000 # Maximum tokens for model responses
+MAX_TOKENS=64000 # Maximum tokens for model responses
 TEMPERATURE=0.2 # Temperature for model responses (0.0-1.0)
 # Logging Configuration
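In this layout, MODEL, PERPLEXITY_MODEL, MAX_TOKENS, and TEMPERATURE are read from one place, so a single MAX_TOKENS budget governs both the main (Claude) and research (Perplexity) calls. Below is a minimal TypeScript sketch of that coupling, assuming a Node-style `process.env`; the `loadModelConfig` helper and its defaults are illustrative, not the project's actual loader.

```ts
// Minimal sketch (not the project's actual code): reading the env values
// shown above. Note that one MAX_TOKENS value ends up applying to BOTH the
// main and research models, which is the coupling the commit title refers to.
interface ModelConfig {
  model: string;            // main (Claude) model
  perplexityModel: string;  // research (Perplexity) model
  maxTokens: number;        // single token budget shared by both models
  temperature: number;
}

function loadModelConfig(
  env: Record<string, string | undefined> = process.env
): ModelConfig {
  return {
    model: env.MODEL ?? "claude-3-7-sonnet-20250219",
    perplexityModel: env.PERPLEXITY_MODEL ?? "sonar-pro",
    // Defaults mirror the values committed here (64k tokens, temperature 0.2).
    maxTokens: Number(env.MAX_TOKENS ?? 64000),
    temperature: Number(env.TEMPERATURE ?? 0.2),
  };
}
```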

@@ -33,7 +33,7 @@ MCP (Model Control Protocol) provides the easiest way to get started with Task M
"PERPLEXITY_API_KEY": "YOUR_PERPLEXITY_API_KEY_HERE",
"MODEL": "claude-3-7-sonnet-20250219",
"PERPLEXITY_MODEL": "sonar-pro",
"MAX_TOKENS": 128000,
"MAX_TOKENS": 64000,
"TEMPERATURE": 0.2,
"DEFAULT_SUBTASKS": 5,
"DEFAULT_PRIORITY": "medium"

@@ -23,7 +23,7 @@ MCP (Model Control Protocol) provides the easiest way to get started with Task M
"PERPLEXITY_API_KEY": "YOUR_PERPLEXITY_API_KEY_HERE",
"MODEL": "claude-3-7-sonnet-20250219",
"PERPLEXITY_MODEL": "sonar-pro",
"MAX_TOKENS": 128000,
"MAX_TOKENS": 64000,
"TEMPERATURE": 0.2,
"DEFAULT_SUBTASKS": 5,
"DEFAULT_PRIORITY": "medium"