chore(ai): Reduces context window back from 128k to 64k until we decouple context windows between main and research models.

Eyal Toledano
2025-04-11 13:33:02 -04:00
parent d3d9dc6ebe
commit 9c0ed3c799
4 changed files with 3 additions and 4 deletions


@@ -5,7 +5,7 @@ PERPLEXITY_API_KEY=your_perplexity_api_key_here # Format: pplx-...
 # Model Configuration
 MODEL=claude-3-7-sonnet-20250219 # Recommended models: claude-3-7-sonnet-20250219, claude-3-opus-20240229
 PERPLEXITY_MODEL=sonar-pro # Perplexity model for research-backed subtasks
-MAX_TOKENS=128000 # Maximum tokens for model responses
+MAX_TOKENS=64000 # Maximum tokens for model responses
 TEMPERATURE=0.2 # Temperature for model responses (0.0-1.0)
 # Logging Configuration
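The settings changed above would typically be read from the environment at startup. A minimal sketch of how that loading might look (the variable names and defaults follow the diff, but the loading code itself is hypothetical, not taken from this repository):

```python
import os

# Hypothetical config loader for the env vars shown in the diff.
# Defaults mirror the values in the updated .env.example.
MAX_TOKENS = int(os.environ.get("MAX_TOKENS", "64000"))
TEMPERATURE = float(os.environ.get("TEMPERATURE", "0.2"))

# Sanity checks matching the comments in the example file.
assert MAX_TOKENS > 0
assert 0.0 <= TEMPERATURE <= 1.0
```

Because `MAX_TOKENS` is a single shared setting here, both the main model and the Perplexity research model inherit the same response cap, which is why the value had to drop to 64k until the two are decoupled.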