chore(rules): adjusts rules based on the new config approach.

Eyal Toledano
2025-04-21 22:44:40 -04:00
parent 515dcae965
commit abdc15eab2
7 changed files with 99 additions and 87 deletions


@@ -327,22 +327,23 @@ This document provides a detailed reference for interacting with Taskmaster, cov
## Environment Variables Configuration
-Taskmaster's behavior can be customized via environment variables. These affect both CLI and MCP server operation:
+Taskmaster primarily uses the `.taskmasterconfig` file for configuration (models, parameters, logging level, etc.), managed via the `task-master models --setup` command. API keys are stored in either the `.env` file (for CLI usage) or the `mcp.json` file (for MCP usage).
-* **ANTHROPIC_API_KEY** (Required): Your Anthropic API key for Claude.
-* **MODEL**: Claude model to use (default: `claude-3-opus-20240229`).
-* **MAX_TOKENS**: Maximum tokens for AI responses (default: 8192).
-* **TEMPERATURE**: Temperature for AI model responses (default: 0.7).
-* **DEBUG**: Enable debug logging (`true`/`false`, default: `false`).
-* **LOG_LEVEL**: Console output level (`debug`, `info`, `warn`, `error`, default: `info`).
-* **DEFAULT_SUBTASKS**: Default number of subtasks for `expand` (default: 5).
-* **DEFAULT_PRIORITY**: Default priority for new tasks (default: `medium`).
-* **PROJECT_NAME**: Project name used in metadata.
-* **PROJECT_VERSION**: Project version used in metadata.
-* **PERPLEXITY_API_KEY**: API key for Perplexity AI (for `--research` flags).
-* **PERPLEXITY_MODEL**: Perplexity model to use (default: `sonar-medium-online`).
+Environment variables are used **only** for sensitive API keys related to AI providers and specific overrides like the Ollama base URL:
-Set these in your `.env` file in the project root or in your environment before running Taskmaster.
+* **API Keys (Required for corresponding provider):**
+    * `ANTHROPIC_API_KEY`
+    * `PERPLEXITY_API_KEY`
+    * `OPENAI_API_KEY`
+    * `GOOGLE_API_KEY`
+    * `GROK_API_KEY`
+    * `MISTRAL_API_KEY`
+    * `AZURE_OPENAI_API_KEY` (Requires `AZURE_OPENAI_ENDPOINT` too)
+* **Endpoints (Optional/Provider Specific):**
+    * `AZURE_OPENAI_ENDPOINT`
+    * `OLLAMA_BASE_URL` (Default: `http://localhost:11434/api`)
+Set these in your `.env` file in the project root (for CLI use) or within the `env` section of your `.cursor/mcp.json` file (for MCP/Cursor integration). All other settings like model choice, max tokens, temperature, logging level, etc., are now managed in `.taskmasterconfig` via `task-master models --setup`.
---
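To make the key placement described in the hunk above concrete, here is a minimal sketch of a `.env` file for CLI use. Only the variable names come from the documented list; the values are placeholders, and the Azure/Ollama lines simply restate the notes attached to those keys.

```
# .env in the project root (CLI usage); values below are placeholders
ANTHROPIC_API_KEY=sk-ant-xxxxxxxxxxxxxxxx
PERPLEXITY_API_KEY=pplx-xxxxxxxxxxxxxxxx

# Azure requires the endpoint alongside its key
AZURE_OPENAI_API_KEY=xxxxxxxxxxxxxxxx
AZURE_OPENAI_ENDPOINT=https://example-resource.openai.azure.com/

# Optional override; defaults to http://localhost:11434/api
OLLAMA_BASE_URL=http://localhost:11434/api
```

Model choice, max tokens, temperature, and logging level no longer belong here; under the new approach they live in `.taskmasterconfig` and are edited through `task-master models --setup`.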
@@ -351,3 +352,8 @@ For implementation details:
-* MCP server: See [`mcp.mdc`](mdc:.cursor/rules/mcp.mdc)
-* Task structure: See [`tasks.mdc`](mdc:.cursor/rules/tasks.mdc)
-* Workflow: See [`dev_workflow.mdc`](mdc:.cursor/rules/dev_workflow.mdc)
+* CLI commands: See [`commands.mdc`](mdc:.cursor/rules/commands.mdc)
+* MCP server: See [`mcp.mdc`](mdc:.cursor/rules/mcp.mdc)
+* Task structure: See [`tasks.mdc`](mdc:.cursor/rules/tasks.mdc)
+* Workflow: See [`dev_workflow.mdc`](mdc:.cursor/rules/dev_workflow.mdc)
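For MCP/Cursor integration, the first hunk places the same keys under the `env` section of the server entry in `.cursor/mcp.json`. A minimal sketch follows; the server name, `command`, and `args` values are placeholders for illustration (not the actual registration), and only the `env` key names come from the documented list.

```json
{
  "mcpServers": {
    "taskmaster-ai": {
      "command": "npx",
      "args": ["-y", "your-taskmaster-mcp-package"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-xxxxxxxxxxxxxxxx",
        "PERPLEXITY_API_KEY": "pplx-xxxxxxxxxxxxxxxx"
      }
    }
  }
}
```

Everything else (models, max tokens, temperature, log level) stays in `.taskmasterconfig`, managed via `task-master models --setup`.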