chore(docs): update docs and rules related to model management.

Eyal Toledano
2025-04-27 17:32:59 -04:00
parent c8722b0a7a
commit 3516efdc3b
6 changed files with 62 additions and 7 deletions

View File

@@ -32,6 +32,18 @@ This rule guides AI assistants on how to view, configure, and interact with the
- ❌ **DON'T:** `models(setMain='openai:gpt-4o')` or `task-master models --set-main=openai:gpt-4o`
- The tool/command will automatically determine the provider based on the model ID.
- **Setting Custom Models (Ollama/OpenRouter):**
- To set a custom Ollama or OpenRouter model ID that is not in Taskmaster's internal list of supported models:
- **MCP Tool:** Use `models` with `set<Role>` and **also** `ollama: true` or `openrouter: true`.
- Example: `models(setMain='my-custom-ollama-model', ollama=true)`
- Example: `models(setMain='some-openrouter-model', openrouter=true)`
- **CLI Command:** Use `task-master models` with `--set-<role>` and **also** `--ollama` or `--openrouter`.
- Example: `task-master models --set-main=my-custom-ollama-model --ollama`
- Example: `task-master models --set-main=some-openrouter-model --openrouter`
- **Interactive Setup:** Use `task-master models --setup` and select the `Ollama (Enter Custom ID)` or `OpenRouter (Enter Custom ID)` options.
- **OpenRouter Validation:** When setting a custom OpenRouter model, Taskmaster attempts to validate the ID against the live OpenRouter API.
- **Ollama:** No live validation occurs for custom Ollama models; ensure the model is available on your Ollama server.
- **Supported Providers & Required API Keys:**
- Task Master integrates with various providers via the Vercel AI SDK.
- **API keys are essential** for most providers and must be configured correctly.
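For CLI usage these keys are read from a `.env` file in the project root. The entries below are a minimal, illustrative sketch (set only the variables for the providers you actually use; exact variable names may differ per provider):

```bash
# .env (project root): read by the Task Master CLI
# Illustrative placeholders; replace with real keys for the providers you use.
ANTHROPIC_API_KEY=your-key-here
PERPLEXITY_API_KEY=pplx-your-key-here
OPENROUTER_API_KEY=your-key-here
```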

View File

@@ -59,21 +59,25 @@ This document provides a detailed reference for interacting with Taskmaster, cov
### 2. Manage Models (`models`)
* **MCP Tool:** `models`
* **CLI Command:** `task-master models [options]`
* **Description:** `View the current AI model configuration or set specific models for different roles (main, research, fallback).`
* **Description:** `View the current AI model configuration or set specific models for different roles (main, research, fallback). Allows setting custom model IDs for Ollama and OpenRouter.`
* **Key MCP Parameters/Options:**
* `setMain <model_id>`: `Set the primary model ID for task generation/updates.` (CLI: `--set-main <model_id>`)
* `setResearch <model_id>`: `Set the model ID for research-backed operations.` (CLI: `--set-research <model_id>`)
* `setFallback <model_id>`: `Set the model ID to use if the primary fails.` (CLI: `--set-fallback <model_id>`)
* `ollama <boolean>`: `Indicates the set model ID is a custom Ollama model.` (CLI: `--ollama`)
* `openrouter <boolean>`: `Indicates the set model ID is a custom OpenRouter model.` (CLI: `--openrouter`)
* `listAvailableModels <boolean>`: `If true, lists available models not currently assigned to a role.` (CLI: No direct equivalent; the CLI lists available models automatically)
* `projectRoot <string>`: `Optional. Absolute path to the project root directory.` (CLI: Determined automatically)
* **Key CLI Options:**
* `--set-main <model_id>`: `Set the primary model.`
* `--set-research <model_id>`: `Set the research model.`
* `--set-fallback <model_id>`: `Set the fallback model.`
* `--setup`: `Run interactive setup to configure models and other settings.`
* **Usage (MCP):** Call without set flags to get current config. Use `setMain`, `setResearch`, or `setFallback` with a valid model ID to update the configuration. Use `listAvailableModels: true` to get a list of unassigned models.
* **Usage (CLI):** Run without flags to view current configuration and available models. Use set flags to update specific roles. Use `--setup` for guided configuration.
* **Notes:** Configuration is stored in `.taskmasterconfig` in the project root. This command/tool modifies that file. Use `listAvailableModels` to ensure the selected model is supported.
* `--ollama`: `Specify that the provided model ID is for Ollama (use with --set-*).`
* `--openrouter`: `Specify that the provided model ID is for OpenRouter (use with --set-*). Validates against OpenRouter API.`
* `--setup`: `Run interactive setup to configure models, including custom Ollama/OpenRouter IDs.`
* **Usage (MCP):** Call without set flags to get the current config. Use `setMain`, `setResearch`, or `setFallback` with a valid model ID to update the configuration. Use `listAvailableModels: true` to get a list of unassigned models. To set a custom model, provide the model ID and set `ollama: true` or `openrouter: true` (see the example arguments after this list).
* **Usage (CLI):** Run without flags to view current configuration and available models. Use set flags to update specific roles. Use `--setup` for guided configuration, including custom models. To set a custom model via flags, use `--set-<role>=<model_id>` along with either `--ollama` or `--openrouter`.
* **Notes:** Configuration is stored in `.taskmasterconfig` in the project root. This command/tool modifies that file. Use `listAvailableModels` or `task-master models` to see internally supported models. OpenRouter custom models are validated against their live API. Ollama custom models are not validated live.
* **API note:** API keys for the selected AI providers (based on the configured models) must exist in the `mcp.json` file to be accessible in the MCP context, and in the local `.env` file for the CLI to read them.
* **Warning:** DO NOT MANUALLY EDIT THE `.taskmasterconfig` FILE. Use the included commands, in either the MCP or CLI format, as needed. Always prioritize MCP tools when available and use the CLI as a fallback.
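To make the MCP parameters above concrete, the sketch below shows the kind of argument object an MCP client might send to the `models` tool to assign a custom Ollama model to the main role. The parameter names come from the list above; the overall shape and the project path are placeholders that depend on your MCP client:

```json
{
  "setMain": "my-custom-ollama-model",
  "ollama": true,
  "projectRoot": "/absolute/path/to/your/project"
}
```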

View File

@@ -209,3 +209,30 @@ task-master add-task --prompt="Description" --priority=high
# Initialize a new project with Task Master structure
task-master init
```
## Configure AI Models
```bash
# View current AI model configuration and API key status
task-master models
# Set the primary model for generation/updates (provider inferred if known)
task-master models --set-main=claude-3-opus-20240229
# Set the research model
task-master models --set-research=sonar-pro
# Set the fallback model
task-master models --set-fallback=claude-3-haiku-20240307
# Set a custom Ollama model for the main role
task-master models --set-main=my-local-llama --ollama
# Set a custom OpenRouter model for the research role
task-master models --set-research=google/gemini-pro --openrouter
# Run interactive setup to configure models, including custom ones
task-master models --setup
```
Configuration is stored in `.taskmasterconfig` in your project root. API keys are still managed via `.env` or MCP configuration. Use `task-master models` without flags to see available built-in models. Use `--setup` for a guided experience.
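For orientation only, a file produced by the commands above might look roughly like the sketch below. The field names are illustrative, not a schema reference; let `task-master models` and `--setup` write the file rather than editing it by hand:

```json
{
  "models": {
    "main": { "provider": "anthropic", "modelId": "claude-3-opus-20240229" },
    "research": { "provider": "perplexity", "modelId": "sonar-pro" },
    "fallback": { "provider": "anthropic", "modelId": "claude-3-haiku-20240307" }
  }
}
```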

View File

@@ -6,7 +6,7 @@ Taskmaster uses two primary methods for configuration:
- This JSON file stores most configuration settings, including AI model selections, parameters, logging levels, and project defaults.
- **Location:** Create this file in the root directory of your project.
- **Management:** Use the `task-master models --setup` command (or `models` MCP tool) to interactively create and manage this file. Manual editing is possible but not recommended unless you understand the structure.
- **Management:** Use the `task-master models --setup` command (or `models` MCP tool) to interactively create and manage this file. You can also set specific models directly using `task-master models --set-<role>=<model_id>`, adding `--ollama` or `--openrouter` flags for custom models. Manual editing is possible but not recommended unless you understand the structure.
- **Example Structure:**
```json
{
@@ -82,7 +82,7 @@ PERPLEXITY_API_KEY=pplx-your-key-here
### Configuration Errors
- If Task Master reports errors about missing configuration or cannot find `.taskmasterconfig`, run `task-master models --setup` in your project root to create or repair the file.
- Ensure API keys are correctly placed in your `.env` file (for CLI) or `.cursor/mcp.json` (for MCP) and are valid.
- Ensure API keys are correctly placed in your `.env` file (for CLI) or `.cursor/mcp.json` (for MCP) and are valid for the providers selected in `.taskmasterconfig`.
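For MCP usage, the keys belong in the `env` block of the Taskmaster server entry in `.cursor/mcp.json`. The server name, command, and arguments below are illustrative; keep whatever entry your file already defines and only add or update the `env` values:

```json
{
  "mcpServers": {
    "taskmaster-ai": {
      "command": "npx",
      "args": ["-y", "task-master-ai"],
      "env": {
        "ANTHROPIC_API_KEY": "your-key-here",
        "PERPLEXITY_API_KEY": "pplx-your-key-here"
      }
    }
  }
}
```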
### If `task-master init` doesn't respond:

View File

@@ -1976,6 +1976,18 @@ function registerCommands(programInstance) {
'--ollama',
'Allow setting a custom Ollama model ID (use with --set-*) '
)
.addHelpText(
'after',
`
Examples:
$ task-master models # View current configuration
$ task-master models --set-main gpt-4o # Set main model (provider inferred)
$ task-master models --set-research sonar-pro # Set research model
$ task-master models --set-fallback claude-3-5-sonnet-20241022 # Set fallback
$ task-master models --set-main my-custom-model --ollama # Set custom Ollama model for main role
$ task-master models --set-main some/other-model --openrouter # Set custom OpenRouter model for main role
$ task-master models --setup # Run interactive setup`
)
.action(async (options) => {
const projectRoot = findProjectRoot(); // Find project root for context