feat(models): implement custom model support for ollama/openrouter

Adds the ability for users to specify custom model IDs for Ollama and OpenRouter providers, bypassing the internal supported model list.

    - Introduces --ollama and --openrouter flags for the 'task-master models --set-<role>' command.
    - Updates the interactive 'task-master models --setup' to include options for entering custom Ollama/OpenRouter IDs.
    - Implements live validation against the OpenRouter API when a custom OpenRouter ID is provided.
    - Refines the model setting logic to prioritize explicit provider flags/choices.
    - Adds warnings when custom models are set.
    - Updates the changeset file.
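The live-validation step described above can be sketched as follows. This is an illustrative sketch, not the commit's actual implementation: it assumes OpenRouter's public model-listing endpoint (`GET https://openrouter.ai/api/v1/models`), and the function names are invented for this example.

```javascript
// Pure check: is the given ID present in a fetched model list?
function isKnownModelId(modelId, models) {
  return Array.isArray(models) && models.some((m) => m.id === modelId);
}

// Fetch the live model list and validate the custom ID. Returns false on
// network or API failure so the caller can fall back to a warning rather
// than crashing the CLI. (Endpoint assumed; see lead-in.)
async function validateOpenRouterModelId(modelId) {
  try {
    const res = await fetch('https://openrouter.ai/api/v1/models');
    if (!res.ok) return false;
    const { data } = await res.json();
    return isKnownModelId(modelId, data);
  } catch {
    return false;
  }
}
```

No equivalent live check is possible for Ollama, where a local instance can serve arbitrary model tags, which is presumably one reason the commit emits warnings whenever a custom model is set.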
Author: Eyal Toledano
Date:   2025-04-27 17:25:54 -04:00
Parent: ed79d4f473
Commit: c8722b0a7a

12 changed files with 10157 additions and 201 deletions

@@ -1779,13 +1779,13 @@ export async function generateGoogleObject({
 ### Details:
-## 28. Implement `openrouter.js` Provider Module [pending]
+## 28. Implement `openrouter.js` Provider Module [in-progress]
 ### Dependencies: None
 ### Description: Create and implement the `openrouter.js` module within `src/ai-providers/`. This module should contain functions to interact with various models via OpenRouter using the **`@openrouter/ai-sdk-provider` library**, adhering to the standardized input/output format defined for `ai-services-unified.js`. Note the specific library used.
 ### Details:
-## 29. Implement `xai.js` Provider Module using Vercel AI SDK [in-progress]
+## 29. Implement `xai.js` Provider Module using Vercel AI SDK [done]
 ### Dependencies: None
 ### Description: Create and implement the `xai.js` module within `src/ai-providers/`. This module should contain functions to interact with xAI models (e.g., Grok) using the **Vercel AI SDK (`@ai-sdk/xai`)**, adhering to the standardized input/output format defined for `ai-services-unified.js`.
 ### Details: