feat(ai): Add OpenRouter AI provider support

Integrates the OpenRouter AI provider using the Vercel AI SDK adapter (@openrouter/ai-sdk-provider). This allows users to configure and use models available through the OpenRouter platform.

- Added src/ai-providers/openrouter.js with standard Vercel AI SDK wrapper functions (generateText, streamText, generateObject).

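For reference, a minimal sketch of what such a wrapper module can look like when built on the Vercel AI SDK helpers; the exported function names and option shapes here are illustrative, not necessarily what this commit ships:

```js
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { generateText, streamText, generateObject } from 'ai';

// Build a provider instance from the API key resolved by the unified service layer.
function getClient(apiKey) {
	return createOpenRouter({ apiKey });
}

// Hypothetical wrapper name; shown only to illustrate the pattern.
export async function generateOpenRouterText({ apiKey, modelId, messages, maxTokens, temperature }) {
	const openrouter = getClient(apiKey);
	const { text } = await generateText({
		model: openrouter.chat(modelId),
		messages,
		maxTokens,
		temperature
	});
	return text;
}

// streamOpenRouterText and generateOpenRouterObject follow the same pattern,
// delegating to streamText() and generateObject() from the Vercel AI SDK.
```
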
- Updated ai-services-unified.js to include the OpenRouter provider in the PROVIDER_FUNCTIONS map and API key resolution logic.

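Roughly, that registration might look like the following sketch (the exact map shape and resolver in ai-services-unified.js may differ; the wrapper names continue the hypothetical module above):

```js
import * as openrouter from './ai-providers/openrouter.js';

const PROVIDER_FUNCTIONS = {
	// ...existing providers (anthropic, openai, ...)
	openrouter: {
		generateText: openrouter.generateOpenRouterText,
		streamText: openrouter.streamOpenRouterText,
		generateObject: openrouter.generateOpenRouterObject
	}
};

// Map each provider to the environment variable holding its API key.
function resolveApiKey(providerName, session) {
	const keyMap = {
		// ...existing entries
		openrouter: 'OPENROUTER_API_KEY'
	};
	const envVarName = keyMap[providerName];
	return session?.env?.[envVarName] ?? process.env[envVarName];
}
```
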
- Verified config-manager.js handles OpenRouter API key checks correctly.

- Users can configure OpenRouter models via .taskmasterconfig using the task-master models command or MCP models tool. Requires the OPENROUTER_API_KEY environment variable.

- Enhanced error handling in ai-services-unified.js to provide clearer messages when generateObjectService fails due to lack of underlying tool support in the selected model/provider endpoint.
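As a hedged sketch of that error handling (the actual detection logic and message text in ai-services-unified.js likely differ):

```js
async function attemptGenerateObject(providerFns, callParams) {
	try {
		return await providerFns.generateObject(callParams);
	} catch (err) {
		// generateObject relies on tool/function-calling support; some OpenRouter
		// endpoints route to models that do not offer it, so surface that clearly.
		const original = err?.message ?? String(err);
		if (/tool|function call|No endpoints found/i.test(original)) {
			throw new Error(
				`generateObjectService failed for model "${callParams.modelId}": the selected ` +
				`model/provider endpoint does not appear to support structured output (tool use). ` +
				`Choose a different model or endpoint. Original error: ${original}`
			);
		}
		throw err;
	}
}
```
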
Eyal Toledano
2025-04-27 18:23:56 -04:00
parent 8609e24ed8
commit 5ffa5ae2a4
8 changed files with 255 additions and 34 deletions

@@ -356,34 +356,6 @@
"allowed_roles": ["main", "fallback"],
"max_tokens": 1048576
},
{
"id": "meta-llama/llama-4-maverick:free",
"swe_score": 0,
"cost_per_1m_tokens": { "input": 0, "output": 0 },
"allowed_roles": ["main", "fallback"],
"max_tokens": 256000
},
{
"id": "meta-llama/llama-4-maverick",
"swe_score": 0,
"cost_per_1m_tokens": { "input": 0.17, "output": 0.6 },
"allowed_roles": ["main", "fallback"],
"max_tokens": 1048576
},
{
"id": "meta-llama/llama-4-scout:free",
"swe_score": 0,
"cost_per_1m_tokens": { "input": 0, "output": 0 },
"allowed_roles": ["main", "fallback"],
"max_tokens": 512000
},
{
"id": "meta-llama/llama-4-scout",
"swe_score": 0,
"cost_per_1m_tokens": { "input": 0.08, "output": 0.3 },
"allowed_roles": ["main", "fallback"],
"max_tokens": 1048576
},
{
"id": "google/gemma-3-12b-it:free",
"swe_score": 0,