feat(ai): Integrate OpenAI provider and enhance model config

- Add OpenAI provider implementation using `@ai-sdk/openai`.
- Update `models` command/tool to display API key status for configured providers.
- Implement model-specific `maxTokens` override logic in `config-manager.js` using `supported-models.json` (see the sketch below).
- Improve AI error message parsing in `ai-services-unified.js` for better clarity.
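
A minimal sketch of the model-specific `maxTokens` override idea, assuming `supported-models.json` maps each provider to an array of model entries with an optional `max_tokens` field; the actual schema and function names in `config-manager.js` may differ:

```js
// Hypothetical illustration only — not the actual config-manager.js code.
const fs = require('fs');
const path = require('path');

// Assumed shape: { "openai": [{ "id": "gpt-4o", "max_tokens": 16384 }, ...], ... }
const supportedModels = JSON.parse(
  fs.readFileSync(path.join(__dirname, 'supported-models.json'), 'utf-8')
);

function resolveMaxTokens(provider, modelId, configuredMaxTokens) {
  const entry = (supportedModels[provider] || []).find((m) => m.id === modelId);
  if (entry && typeof entry.max_tokens === 'number') {
    // Clamp the user-configured value to the model's documented limit.
    return Math.min(configuredMaxTokens, entry.max_tokens);
  }
  // No model-specific limit known; fall back to the configured value.
  return configuredMaxTokens;
}
```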
Eyal Toledano
2025-04-27 03:56:23 -04:00
parent 842eaf7224
commit 2517bc112c
21 changed files with 1350 additions and 662 deletions


@@ -48,7 +48,7 @@ This rule guides AI assistants on how to view, configure, and interact with the
- **`mistral`**: Requires `MISTRAL_API_KEY`.
- **`azure`**: Requires `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT`.
- **`openrouter`**: Requires `OPENROUTER_API_KEY`.
- - **`ollama`**: Typically requires `OLLAMA_API_KEY` *and* `OLLAMA_BASE_URL` (default: `http://localhost:11434/api`). *Check specific setup.*
+ - **`ollama`**: Might require `OLLAMA_API_KEY` (not currently supported) *and* `OLLAMA_BASE_URL` (default: `http://localhost:11434/api`). *Check specific setup.*
- **Troubleshooting:**
- If AI commands fail (especially in MCP context):
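
As a companion to the environment-variable list above, here is a hedged sketch of how a provider's credentials and base URL could be read from the environment. The `resolveProviderEnv` helper and its return shape are purely illustrative and not part of the actual codebase; only the variable names come from the rule text:

```js
// Illustrative helper only — env var names match the rule text above,
// but this function is not the project's actual implementation.
function resolveProviderEnv(provider) {
  switch (provider) {
    case 'openai':
      return { apiKey: process.env.OPENAI_API_KEY };
    case 'mistral':
      return { apiKey: process.env.MISTRAL_API_KEY };
    case 'azure':
      return {
        apiKey: process.env.AZURE_OPENAI_API_KEY,
        endpoint: process.env.AZURE_OPENAI_ENDPOINT,
      };
    case 'openrouter':
      return { apiKey: process.env.OPENROUTER_API_KEY };
    case 'ollama':
      // No API key is currently supported; only the base URL matters.
      return {
        baseURL: process.env.OLLAMA_BASE_URL || 'http://localhost:11434/api',
      };
    default:
      return {};
  }
}
```

A missing or empty value from a lookup like this is a common cause of the failures described in the troubleshooting notes above.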