fix: support Azure provider with reasoning models

- Add azure, openrouter, bedrock, and ollama to VALIDATED_PROVIDERS array
- Add Azure reasoning models (GPT-5, o1, o3, o3-mini, o4-mini) to supported-models.json
- Implement automatic API endpoint detection for Azure reasoning models
- Add dual endpoint support (chat/completions vs responses) in AzureProvider
- Add smart URL adjustment logic for different Azure configurations (see the sketch after this list)
- Maintain backward compatibility with existing Azure setups
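
The endpoint detection, dual endpoint support, and URL adjustment above amount to checking a model's new `api_type` flag and rewriting the request path. A minimal JavaScript sketch of that idea, using a hypothetical `resolveAzureEndpoint` helper rather than the actual AzureProvider code:

```js
// Sketch only: the helper name and URL handling are illustrative assumptions,
// not the real AzureProvider implementation.
function resolveAzureEndpoint(baseURL, modelConfig) {
  // Models flagged with api_type "responses" in supported-models.json
  // (GPT-5, o1, o3, o3-mini, o4-mini) need Azure's /responses endpoint
  // instead of /chat/completions.
  const usesResponsesApi = modelConfig?.api_type === 'responses';

  // Strip any endpoint suffix the configured base URL already carries,
  // then append the one this model requires.
  const trimmed = baseURL.replace(/\/(chat\/completions|responses)\/?$/, '');
  return usesResponsesApi ? `${trimmed}/responses` : `${trimmed}/chat/completions`;
}

// Configurations whose models carry no api_type flag keep resolving to
// /chat/completions, which is what the backward-compatibility bullet refers to.
```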

Fixes #638

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Ralph Khreish <Crunchyman-ralph@users.noreply.github.com>

Author: claude[bot]
Date: 2025-10-14 08:00:46 +00:00
Parent: 0df6595245
Commit: 59ee1e7baa
3 changed files with 113 additions and 3 deletions

supported-models.json

@@ -997,6 +997,67 @@
"allowed_roles": ["main", "fallback"],
"max_tokens": 16384,
"supported": true
},
{
"id": "gpt-5",
"swe_score": 0.749,
"cost_per_1m_tokens": {
"input": 5.0,
"output": 20.0
},
"allowed_roles": ["main", "fallback"],
"max_tokens": 100000,
"temperature": 1,
"supported": true,
"api_type": "responses"
},
{
"id": "o1",
"swe_score": 0.489,
"cost_per_1m_tokens": {
"input": 15.0,
"output": 60.0
},
"allowed_roles": ["main"],
"max_tokens": 100000,
"supported": true,
"api_type": "responses"
},
{
"id": "o3",
"swe_score": 0.5,
"cost_per_1m_tokens": {
"input": 2.0,
"output": 8.0
},
"allowed_roles": ["main", "fallback"],
"max_tokens": 100000,
"supported": true,
"api_type": "responses"
},
{
"id": "o3-mini",
"swe_score": 0.493,
"cost_per_1m_tokens": {
"input": 1.1,
"output": 4.4
},
"allowed_roles": ["main"],
"max_tokens": 100000,
"supported": true,
"api_type": "responses"
},
{
"id": "o4-mini",
"swe_score": 0.45,
"cost_per_1m_tokens": {
"input": 1.1,
"output": 4.4
},
"allowed_roles": ["main", "fallback"],
"max_tokens": 100000,
"supported": true,
"api_type": "responses"
}
],
"bedrock": [