Mirror of https://github.com/leonvanzyl/autocoder.git (synced 2026-03-18 19:33:09 +00:00)
refactor: make Settings UI the single source of truth for API provider
Remove legacy env-var-based provider/mode detection that caused misleading UI badges (e.g., GLM badge showing when Settings was set to Claude).

Key changes:
- Remove _is_glm_mode() and _is_ollama_mode() env-var sniffing functions from server/routers/settings.py; derive glm_mode/ollama_mode purely from the api_provider setting
- Remove `import os` from the settings router (no longer needed)
- Update schema comments to reflect settings-based derivation
- Remove "(configured via .env)" from badge tooltips in App.tsx
- Remove Kimi/GLM/Ollama/Playwright-headless sections from .env.example; add a note pointing to the Settings UI
- Update CLAUDE.md and README.md documentation to reference the Settings UI for alternative provider configuration
- Update model IDs from claude-opus-4-5-20251101 to claude-opus-4-6 across registry, client, chat sessions, tests, and UI defaults
- Add LEGACY_MODEL_MAP with auto-migration in get_all_settings()
- Show model ID subtitle in SettingsModal model selector
- Add Vertex passthrough test for claude-opus-4-6 (no date suffix)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
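For illustration, a minimal Python sketch of the two mechanisms the message describes: deriving provider flags from the saved api_provider setting instead of sniffing env vars, and auto-migrating legacy model IDs on read. All names and signatures here (derive_provider_flags, migrate_model_id, the settings dict shape) are assumptions for the sketch, not the repository's actual API in server/routers/settings.py.

```python
# Hypothetical sketch only: names and shapes are assumed, not the repo's code.

# Retired model IDs mapped to replacements, mirroring the commit's
# "LEGACY_MODEL_MAP with auto-migration in get_all_settings()".
LEGACY_MODEL_MAP = {
    "claude-opus-4-5-20251101": "claude-opus-4-6",  # date-suffixed -> plain ID
}

def derive_provider_flags(api_provider: str) -> dict[str, bool]:
    """Derive badge flags purely from the api_provider setting,
    replacing the removed _is_glm_mode()/_is_ollama_mode() env sniffing."""
    return {
        "glm_mode": api_provider == "glm",
        "ollama_mode": api_provider == "ollama",
    }

def migrate_model_id(model_id: str) -> str:
    """Rewrite retired model IDs to their current equivalents when settings
    are read, so stale stored values keep working after an upgrade."""
    return LEGACY_MODEL_MAP.get(model_id, model_id)

# Example: a settings payload saved before the rename still resolves cleanly,
# and the badge flags come from the setting alone.
settings = {"api_provider": "claude", "model": "claude-opus-4-5-20251101"}
settings["model"] = migrate_model_id(settings["model"])
settings.update(derive_provider_flags(settings["api_provider"]))
assert settings["model"] == "claude-opus-4-6"
assert settings["glm_mode"] is False and settings["ollama_mode"] is False
```

Deriving the flags from a single stored setting, rather than from env vars, is what makes the Settings UI the single source of truth: there is no second signal for the UI badges to disagree with.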
1 changed file: .env.example (54 lines changed)
--- a/.env.example
+++ b/.env.example
@@ -9,11 +9,6 @@
 # - webkit: Safari engine
 # - msedge: Microsoft Edge
 # PLAYWRIGHT_BROWSER=firefox
-#
-# PLAYWRIGHT_HEADLESS: Run browser without visible window
-# - true: Browser runs in background, saves CPU (default)
-# - false: Browser opens a visible window (useful for debugging)
-# PLAYWRIGHT_HEADLESS=true
 
 # Extra Read Paths (Optional)
 # Comma-separated list of absolute paths for read-only access to external directories.
@@ -25,56 +20,17 @@
 # Google Cloud Vertex AI Configuration (Optional)
 # To use Claude via Vertex AI on Google Cloud Platform, uncomment and set these variables.
 # Requires: gcloud CLI installed and authenticated (run: gcloud auth application-default login)
-# Note: Use @ instead of - in model names (e.g., claude-opus-4-5@20251101)
+# Note: Use @ instead of - in model names for date-suffixed models (e.g., claude-sonnet-4-5@20250929)
 #
 # CLAUDE_CODE_USE_VERTEX=1
 # CLOUD_ML_REGION=us-east5
 # ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
-# ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-5@20251101
+# ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-6
 # ANTHROPIC_DEFAULT_SONNET_MODEL=claude-sonnet-4-5@20250929
 # ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-3-5-haiku@20241022
 
 # ===================
-# Alternative API Providers
+# Alternative API Providers (GLM, Ollama, Kimi, Custom)
 # ===================
-# NOTE: These env vars are the legacy way to configure providers.
-# The recommended way is to use the Settings UI (API Provider section).
-# UI settings take precedence when api_provider != "claude".
-
-# Kimi K2.5 (Moonshot) Configuration (Optional)
-# Get an API key at: https://kimi.com
-#
-# ANTHROPIC_BASE_URL=https://api.kimi.com/coding/
-# ANTHROPIC_API_KEY=your-kimi-api-key
-# ANTHROPIC_DEFAULT_SONNET_MODEL=kimi-k2.5
-# ANTHROPIC_DEFAULT_OPUS_MODEL=kimi-k2.5
-# ANTHROPIC_DEFAULT_HAIKU_MODEL=kimi-k2.5
-
-# GLM/Alternative API Configuration (Optional)
-# To use Zhipu AI's GLM models instead of Claude, uncomment and set these variables.
-# This only affects AutoForge - your global Claude Code settings remain unchanged.
-# Get an API key at: https://z.ai/subscribe
-#
-# ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
-# ANTHROPIC_AUTH_TOKEN=your-zhipu-api-key
-# API_TIMEOUT_MS=3000000
-# ANTHROPIC_DEFAULT_SONNET_MODEL=glm-4.7
-# ANTHROPIC_DEFAULT_OPUS_MODEL=glm-4.7
-# ANTHROPIC_DEFAULT_HAIKU_MODEL=glm-4.5-air
-
-# Ollama Local Model Configuration (Optional)
-# To use local models via Ollama instead of Claude, uncomment and set these variables.
-# Requires Ollama v0.14.0+ with Anthropic API compatibility.
-# See: https://ollama.com/blog/claude
-#
-# ANTHROPIC_BASE_URL=http://localhost:11434
-# ANTHROPIC_AUTH_TOKEN=ollama
-# API_TIMEOUT_MS=3000000
-# ANTHROPIC_DEFAULT_SONNET_MODEL=qwen3-coder
-# ANTHROPIC_DEFAULT_OPUS_MODEL=qwen3-coder
-# ANTHROPIC_DEFAULT_HAIKU_MODEL=qwen3-coder
-#
-# Model recommendations:
-# - For best results, use a capable coding model like qwen3-coder or deepseek-coder-v2
-# - You can use the same model for all tiers, or different models per tier
-# - Larger models (70B+) work best for Opus tier, smaller (7B-20B) for Haiku
+# Configure alternative providers via the Settings UI (gear icon > API Provider).
+# The Settings UI is the recommended way to switch providers and models.