docs: auto-update documentation based on changes in next branch

This PR was automatically generated to update documentation based on recent changes.

Original commit: feat: Add Codex CLI provider with OAuth authentication (#1273)

Co-authored-by: Ralph Khreish <35776126+Crunchyman-ralph@users.noreply.github.com>

Co-authored-by: Claude <claude-assistant@anthropic.com>
Author: github-actions[bot]
Date: 2025-10-05 20:13:58 +00:00
parent 4b5473860b
commit 62c4a25569
6 changed files with 1053 additions and 1 deletions

@@ -75,6 +75,7 @@ Taskmaster uses two primary methods for configuration:
- `AZURE_OPENAI_API_KEY`: Your Azure OpenAI API key (also requires `AZURE_OPENAI_ENDPOINT`).
- `OPENROUTER_API_KEY`: Your OpenRouter API key.
- `XAI_API_KEY`: Your X-AI API key.
- `OPENAI_CODEX_API_KEY`: Your OpenAI API key for Codex CLI (optional - OAuth preferred).
- **Optional Endpoint Overrides:**
- **Per-role `baseURL` in `.taskmasterconfig`:** You can add a `baseURL` property to any model role (`main`, `research`, `fallback`) to override the default API endpoint for that provider. If omitted, the provider's standard endpoint is used.
- **Environment Variable Overrides (`<PROVIDER>_BASE_URL`):** For greater flexibility, especially with third-party services, you can set an environment variable like `OPENAI_BASE_URL` or `MISTRAL_BASE_URL`. This will override any `baseURL` set in the configuration file for that provider. This is the recommended way to connect to OpenAI-compatible APIs.
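For example, a shell-level override pointing the OpenAI provider at an OpenAI-compatible gateway could look like the following sketch (the URL is a placeholder, not a real endpoint):
```bash
# Override the OpenAI endpoint for Taskmaster calls in this shell session.
# Replace the placeholder URL with your actual OpenAI-compatible gateway.
export OPENAI_BASE_URL="https://my-gateway.example.com/v1"
```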
@@ -316,4 +317,89 @@ Azure OpenAI provides enterprise-grade OpenAI models through Microsoft's Azure c
- Confirm the model is deployed in your Azure OpenAI resource
- Verify the deployment name matches your configuration exactly (case-sensitive)
- Ensure the model deployment is in a "Succeeded" state in Azure OpenAI Studio
- Ensure you're not being rate limited by `maxTokens`; maintain an appropriate Tokens per Minute (TPM) rate limit in your deployment.

### Codex CLI Configuration
The Codex CLI provider integrates with OpenAI's ChatGPT subscription through the Codex CLI tool, providing access to advanced models like GPT-5 and GPT-5-Codex.
1. **Prerequisites**:
- Node.js >= 18.0.0
- Codex CLI >= 0.42.0 (>= 0.44.0 recommended)
- Active ChatGPT subscription (Plus, Pro, Business, Edu, or Enterprise)
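A quick sanity check of the version requirements from the shell (assuming `codex` is already on your PATH; otherwise install it in the next step):
```bash
# Node.js should report v18.0.0 or newer
node --version
# Codex CLI should report 0.42.0 or newer (0.44.0+ recommended)
codex --version
```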
2. **Installation**:
```bash
# Install Codex CLI globally
npm install -g @openai/codex
# Authenticate with your ChatGPT account
codex login
```
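If you prefer not to install the CLI globally, the `allowNpx` setting described below lets Task Master fall back to `npx @openai/codex`; the equivalent manual invocation is sketched here (assuming `npx` can fetch the package on first use):
```bash
# Run the Codex CLI without a global install and authenticate the same way
npx @openai/codex login
```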
3. **Configuration Options**:
**Basic Configuration**
```json
// In .taskmaster/config.json
{
"models": {
"main": {
"provider": "codex-cli",
"modelId": "gpt-5-codex",
"maxTokens": 128000,
"temperature": 0.2
}
}
}
```
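Instead of editing the file by hand, you can also go through Task Master's interactive model setup; this assumes the `task-master models --setup` command from the main CLI documentation is available in your installation:
```bash
# Interactively select provider and model for the main, research, and fallback roles
task-master models --setup
```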
**Advanced Configuration with Codex CLI Settings**
```json
{
"models": {
"main": {
"provider": "codex-cli",
"modelId": "gpt-5-codex",
"maxTokens": 128000,
"temperature": 0.2
}
},
"codexCli": {
"allowNpx": true,
"skipGitRepoCheck": true,
"approvalMode": "on-failure",
"sandboxMode": "workspace-write",
"verbose": false,
"commandSpecific": {
"parse-prd": {
"approvalMode": "never",
"verbose": true
},
"expand": {
"sandboxMode": "read-only"
}
}
}
}
```
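With this layout, `parse-prd` runs without approval prompts and with verbose logging, `expand` is limited to read-only filesystem access, and every other command inherits the top-level `codexCli` defaults.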
4. **Authentication**:
- **Primary Method**: OAuth authentication via `codex login` (recommended)
- **Optional**: Set the `OPENAI_CODEX_API_KEY` environment variable as a fallback
- Note: An API key does not provide access to subscription-only models like GPT-5-Codex
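If you do opt into the API-key fallback, exporting the variable in your shell (or placing it in the project's `.env` file) is sufficient; remember that it will not unlock subscription-only models:
```bash
# Optional fallback only; OAuth via `codex login` remains the primary method
export OPENAI_CODEX_API_KEY="sk-..."
```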
5. **Available Settings**:
- **`allowNpx`** (boolean): Allow fallback to `npx @openai/codex` if CLI not found
- **`skipGitRepoCheck`** (boolean): Skip git repository safety check
- **`approvalMode`** (string): Control command approval (`"untrusted"`, `"on-failure"`, `"on-request"`, `"never"`)
- **`sandboxMode`** (string): Control filesystem access (`"read-only"`, `"workspace-write"`, `"danger-full-access"`)
- **`codexPath`** (string): Custom path to Codex CLI executable
- **`cwd`** (string): Working directory for execution
- **`verbose`** (boolean): Enable verbose logging
- **`commandSpecific`** (object): Override settings for specific commands
6. **Security Notes**:
- Default settings prioritize safety with approval and sandbox modes
- Use `fullAuto: true` or `dangerouslyBypassApprovalsAndSandbox: true` with extreme caution
- Codebase analysis is automatically enabled for this provider