feat: add support for MCP Sampling as AI provider (#863)
* feat: support MCP sampling
* support provider registry
* use standard config options for MCP provider
* update fastmcp to support passing params to requestSampling
* move key name definition to base provider
* moved check for required api key to provider class
* remove unused code
* more cleanup
* more cleanup
* refactor provider
* remove not needed files
* more cleanup
* more cleanup
* more cleanup
* update docs
* fix tests
* add tests
* format fix
* clean files
* merge fixes
* format fix
* feat: add support for MCP Sampling as AI provider
* initial mcp ai sdk
* fix references to old provider
* update models
* lint
* fix gemini-cli conflicts
* ran format
* Update src/provider-registry/index.js

  Co-authored-by: Ralph Khreish <35776126+Crunchyman-ralph@users.noreply.github.com>

* fix circular dependency

  Circular Dependency Issue ✅ FIXED
  Root Cause: BaseAIProvider was importing from index.js, which includes commands.js and other modules that eventually import back to the AI providers.
  Solution: Changed imports to use direct paths to avoid circular dependencies:
    - Updated base-provider.js to import log directly from utils.js
    - Updated gemini-cli.js to import log directly from utils.js
  Result: Fixed 11 failing tests in mcp-provider.test.js

* fix gemini test
* fix(claude-code): recover from CLI JSON truncation bug (#913) (#920)

  Gracefully handle the SyntaxError thrown by @anthropic-ai/claude-code when the CLI truncates large JSON outputs (4–16 kB cut-offs). Key points:
    - Detect a JSON parse error plus existing buffered text in both the doGenerate() and doStream() code paths.
    - Convert the failure into a recoverable 'truncated' finish state and push a provider warning.
    - Allows Task Master to continue parsing long PRDs / expand-task operations instead of crashing.
  A patch changeset (.changeset/claude-code-json-truncation.md) is included for the next release.
  Ref: eyaltoledano/claude-task-master#913

* docs: fix gemini-cli authentication documentation (#923)

  Remove erroneous 'gemini auth login' command references and replace them with the correct 'gemini' command authentication flow. Update the documentation to reflect the proper OAuth setup process via the gemini CLI interactive interface.

* fix tests
* fix: update ai-sdk-provider-gemini-cli to 0.0.4 for improved authentication (#932)

  - Fixed authentication compatibility issues with Google auth
  - Added support for 'api-key' auth type alongside 'gemini-api-key'
  - Resolved "Unsupported authType: undefined" runtime errors
  - Updated @google/gemini-cli-core dependency to 0.1.9
  - Improved documentation and removed invalid auth references
  - Maintained backward compatibility while enhancing type validation

* call logging directly

  Need to patch upstream fastmcp to allow easier access, and bootstrap the Task Master MCP logger to use the fastmcp logger, which today is only exposed in the tools handler.

* fix tests
* removing logs until we figure out how to pass mcp logger
* format
* fix tests
* format
* clean up
* cleanup
* readme fix

---------

Co-authored-by: Oren Melamed <oren.m@gloat.com>
Co-authored-by: Ralph Khreish <35776126+Crunchyman-ralph@users.noreply.github.com>
Co-authored-by: Ben Vargas <ben@vargas.com>
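The circular-dependency fix above amounts to importing shared helpers by their concrete path instead of through the package entry point. A minimal sketch of the pattern, with illustrative file paths and an assumed `log(level, message)` signature rather than Task Master's exact layout:

```js
// base-provider.js — sketch of the direct-import pattern described above.
// Importing from index.js would pull in commands.js, which eventually
// imports the AI providers again and closes the cycle.

// Before (circular): base-provider.js -> index.js -> commands.js -> ... -> base-provider.js
// import { log } from './index.js';

// After (direct path, no cycle):
import { log } from './utils.js';

export class BaseAIProvider {
  generateText(params) {
    log('debug', `generateText called for model ${params?.modelId}`);
    // provider-specific implementation goes here
  }
}
```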
@@ -4,7 +4,30 @@ Taskmaster uses two primary methods for configuration:
1. **`.taskmaster/config.json` File (Recommended - New Structure)**

   - This JSON file stores most configuration settings, including AI model selections, parameters, logging levels, and project defaults.
   - **Location:** This file is created in the `.taskmaster/` directory when you run the `task-master models --setup` interactive setup or initialize a new project with `task-master init`.
   - **Migration:** Existing projects with `.taskmasterconfig` in the root will continue to work, but should be migrated to the new structure using `task-master migrate`.
   - **Management:** Use the `task-master models --setup` command (or the `models` MCP tool) to interactively create and manage this file. You can also set specific models directly using `task-master models --set-<role>=<model_id>`, adding `--ollama` or `--openrouter` flags for custom models. Manual editing is possible but not recommended unless you understand the structure.

@@ -173,6 +196,57 @@ node scripts/init.js

## Provider-Specific Configuration

### MCP (Model Context Protocol) Provider

The MCP provider enables Task Master to use MCP servers as AI providers. This is particularly useful when running Task Master within MCP-compatible development environments such as Claude Desktop or Cursor.

1. **Prerequisites**:
   - An active MCP session with sampling capability
   - An MCP client with sampling support (e.g., VS Code)
   - No API keys required (uses session-based authentication)

2. **Configuration**:

   ```json
   {
     "models": {
       "main": {
         "provider": "mcp",
         "modelId": "mcp-sampling"
       },
       "research": {
         "provider": "mcp",
         "modelId": "mcp-sampling"
       }
     }
   }
   ```

3. **Available Model IDs**:
   - `mcp-sampling` - General text generation using MCP client sampling (supports all roles)
   - `claude-3-5-sonnet-20241022` - High-performance model for general tasks (supports all roles)
   - `claude-3-opus-20240229` - Enhanced reasoning model for complex tasks (supports all roles)

4. **Features**:
   - ✅ **Text Generation**: Standard AI text generation via MCP sampling
   - ✅ **Object Generation**: Full schema-driven structured output generation
   - ✅ **PRD Parsing**: Parse Product Requirements Documents into structured tasks
   - ✅ **Task Creation**: AI-powered task creation with validation
   - ✅ **Session Management**: Automatic session detection and context handling
   - ✅ **Error Recovery**: Robust error handling and fallback mechanisms

5. **Usage Requirements**:
   - Must be running in an MCP context (a session must be available)
   - The session must provide the `clientCapabilities.sampling` capability

6. **Best Practices**:
   - Always configure a non-MCP fallback provider
   - Use `mcp` for the main/research roles only when running in an MCP environment
   - Test sampling capability before production use (see the capability-check sketch after this list)

7. **Setup Commands**:

   ```bash
   # Set the MCP provider for the main role
   task-master models set-main --provider mcp --model claude-3-5-sonnet-20241022

   # Set the MCP provider for the research role
   task-master models set-research --provider mcp --model claude-3-opus-20240229

   # Verify the configuration
   task-master models list
   ```

8. **Troubleshooting**:
   - "MCP provider requires session context" → ensure Task Master is running in an MCP environment
   - See the [MCP Provider Guide](./mcp-provider-guide.md) for detailed troubleshooting

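A minimal sketch of the session checks behind the Usage Requirements and Troubleshooting items above, and of how a sampling request could then be issued. This is illustrative rather than Task Master's actual provider code: the `session` object is assumed to be the fastmcp session (exposing `clientCapabilities` and `requestSampling`), and the helper names `assertSamplingAvailable` and `sampleViaMcp` are hypothetical.

```js
// Hypothetical helper illustrating the MCP provider's preconditions.
// `session` is assumed to be the fastmcp session available in the MCP tool context.
function assertSamplingAvailable(session) {
  if (!session) {
    // The error surfaced in the Troubleshooting section above.
    throw new Error('MCP provider requires session context');
  }
  if (!session.clientCapabilities?.sampling) {
    throw new Error('MCP client does not advertise sampling capability');
  }
}

// Sketch of issuing a sampling request once the checks pass. Parameter names
// follow the MCP sampling request shape; the real provider may map them differently.
async function sampleViaMcp(session, prompt) {
  assertSamplingAvailable(session);
  const result = await session.requestSampling({
    messages: [{ role: 'user', content: { type: 'text', text: prompt } }],
    maxTokens: 1024
  });
  return result; // the completion sampled by the MCP client
}
```

If these checks fail, generation cannot proceed over MCP sampling, which is one reason the Best Practices item recommends always configuring a non-MCP fallback provider.
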
### Google Vertex AI Configuration
Google Vertex AI is Google Cloud's enterprise AI platform and requires specific configuration: