Commit Graph

2 Commits

Author SHA1 Message Date
Eyal Toledano
21c3cb8cda feat(telemetry): Integrate telemetry for expand-all, aggregate results
This commit implements AI usage telemetry for the `expand-all-tasks` command/tool and refactors its CLI output for clarity and consistency.

Key Changes:

1.  **Telemetry Integration for `expand-all-tasks` (Subtask 77.8):**
    -   The `expandAllTasks` core logic (`scripts/modules/task-manager/expand-all-tasks.js`) now calls the `expandTask` function for each eligible task and collects the individual `telemetryData` returned.
    -   A new helper function `_aggregateTelemetry` (in `utils.js`) is used to sum up token counts and costs from all individual expansions into a single `telemetryData` object for the entire `expand-all` operation (see the aggregation sketch after this list).
    -   The `expandAllTasksDirect` wrapper (`mcp-server/src/core/direct-functions/expand-all-tasks.js`) now receives and passes this aggregated `telemetryData` in the MCP response.
    -   For CLI usage, `displayAiUsageSummary` is called once with the aggregated telemetry.

2.  **Improved CLI Output for `expand-all`:**
    -   The `expandAllTasks` core function now handles displaying a final "Expansion Summary" box (showing Attempted, Expanded, Skipped, and Failed counts) directly after the aggregated telemetry summary (see the summary-box sketch after this list).
    -   This consolidates all summary output within the core function for better flow and removes redundant logging from the command action in `scripts/modules/commands.js`.
    -   The summary box border is green for success and red if any expansions failed.

3.  **Code Refinements:**
    -   Ensured `chalk` and `boxen` are imported in `expand-all-tasks.js` for the new summary box.
    -   Minor adjustments to logging messages for clarity.
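
The aggregation in item 1 could look roughly like the sketch below. The field names (`inputTokens`, `outputTokens`, `totalTokens`, `totalCost`) and the `commandName` parameter are assumptions; the commit only states that token counts and costs from each expansion are summed into a single `telemetryData` object.

```js
// Hypothetical sketch of the aggregation helper; field names are assumptions.
function _aggregateTelemetry(telemetryItems, commandName) {
	// Start from zeroed totals so an empty array still yields a valid object.
	const aggregated = {
		commandName, // e.g. 'expand-all-tasks'
		inputTokens: 0,
		outputTokens: 0,
		totalTokens: 0,
		totalCost: 0
	};

	for (const item of telemetryItems) {
		if (!item) continue; // skip expansions that returned no telemetry
		aggregated.inputTokens += item.inputTokens ?? 0;
		aggregated.outputTokens += item.outputTokens ?? 0;
		aggregated.totalTokens += item.totalTokens ?? 0;
		aggregated.totalCost += item.totalCost ?? 0;
	}

	return aggregated;
}
```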
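
The summary box from items 2 and 3 could be rendered with `chalk` and `boxen` roughly as follows. The function name, counter names, and `boxen` options are assumptions; the commit only specifies the four counts and the green/red border rule.

```js
import chalk from 'chalk';
import boxen from 'boxen';

// Hypothetical rendering of the "Expansion Summary" box.
function displayExpansionSummary({ attempted, expanded, skipped, failed }) {
	const summary = [
		chalk.bold('Expansion Summary'),
		'',
		`Attempted: ${attempted}`,
		`Expanded:  ${expanded}`,
		`Skipped:   ${skipped}`,
		`Failed:    ${failed}`
	].join('\n');

	console.log(
		boxen(summary, {
			padding: 1,
			// Green border on full success, red if any expansion failed.
			borderColor: failed > 0 ? 'red' : 'green',
			borderStyle: 'round'
		})
	);
}
```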
2025-05-08 18:22:00 -04:00
Eyal Toledano
2517bc112c feat(ai): Integrate OpenAI provider and enhance model config
- Add OpenAI provider implementation using `@ai-sdk/openai`.
- Update `models` command/tool to display API key status for configured providers.
- Implement model-specific `maxTokens` override logic in `config-manager.js` using `supported-models.json` (a sketch of this lookup follows below).
- Improve AI error message parsing in `ai-services-unified.js` for better clarity.
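
A minimal sketch of the `maxTokens` override mentioned in the third bullet, assuming `supported-models.json` maps provider names to arrays of model entries carrying a `max_tokens` limit (both the file shape and the helper name are assumptions, not the actual `config-manager.js` API):

```js
import { readFileSync } from 'fs';

// Assumed shape: { "openai": [{ "id": "gpt-4o", "max_tokens": 16384 }, ...], ... }
const supportedModels = JSON.parse(
	readFileSync(new URL('./supported-models.json', import.meta.url), 'utf-8')
);

// Hypothetical lookup: cap the configured maxTokens to the model's own limit.
function getMaxTokensFor(providerName, modelId, configuredMaxTokens) {
	const entry = (supportedModels[providerName] ?? []).find(
		(m) => m.id === modelId
	);

	if (entry?.max_tokens) {
		return Math.min(configuredMaxTokens, entry.max_tokens);
	}
	return configuredMaxTokens;
}
```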
2025-04-27 03:56:23 -04:00