feat(telemetry): Implement AI usage telemetry pattern and apply to add-task
This commit introduces a standardized pattern for capturing and propagating AI usage telemetry (cost, tokens, model used) across the Task Master stack and applies it to the 'add-task' functionality. Key changes include:

- **Telemetry Pattern Definition:**
  - Added defining the integration pattern for core logic, direct functions, MCP tools, and CLI commands.
  - Updated related rules (, , , , ) to reference the new telemetry rule.
- **Core Telemetry Implementation ():**
  - Refactored the unified AI service to generate and return a object alongside the main AI result.
  - Fixed an MCP server startup crash by removing redundant local loading of and instead using the imported from for cost calculations.
  - Added to the object.
- ** Integration:**
  - Modified (core) to receive from the AI service, return it, and call the new UI display function for CLI output.
  - Updated to receive from the core function and include it in the payload of its response.
  - Ensured (MCP tool) correctly passes the through via .
  - Updated to correctly pass context (, ) to the core function and rely on it for CLI telemetry display.
- **UI Enhancement:**
  - Added function to to show telemetry details in the CLI.
- **Project Management:**
  - Added subtasks 77.6 through 77.12 to track the rollout of this telemetry pattern to other AI-powered commands (, , , , , , ).

This establishes the foundation for tracking AI usage across the application.
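For context, here is a minimal, self-contained sketch of the flow this pattern standardizes. All identifiers, field names, and prices below (`generateTextService`, `telemetryData`, `addTaskCore`, `COST_PER_MILLION`) are illustrative assumptions, not names confirmed by this commit.

```js
// Sketch only: names and prices are assumptions, not identifiers from this commit.

// Stand-in for a provider call that (after this change) returns usage alongside the text.
async function fakeProviderGenerateText({ messages }) {
	return {
		text: `echo: ${messages[0].content}`,
		usage: { inputTokens: 12, outputTokens: 34 }
	};
}

// Hypothetical pricing, USD per million tokens.
const COST_PER_MILLION = { input: 3, output: 15 };

// Unified AI service: returns the main AI result plus a telemetry object.
async function generateTextService({ modelId, prompt }) {
	const { text, usage } = await fakeProviderGenerateText({
		messages: [{ role: 'user', content: prompt }]
	});
	const telemetryData = {
		modelUsed: modelId,
		inputTokens: usage.inputTokens,
		outputTokens: usage.outputTokens,
		totalCost:
			(usage.inputTokens * COST_PER_MILLION.input +
				usage.outputTokens * COST_PER_MILLION.output) /
			1e6
	};
	return { mainResult: text, telemetryData };
}

// Core command logic: propagate telemetry upward and render it only for CLI ("text")
// output; an MCP direct function would instead include telemetryData in its payload.
async function addTaskCore(prompt, { outputFormat = 'json' } = {}) {
	const { mainResult, telemetryData } = await generateTextService({
		modelId: 'example-model',
		prompt
	});
	if (outputFormat === 'text') {
		console.log(
			`AI usage: ${telemetryData.inputTokens} in / ${telemetryData.outputTokens} out, ` +
				`$${telemetryData.totalCost.toFixed(6)} (${telemetryData.modelUsed})`
		);
	}
	return { newTask: mainResult, telemetryData };
}

addTaskCore('Add a login page', { outputFormat: 'text' }).then((result) =>
	console.log(JSON.stringify(result, null, 2))
);
```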
@@ -51,7 +51,7 @@ function getClient(apiKey) {
  * @param {Array<object>} params.messages - The messages array (e.g., [{ role: 'user', content: '...' }]).
  * @param {number} [params.maxTokens] - Maximum tokens for the response.
  * @param {number} [params.temperature] - Temperature for generation.
- * @returns {Promise<string>} The generated text content.
+ * @returns {Promise<object>} The generated text content and usage.
  * @throws {Error} If the API call fails.
  */
 export async function generateAnthropicText({
@@ -76,7 +76,14 @@ export async function generateAnthropicText({
 			'debug',
 			`Anthropic generateText result received. Tokens: ${result.usage.completionTokens}/${result.usage.promptTokens}`
 		);
-		return result.text;
+		// Return both text and usage
+		return {
+			text: result.text,
+			usage: {
+				inputTokens: result.usage.promptTokens,
+				outputTokens: result.usage.completionTokens
+			}
+		};
 	} catch (error) {
 		log('error', `Anthropic generateText failed: ${error.message}`);
 		// Consider more specific error handling or re-throwing a standardized error
@@ -160,7 +167,7 @@ export async function streamAnthropicText({
  * @param {number} [params.maxTokens] - Maximum tokens for the response.
  * @param {number} [params.temperature] - Temperature for generation.
  * @param {number} [params.maxRetries] - Max retries for validation/generation.
- * @returns {Promise<object>} The generated object matching the schema.
+ * @returns {Promise<object>} The generated object matching the schema and usage.
  * @throws {Error} If generation or validation fails.
  */
 export async function generateAnthropicObject({
@@ -204,7 +211,14 @@ export async function generateAnthropicObject({
 			'debug',
 			`Anthropic generateObject result received. Tokens: ${result.usage.completionTokens}/${result.usage.promptTokens}`
 		);
-		return result.object;
+		// Return both object and usage
+		return {
+			object: result.object,
+			usage: {
+				inputTokens: result.usage.promptTokens,
+				outputTokens: result.usage.completionTokens
+			}
+		};
 	} catch (error) {
 		// Simple error logging
 		log(
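Because generateAnthropicText and generateAnthropicObject now resolve to an object wrapping the result and its usage rather than the bare text or object, existing callers need to destructure the return value. A hedged sketch of the caller-side adjustment; the import path and the prompt are assumptions, while the documented parameters (messages, maxTokens, temperature) come from the JSDoc above.

```js
// Illustrative caller update; the module path below is an assumption.
import { generateAnthropicText } from './anthropic.js';

// Previously: const text = await generateAnthropicText({ messages, maxTokens });
// Now the resolved value carries both the generated text and the token usage.
const { text, usage } = await generateAnthropicText({
	messages: [{ role: 'user', content: 'Draft a description for the new task.' }],
	maxTokens: 1024,
	temperature: 0.2
});

console.log(text);
console.log(`Tokens: ${usage.inputTokens} in / ${usage.outputTokens} out`);
```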