feat(ai): Enhance Google provider telemetry and AI object response handling
This commit introduces two key improvements:
1. **Google Provider Telemetry:**
   - Updated `generateGoogleText` and `generateGoogleObject` to include token usage data (`inputTokens`, `outputTokens`) in their responses (a sketch follows after this list).
   - This aligns the Google provider with the other providers for consistent AI usage telemetry.
2. **Robust AI Object Response Handling:**
   - Modified the add-task module to handle AI object responses more flexibly.
   - The add-task module now checks for the AI-generated object in both of the locations where providers may place it, improving compatibility with different AI provider response structures (e.g., Gemini); see the second sketch below.
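For item 1, a minimal sketch of the telemetry-compatible return shape the updated provider functions aim for. It assumes a Vercel AI SDK result whose usage exposes `promptTokens`/`completionTokens` (field names vary across SDK versions), and the `client(modelId)` call used to obtain a model instance is purely illustrative, not the actual `google.js` code:

```js
// Sketch only -- illustrates the telemetry-compatible return shape,
// not the actual google.js implementation.
import { generateText, streamText } from 'ai';

export async function generateGoogleText(client, params) {
  const result = await generateText({
    model: client(params.modelId), // hypothetical model lookup
    messages: params.messages
  });
  return {
    text: result.text,
    usage: {
      // Map SDK token counts to the names used for telemetry.
      inputTokens: result.usage?.promptTokens,
      outputTokens: result.usage?.completionTokens
    }
  };
}

// generateGoogleObject follows the same pattern, returning { object, usage }.

export async function streamGoogleText(client, params) {
  // Return the full stream result (not just result.textStream)
  // so the caller can read usage once the stream finishes.
  return streamText({
    model: client(params.modelId), // hypothetical model lookup
    messages: params.messages
  });
}
```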
These changes enhance the reliability of AI interactions, particularly with the Google provider, and ensure accurate telemetry collection.
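For item 2, a minimal sketch of the fallback lookup described above, assuming hypothetical `mainResult`/`object` property names (the commit message does not name the exact fields):

```js
// Sketch only -- hypothetical field names, illustrating a check of both
// possible locations for the AI-generated object.
function extractGeneratedObject(serviceResponse) {
  const mainResult = serviceResponse?.mainResult;
  if (mainResult && typeof mainResult === 'object') {
    // Some providers (e.g., Gemini) nest the payload under `object`;
    // others return the generated object directly.
    if (mainResult.object && typeof mainResult.object === 'object') {
      return mainResult.object;
    }
    return mainResult;
  }
  throw new Error('AI response did not contain a generated task object');
}
```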
@@ -5039,7 +5039,7 @@
 "title": "Update google.js for Telemetry Compatibility",
 "description": "Modify src/ai-providers/google.js functions to return usage data.",
 "details": "Update the provider functions in `src/ai-providers/google.js` to ensure they return telemetry-compatible results:\\n\\n1. **`generateGoogleText`**: Return `{ text: ..., usage: { inputTokens: ..., outputTokens: ... } }`. Extract token counts from the Vercel AI SDK result.\\n2. **`generateGoogleObject`**: Return `{ object: ..., usage: { inputTokens: ..., outputTokens: ... } }`. Extract token counts.\\n3. **`streamGoogleText`**: Return the *full stream result object* returned by the Vercel AI SDK's `streamText`, not just the `textStream` property. The full object contains usage information.\\n\\nReference `anthropic.js` for the pattern.",
-"status": "pending",
+"status": "done",
 "dependencies": [],
 "parentTaskId": 77
 },