fix(ai): Align Perplexity provider with standard telemetry response structure
This commit updates the Perplexity AI provider so that its functions return data in a structure consistent with the other providers and with the expectations of the unified AI service layer. Specifically:

- The text-generation function now returns an object instead of only the text string.
- The object-generation function now returns an object instead of only the result object.

These changes ensure that the service layer can correctly extract both the primary AI-generated content and the token usage data for telemetry purposes when Perplexity models are used. This resolves issues encountered during E2E testing, where complexity analysis (which can use Perplexity for its research role) failed due to unexpected response formats. The remaining provider function was already compliant.
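As a hedged sketch of the shape change described above (the function and SDK field names here are illustrative assumptions, not taken from the diff), the fix amounts to wrapping the SDK result instead of returning the bare text:

```javascript
// Sketch of the telemetry-compatible return shape. The Vercel AI SDK's
// generateText result exposes `text` and a `usage` object; the exact
// token field names below are assumptions for illustration.
function toTelemetryResult(sdkResult) {
  return {
    text: sdkResult.text,
    usage: {
      inputTokens: sdkResult.usage?.promptTokens ?? 0,
      outputTokens: sdkResult.usage?.completionTokens ?? 0,
    },
  };
}

// Stubbed SDK result, standing in for a real Perplexity call:
const stub = { text: 'hello', usage: { promptTokens: 12, completionTokens: 5 } };
const result = toTelemetryResult(stub);
console.log(result.text, result.usage.inputTokens, result.usage.outputTokens);
// → hello 12 5
```

The service layer can then read `result.text` for content and `result.usage` for telemetry, regardless of which provider produced the response.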
@@ -399,7 +399,7 @@ Update the provider functions in `src/ai-providers/openai.js` to ensure they ret
### Details:
Update the provider functions in `src/ai-providers/openrouter.js` to ensure they return telemetry-compatible results:

1. **`generateOpenRouterText`**: Return `{ text: ..., usage: { inputTokens: ..., outputTokens: ... } }`. Extract token counts from the Vercel AI SDK result.
2. **`generateOpenRouterObject`**: Return `{ object: ..., usage: { inputTokens: ..., outputTokens: ... } }`. Extract token counts.
3. **`streamOpenRouterText`**: Return the *full stream result object* returned by the Vercel AI SDK's `streamText`, not just the `textStream` property. The full object contains usage information.

Reference `anthropic.js` for the pattern.
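Point 3 can be sketched as follows. This is a minimal illustration, not the actual provider code: `fakeStreamText` stubs the Vercel AI SDK call, and the usage field names are assumptions.

```javascript
// Returning the full stream result keeps usage data available; returning
// only `.textStream` (the old behavior) would discard it.
function fakeStreamText() {
  return {
    textStream: (async function* () { yield 'chunk'; })(),
    usage: Promise.resolve({ promptTokens: 3, completionTokens: 1 }),
  };
}

// Before (loses usage):  return fakeStreamText().textStream;
// After (telemetry layer can read both .textStream and .usage):
function streamTextSketch() {
  return fakeStreamText();
}

async function main() {
  const result = streamTextSketch();
  for await (const chunk of result.textStream) {
    console.log('chunk:', chunk);
  }
  console.log('usage:', await result.usage);
}
main();
```

The design point is that the stream result object is the single handle carrying both the consumable text stream and the (eventually resolved) usage totals.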
-## 16. Update perplexity.js for Telemetry Compatibility [pending]
+## 16. Update perplexity.js for Telemetry Compatibility [done]
### Dependencies: None
### Description: Modify src/ai-providers/perplexity.js functions to return usage data.
### Details: