
Conversation

@ilayaperumalg
Member

  • Fix the Azure OpenAI chat model's function calling to report accumulated token usage
    • Fix both the call() and stream() operations
      • For the streaming operation, buffer the stream and keep the usage from the last response when the stream option to include usage is enabled
    • Add tests

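The accumulation described above can be sketched as follows. This is a minimal illustration, not the Spring AI implementation: the `Usage` and `Chunk` records and the `accumulate` helper are hypothetical stand-ins. It shows the two ideas in the fix: with the include-usage stream option enabled only the final chunk of a streamed response carries usage, so the stream is buffered and the last usage kept; and because function calling makes multiple round trips to the model, each round trip's usage is added to a running total.

```java
import java.util.List;

public class UsageAccumulatorSketch {

    // Hypothetical stand-in for the SDK's token-usage object.
    record Usage(int promptTokens, int completionTokens) {
        int totalTokens() { return promptTokens + completionTokens; }
        Usage plus(Usage other) {
            return new Usage(promptTokens + other.promptTokens,
                             completionTokens + other.completionTokens);
        }
    }

    // A streamed chunk; only the final chunk carries usage when the
    // include-usage stream option is enabled.
    record Chunk(String delta, Usage usage) {}

    // Buffer the stream, take the usage from the last chunk that has one,
    // and add it to the total accumulated across function-calling round trips.
    static Usage accumulate(Usage runningTotal, List<Chunk> bufferedStream) {
        Usage last = bufferedStream.stream()
                .map(Chunk::usage)
                .filter(u -> u != null)
                .reduce((a, b) -> b)          // keep the last non-null usage
                .orElse(new Usage(0, 0));
        return runningTotal.plus(last);
    }

    public static void main(String[] args) {
        Usage total = new Usage(0, 0);

        // Round trip 1: the model requests a tool call; usage arrives last.
        total = accumulate(total, List.of(
                new Chunk("tool-call", null),
                new Chunk("", new Usage(50, 10))));

        // Round trip 2: the model answers using the tool result.
        total = accumulate(total, List.of(
                new Chunk("answer", null),
                new Chunk("", new Usage(80, 25))));

        // Reported usage covers both round trips, not just the last one.
        System.out.println(total.totalTokens()); // 165
    }
}
```

Before the fix, only the usage of the final round trip would be reported; accumulating across round trips is what makes the reported totals match what the API actually billed.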
@tzolov
Contributor

tzolov commented Dec 17, 2024

rebased and merged at 7bbd3ef

@tzolov tzolov closed this Dec 17, 2024