feat(integrations): openai-agents: add usage and response model reporting for chat and invoke_agent spans #5157
base: master
Conversation
Codecov Report
❌ Patch coverage is
Additional details and impacted files

@@ Coverage Diff @@
## master #5157 +/- ##
==========================================
+ Coverage 83.97% 83.99% +0.01%
==========================================
Files 180 181 +1
Lines 18222 18251 +29
Branches 3235 3239 +4
==========================================
+ Hits 15302 15330 +28
Misses 1929 1929
- Partials 991 992 +1
model = original_get_model(agent, run_config)
original_get_response = model.get_response

# Wrap _fetch_response if it exists (for OpenAI models) to capture raw response model
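A minimal sketch of the wrapping approach the diff describes: patch the model's private `_fetch_response` (when the attribute exists, i.e. for OpenAI models) so the raw response's `model` field can be recorded on the active span. The `Span`, `FakeResponse`, and `FakeModel` classes here are hypothetical stand-ins for the SDK's span object and the OpenAI Agents model; the `gen_ai.response.model` key follows the gen_ai attribute naming seen in the PR title's intent, but the exact key used by the integration is an assumption.

```python
import functools


class Span:
    """Hypothetical stand-in for a tracing span; real spans come from the Sentry SDK."""

    def __init__(self):
        self.data = {}

    def set_data(self, key, value):
        self.data[key] = value


class FakeResponse:
    """Stand-in for the raw OpenAI response, which carries the resolved model name."""

    def __init__(self, model):
        self.model = model


class FakeModel:
    """Stand-in for an OpenAI Agents model object exposing a private _fetch_response."""

    def _fetch_response(self, prompt):
        # The request may say "gpt-4o", but the response reports the resolved snapshot.
        return FakeResponse(model="gpt-4o-2024-08-06")


def wrap_fetch_response(model, span):
    """Wrap _fetch_response, if present, to record the response model on the span."""
    original = getattr(model, "_fetch_response", None)
    if original is None:
        return  # non-OpenAI models may not expose this attribute

    @functools.wraps(original)
    def wrapped(*args, **kwargs):
        response = original(*args, **kwargs)
        response_model = getattr(response, "model", None)
        if response_model is not None:
            span.set_data("gen_ai.response.model", response_model)
        return response

    model._fetch_response = wrapped


span = Span()
model = FakeModel()
wrap_fetch_response(model, span)
model._fetch_response("hi")
print(span.data["gen_ai.response.model"])  # gpt-4o-2024-08-06
```

The wrapper is defensive on both ends: it checks that `_fetch_response` exists before patching and that the response actually has a `model` attribute before writing span data, so non-OpenAI backends pass through untouched.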
out of curiosity, why do we need to capture the raw response model?
This is the only place I found to get the response model. OpenAI Agents does not expose it; we only get the request model, not the response model.
Issues
Closes https://linear.app/getsentry/issue/TET-1457/py-openai-agents-attributes-missing