Add LLM_CONTENT_COMPLETION_CHUNK attribute to SpanAttributes class #885
Conversation
tcdent left a comment:
Good catch!
@tcdent @the-praxs was going through the implementation and noticed what you had specifically mentioned. Can you give me some clarity on how you wish to manage both of these conventions, manually or via semconv?
This catch is a good reminder that we should have tests for the attributes in some way. The merge order caused this unexpected error.
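A minimal sketch of such an attribute test. The class below is a hypothetical stand-in for the project's `SpanAttributes` class, and the string values are illustrative, not the library's actual conventions:

```python
# Hypothetical stand-in for the instrumentation package's SpanAttributes
# class; the real class and its values live in the library itself.
class SpanAttributes:
    LLM_COMPLETIONS = "gen_ai.completion"
    LLM_CONTENT_COMPLETION_CHUNK = "gen_ai.completion.chunk"
    LLM_RESPONSE_MODEL = "gen_ai.response.model"


# Names the instrumentation code references at runtime; a bad merge
# that drops one of these should make this test fail immediately.
REQUIRED_ATTRIBUTES = [
    "LLM_COMPLETIONS",
    "LLM_CONTENT_COMPLETION_CHUNK",
    "LLM_RESPONSE_MODEL",
]


def test_span_attributes_exist():
    for name in REQUIRED_ATTRIBUTES:
        # getattr raises AttributeError if a referenced name is missing,
        # which is exactly the failure seen with stream=True.
        value = getattr(SpanAttributes, name)
        assert isinstance(value, str) and value


test_span_attributes_exist()
```

A test like this catches a missing attribute at CI time instead of at runtime inside a streaming handler.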
…tes. Update attribute references for LLM_COMPLETIONS, LLM_CONTENT_COMPLETION_CHUNK, and LLM_RESPONSE_MODEL to ensure consistency across the codebase.
…r enhanced semantic conventions in LangChain integration.
Codecov Report: All modified and coverable lines are covered by tests ✅
I fixed the error, but when testing it I can't see token usage in any of the spans for the Agents SDK.
Token usage fixed in #888
📥 Pull Request
📘 Description
When running chat.completions with `stream=True`, it throws:

An error occurred: type object 'SpanAttributes' has no attribute 'LLM_CONTENT_COMPLETION_CHUNK'

🧪 Testing
Added the LLM_CONTENT_COMPLETION_CHUNK attribute and tested it using the following script.
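The test script itself did not survive the page capture. As a rough, self-contained sketch of what the streaming handler does with the new attribute, here is a simulation with hypothetical stand-ins (the `SpanAttributes` value, the `FakeSpan` class, and `record_chunks` are all illustrative, not the project's real instrumentation code):

```python
# Hypothetical stand-in; the real attribute lives on the library's
# SpanAttributes class, and its string value may differ.
class SpanAttributes:
    LLM_CONTENT_COMPLETION_CHUNK = "gen_ai.completion.chunk"


class FakeSpan:
    """Minimal span double that just records set attributes."""

    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value


def record_chunks(span, chunks):
    # Before the fix, referencing SpanAttributes.LLM_CONTENT_COMPLETION_CHUNK
    # here raised: AttributeError: type object 'SpanAttributes' has no
    # attribute 'LLM_CONTENT_COMPLETION_CHUNK'
    for i, text in enumerate(chunks):
        key = f"{SpanAttributes.LLM_CONTENT_COMPLETION_CHUNK}.{i}"
        span.set_attribute(key, text)


span = FakeSpan()
record_chunks(span, ["Hel", "lo"])
```

With the attribute present on the class, each streamed chunk is recorded under an indexed key instead of crashing the stream handler.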