fix: Duplicate LLM calls #981
Conversation
Pull Request Overview
This PR addresses duplicate LLM calls by adding provider detection to the CrewAI instrumentation logic.
- Introduces a mapping from LLM providers to specific instrumentors to selectively skip instrumentation.
- Wraps provider detection logic in a try/except block to handle potential errors during instrumentation setup.
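The two bullets above can be sketched roughly as follows. This is an illustrative reconstruction, not the PR's actual diff: the mapping contents, the helper name `should_skip_instrumentation`, and the fallback provider are all assumptions.

```python
# Hypothetical sketch of the provider-detection change described above.
# Maps a provider name to the instrumentor package that already traces it,
# so CrewAI's own wrapper can skip producing a duplicate span.
PROVIDER_INSTRUMENTOR = {
    "openai": "opentelemetry-instrumentation-openai",
    "anthropic": "opentelemetry-instrumentation-anthropic",
}

def should_skip_instrumentation(model: str) -> bool:
    """Return True when the model's provider has a dedicated instrumentor."""
    try:
        # Model ids like "anthropic/claude-3" carry the provider as a prefix;
        # treating a bare id as "openai" is an assumption for this sketch.
        provider = model.split("/", 1)[0] if "/" in model else "openai"
        return provider.lower() in PROVIDER_INSTRUMENTOR
    except Exception:
        # Mirrors the try/except in the PR: detection errors must never
        # break instrumentation setup, so fall back to instrumenting.
        return False
```

The try/except matters because provider detection runs inside instrumentation setup, where an unexpected model-id format should degrade to an extra span rather than a crash.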
Pull Request Overview
This PR fixes duplicate LLM calls by detecting when a provider’s own instrumentation is in use and skipping the extra span, refactors token attribute handling, and adds error handling around provider detection.
- Added mapping from model identifiers to known instrumentors and early-return if matched
- Reformatted callback token attribute setting for readability
- Wrapped provider-detection logic in a `try/except` to log errors and avoid crashing
Comments suppressed due to low confidence (2)
agentops/instrumentation/crewai/instrumentation.py:439
- The `with` block appears unindented relative to the preceding `else:` and may execute even when an instrumentor is found, breaking the intended skip logic. Please correct the indentation so that the span is only started in the `else` branch.
with tracer.start_as_current_span(f"{llm}.llm", kind=SpanKind.CLIENT, attributes={}) as span:
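The corrected control flow would look roughly like this runnable sketch. Only `provider_instrumentor` and the span name come from the quoted lines; the stub `start_span` context manager and `call_llm` wrapper are hypothetical stand-ins for the OpenTelemetry tracer and the wrapped method.

```python
from contextlib import contextmanager

# Stubs so the control flow runs standalone; in the real file these are
# the OpenTelemetry tracer and the new provider mapping from this PR.
provider_instrumentor = {"openai": "openai-instrumentor"}
spans_started = []

@contextmanager
def start_span(name):
    spans_started.append(name)
    yield name

def call_llm(provider, run):
    instrumentor = provider_instrumentor.get(provider.lower())
    if instrumentor is not None:
        # A dedicated instrumentor already traces this provider:
        # call through without opening a second span.
        return run()
    else:
        # The `with` must sit under `else:` — unindented, it would run
        # unconditionally and reintroduce the duplicate span.
        with start_span(f"{provider}.llm"):
            return run()

call_llm("openai", lambda: "traced elsewhere")
call_llm("ollama", lambda: "traced here")
```

After both calls, only `"ollama.llm"` appears in `spans_started`, which is exactly the skip behavior the review asks the indentation to preserve.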
agentops/instrumentation/crewai/instrumentation.py:431
- [nitpick] New behavior for both matching and non-matching providers should be covered by unit tests to ensure correct skipping or instrumentation. Please add tests for cases where the provider is in the map and where it is not.
instrumentor = provider_instrumentor.get(provider.lower())
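The requested tests could be sketched as below. The helper and its `mapping` parameter are hypothetical (the real code reads `provider_instrumentor` from module scope); the two cases mirror the reviewer's ask: one provider in the map, one not.

```python
# Sketch of the unit tests the reviewer requests. The helper signature
# is an assumption made so the cases are testable in isolation.
def should_skip_instrumentation(provider: str, mapping: dict) -> bool:
    return mapping.get(provider.lower()) is not None

def test_known_provider_is_skipped():
    mapping = {"openai": "openai-instrumentor"}
    # Lookup should be case-insensitive, matching provider.lower() in the PR.
    assert should_skip_instrumentation("OpenAI", mapping) is True

def test_unknown_provider_is_instrumented():
    mapping = {"openai": "openai-instrumentor"}
    assert should_skip_instrumentation("ollama", mapping) is False

test_known_provider_is_skipped()
test_unknown_provider_is_instrumented()
```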
Dwij1704 left a comment
CrewAI instrumentation is so much better now — thanks @the-praxs!!
📥 Pull Request
📘 Description
Fixes duplicate LLM calls in CrewAI caused by instrumentation of the `LLM` class producing an additional span.

Closes #959
🧪 Testing
The OG job posting example
📋 Future work