fix: LiteLLM and OpenAI SDK tracking #677
Conversation
agentops/llms/tracker.py
Outdated
```python
self.litellm_initialized = False

def _is_litellm_call(self):
    """
    Detects if the API call originated from LiteLLM.
    Returns True if LiteLLM appears in the call stack **before** OpenAI.
    """
    stack = inspect.stack()

    litellm_seen = False  # Track if LiteLLM was encountered
    openai_seen = False   # Track if OpenAI was encountered

    for frame in stack:
        module = inspect.getmodule(frame.frame)
        module_name = module.__name__ if module else None
        filename = frame.filename.lower()

        if module_name and "litellm" in module_name or "litellm" in filename:
            print("LiteLLM detected.")
            litellm_seen = True

        if module_name and "openai" in module_name or "openai" in filename:
            print("OpenAI detected.")
            openai_seen = True

    if not litellm_seen:
        return False

    return litellm_seen

def override_api(self):
    """
    Overrides key methods of the specified API to record events.
    """

    litellm_initialized = False
```
The litellm_initialized variable is declared twice: once as an instance attribute and once as a local variable in override_api(). The local variable is never read, so the assignment in override_api() is discarded and the instance attribute never changes, making it ineffective.
📝 Committable Code Suggestion
‼️ Ensure you review the code suggestion before committing it to the branch. Make sure it replaces the highlighted code, contains no missing lines, and has no issues with indentation.
```diff
 self.litellm_initialized = False

 def _is_litellm_call(self):
     """
     Detects if the API call originated from LiteLLM.
     Returns True if LiteLLM appears in the call stack **before** OpenAI.
     """
     stack = inspect.stack()
     litellm_seen = False  # Track if LiteLLM was encountered
     openai_seen = False  # Track if OpenAI was encountered
     for frame in stack:
         module = inspect.getmodule(frame.frame)
         module_name = module.__name__ if module else None
         filename = frame.filename.lower()
         if module_name and "litellm" in module_name or "litellm" in filename:
             print("LiteLLM detected.")
             litellm_seen = True
         if module_name and "openai" in module_name or "openai" in filename:
             print("OpenAI detected.")
             openai_seen = True
     if not litellm_seen:
         return False
     return litellm_seen

 def override_api(self):
     """
     Overrides key methods of the specified API to record events.
     """
-    litellm_initialized = False
```
areibman
left a comment
Works! Thanks
dot-agi
left a comment
I believe this check should be done in the litellm.py file instead.
This keeps the code clean.
@Dwij1704 can you please check whether the formatters are passing on your code? The Static Analysis check is failing for that reason. I am pushing a fix for the integration test, so that check should start working.
dot-agi
left a comment
LFGOO🚀🚀
📥 Pull Request
📘 Description
Refactored LlmTracker to ensure proper instrumentation of OpenAI and LiteLLM calls. Now, OpenAI is only tracked when explicitly called, preventing duplicate instrumentation when used via LiteLLM. Improved call stack detection to differentiate between direct OpenAI calls and LiteLLM-wrapped calls.
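The call-stack detection idea can be illustrated with a minimal, self-contained sketch (the function names below are stand-ins, not the actual AgentOps or LiteLLM code): a provider call can tell whether it was reached through a wrapper by scanning `inspect.stack()` for a frame belonging to that wrapper.

```python
import inspect


def called_through(marker: str) -> bool:
    """Return True if any frame in the current call stack has a function
    name containing `marker` (a crude stand-in for module-based checks)."""
    return any(marker in frame.function for frame in inspect.stack())


def litellm_style_wrapper():
    # Stands in for a LiteLLM-style wrapper around the provider call.
    return provider_call()


def provider_call():
    # Stands in for the underlying OpenAI-style call; reports whether
    # it was invoked through the wrapper.
    return called_through("litellm_style_wrapper")


print(provider_call())          # False: called directly
print(litellm_style_wrapper())  # True: wrapper frame is on the stack
```

The real implementation inspects module names and filenames rather than function names, but the principle is the same: a wrapped call leaves identifiable frames on the stack, so direct calls can be instrumented differently from wrapped ones.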
🧪 Testing
Executed a test script covering multiple LLM providers (Anthropic, OpenAI, LiteLLM). Verified that:
✅ LiteLLM does not override OpenAI instrumentation when used explicitly.
✅ Calls to OpenAI and Anthropic through LiteLLM are correctly tracked.
✅ Direct OpenAI and Anthropic API calls function as expected.
✅ No duplicate tracking or unintended overrides occur.
EDIT (by @the-praxs): Closes #655