Fix a bug where debug logging fails with Iterable objects #1683
Conversation
Codex Review: Here are some suggestions.
Reply with @codex fix comments to fix any unresolved comments.
```python
if _debug.DONT_LOG_MODEL_DATA:
    logger.debug("Calling LLM")
else:
    messages_json = json.dumps(
        _to_dump_compatible(converted_messages),
        indent=2,
        ensure_ascii=False,
    )
    tools_json = json.dumps(
        _to_dump_compatible(converted_tools),
        indent=2,
        ensure_ascii=False,
    )
```
[P1] Avoid exhausting iterator inputs during debug logging
The new _to_dump_compatible call builds JSON strings from converted_messages and converted_tools solely for logging, but the same objects are later sent to litellm.acompletion. When those structures contain generators or other one-shot iterables—the precise case this helper targets—the conversion step consumes them, so the subsequent API call receives empty content or tool definitions. Logging needs to operate on a copy and reuse that converted structure for the request, or otherwise clone the iterables before iterating.
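The hazard Codex describes can be reproduced in isolation. The sketch below uses a simplified, hypothetical stand-in for `_to_dump_compatible` (the real helper lives in the SDK); it shows how converting a generator-backed structure for logging consumes the generator, so a later consumer of the same object sees nothing:

```python
import json

def to_dump_compatible(obj):
    # Hypothetical stand-in: recursively turn non-JSON-serializable
    # iterables (e.g. generators) into lists.
    if isinstance(obj, (str, int, float, bool)) or obj is None:
        return obj
    if isinstance(obj, dict):
        return {k: to_dump_compatible(v) for k, v in obj.items()}
    try:
        return [to_dump_compatible(item) for item in obj]
    except TypeError:
        return obj

def lazy_contents():
    # One-shot iterable, like lazily produced message content.
    yield {"type": "text", "text": "hello"}

messages = {"role": "user", "content": lazy_contents()}

# The logging pass consumes the generator while building the JSON string...
log_json = json.dumps(to_dump_compatible(messages))

# ...so a second pass over the original structure finds it empty.
remaining = list(messages["content"])
print(len(remaining))  # 0 -- the API call would receive no content
```

This is exactly the "empty content" failure mode: the log string is complete, but the request payload built afterwards from the same object is hollowed out.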
```python
if _debug.DONT_LOG_MODEL_DATA:
    logger.debug("Calling LLM")
else:
    messages_json = json.dumps(
        _to_dump_compatible(converted_messages),
        indent=2,
        ensure_ascii=False,
    )
    tools_json = json.dumps(
        _to_dump_compatible(converted_tools),
        indent=2,
```
[P1] Prevent logging helper from draining message iterables
Here _to_dump_compatible is invoked to stringify converted_messages and converted_tools, but the original objects are still forwarded to chat.completions.create. Any generator or iterator nested inside (e.g., content lists produced lazily) will be exhausted while building the log string, leaving the API call with empty data. Consider converting the inputs once and using the converted copy both for logging and for the request so that iterables are not consumed prematurely.
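The suggested remedy, convert once and reuse the materialized copy, can be sketched as follows. Names here are illustrative stand-ins for the SDK's internals, not its actual API:

```python
import json

def to_dump_compatible(obj):
    # Hypothetical stand-in: materialize generators/iterators into lists
    # so the result can be traversed more than once.
    if isinstance(obj, dict):
        return {k: to_dump_compatible(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)) or hasattr(obj, "__next__"):
        return [to_dump_compatible(item) for item in obj]
    return obj

def message_stream():
    yield {"role": "user", "content": "hi"}

# Convert a single time, up front.
converted_messages = to_dump_compatible(message_stream())

# Logging reads the materialized copy...
messages_json = json.dumps(converted_messages, indent=2, ensure_ascii=False)

# ...and the request uses the SAME copy, so nothing was drained.
print(len(converted_messages))  # 1
```

The key design point: the dump-compatible conversion doubles as the materialization step, so logging and the API call share one fully realized structure instead of racing over a one-shot iterator.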
@seratch i think codex is correct here?
Oh, yeah. This is a good catch!
@codex can you fix this comment?
Summary
- Materialized chat completion messages and tool payloads once and reused them for logging and API requests so iterator-based inputs are not consumed during debug output (src/agents/models/openai_chatcompletions.py, L231-L295).
- Applied the same dump-compatible conversion reuse to the LiteLLM model wrapper, ensuring third-party completions receive untouched iterable content (src/agents/extensions/models/litellm_model.py, L259-L335).
- Converted Responses API inputs and tool definitions a single time for both logging and request submission, preventing generators from being exhausted before dispatch (src/agents/models/openai_responses.py, L243-L323).
Testing
- ✅ make format
- ✅ make lint
- ⚠️ make mypy (fails: missing optional dependencies numpy, litellm, sqlalchemy)
- ⚠️ make tests (fails during collection because optional dependencies litellm and numpy are not installed)
I took over the changes ^ locally and pushed the commit
Force-pushed from c61b6e1 to 3ead6ef
@rm-openai This one should be good to go now
This pull request resolves an existing bug where the JSON-format debug logging raises a runtime exception when the given data contains Iterable objects.
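The original failure is easy to reproduce in isolation: `json.dumps` raises a `TypeError` when it encounters a value it cannot serialize, such as a generator, which is what the debug-logging path hit before the dump-compatible conversion was added. A minimal reproduction (simplified; not the SDK's actual payload shape):

```python
import json

def lazy():
    # Generators are not JSON serializable out of the box.
    yield "chunk"

payload = {"content": lazy()}

try:
    json.dumps(payload)
except TypeError as exc:
    # json.dumps reports: "Object of type generator is not JSON serializable"
    print(type(exc).__name__)  # prints "TypeError"
```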