Handle None content in stop-sequence trimming
#1826
Conversation
src/smolagents/models.py (Outdated)

```python
        return content

    for stop_seq in stop_sequences:
        if not stop_seq:
```
Why support having one of the stop sequences being None?
Apart from that I agree with the other introduced changes!
Thanks for spotting that @aymeric-roucher! You’re right, we don’t expect providers to return None inside stop_sequences.
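For readers following along, here is a rough sketch of the guard pattern being discussed. The signature, trimming semantics, and surrounding code are assumptions for illustration, not the actual smolagents implementation:

```python
def remove_content_after_stop_sequences(content, stop_sequences):
    # Tool-only turns (e.g. gpt-5) can yield content=None: nothing to trim.
    if content is None or not stop_sequences:
        return content
    for stop_seq in stop_sequences:
        # This is the guard questioned above: it skips falsy entries
        # (None or "") so they cannot truncate valid text at index 0.
        if not stop_seq:
            continue
        idx = content.find(stop_seq)
        if idx != -1:
            content = content[:idx]
    return content
```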
tests/test_models.py

```python
    assert removed_content == "Hello"


def test_remove_content_after_stop_sequences_handles_none():
```
Please merge these tests into one with several asserts! After that, lgtm!
Thanks a lot for the feedback and the review @aymeric-roucher.
I've merged the tests as suggested.
Happy to make it less verbose or use a different name if you prefer.
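A minimal sketch of what the merged test could look like; the test name and the helper's signature are assumptions here, and the actual assertions in the PR may differ:

```python
def test_remove_content_after_stop_sequences_handles_none_inputs():
    # Tool-only output: content is None, the helper must not crash and returns None.
    assert remove_content_after_stop_sequences(None, ["Observation:"]) is None
    # Default path: no stop sequences configured, content passes through unchanged.
    assert remove_content_after_stop_sequences("Hello", None) == "Hello"
    # Empty stop strings are ignored instead of truncating everything at index 0.
    assert remove_content_after_stop_sequences("Hello", [""]) == "Hello"
```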
…larity and coverage for None inputs
Thank you @chahn!
@aymeric-roucher thanks a lot for your review, support and for merging!
Prevent stop-sequence trimming from crashing on tool-only GPT-5 outputs
Linked Issue
Fixes Bug #1825
Summary
- Keeps `remove_content_after_stop_sequences` from crashing when a provider returns `content=None` (e.g. `gpt-5` tool-only turns).
- Handles a `None` stop list and empty stop strings gracefully to avoid accidental truncation.

Details

- Treats `content=None` and `stop_sequences=None` as no-op cases, so we only trim when actual text is present.
- Unblocks `gpt-5` tool-calling without changing behavior for normal string outputs.

Tests

- `tests/test_models.py::test_remove_content_after_stop_sequences_handles_none` – confirms the helper returns `None` when the provider emits tool-only output, mirroring the bug scenario.
- `tests/test_models.py::test_remove_content_after_stop_sequences_ignores_empty_stop_sequences` – ensures harmless empty stop strings do not strip valid text.
- `tests/test_models.py::test_remove_content_after_stop_sequences_with_none_stop_sequences` – protects the default code path where `stop_sequences` is `None`, preserving existing behavior.

Thanks a lot for feedback and support!
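For reference, a minimal sketch of the no-op contract described in the Details section above. The import path, positional signature, and message shape are assumptions; only the `None`-handling behavior is taken from this description:

```python
from smolagents.models import remove_content_after_stop_sequences  # import path assumed

# Hypothetical shape of the failing scenario from #1825: a tool-only gpt-5 turn
# where the provider returns tool calls but no text content.
message = {
    "role": "assistant",
    "content": None,  # no text to trim on tool-only turns
    "tool_calls": [{"name": "web_search", "arguments": '{"query": "smolagents"}'}],
}

# With this fix, trimming None content is a no-op instead of raising, and
# passing stop_sequences=None keeps the existing pass-through behavior.
assert remove_content_after_stop_sequences(message["content"], ["Observation:"]) is None
assert remove_content_after_stop_sequences("final answer", None) == "final answer"
```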