
fix(responses): Auto-add text config for gpt-5-mini models to prevent empty output_text #2548


Closed

Conversation

Abhi-Balijepalli

Changes being requested

Issue: #2545
This PR addresses the issue where gpt-5-mini models return empty output_text in the Responses API by automatically adding text configuration when none is provided.

What was fixed:

  • Added helper function _ensure_text_config_for_gpt5_mini() that automatically adds {"format": {"type": "text"}} for gpt-5-mini models (a sketch of such a helper follows this list)
  • Integrated the fix into the main create() method to ensure all gpt-5-mini calls get proper text configuration
  • Fixed a bug where undefined verbosity parameter was causing NameError
  • Maintained backward compatibility: existing code works without changes
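
The PR diff itself is not shown on this page, so the following is only a minimal sketch of what the described helper might look like. The function name _ensure_text_config_for_gpt5_mini() and the injected {"format": {"type": "text"}} default come from the description above; the signature, the model-name check, and the integration point are assumptions.

```python
from typing import Any, Optional


def _ensure_text_config_for_gpt5_mini(
    model: str,
    text: Optional[dict[str, Any]],
) -> Optional[dict[str, Any]]:
    """Default to a plain-text output format for gpt-5-mini models.

    A caller-supplied text config is returned unchanged; only when no
    config is given for a gpt-5-mini model is the default injected, so
    that response.output_text is populated instead of coming back empty.
    """
    if text is None and model.startswith("gpt-5-mini"):
        return {"format": {"type": "text"}}
    return text


# Hypothetical integration point inside Responses.create(), before the
# request body is built:
#     text = _ensure_text_config_for_gpt5_mini(model, text)
```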

Why this was needed:

  • gpt-5-mini models default to reasoning mode, resulting in empty output_text
  • Users were experiencing confusion when response.output_text returned empty strings
  • The fix ensures consistent behavior and provides sensible defaults

Technical details:

  • Minimal changes: Only affects gpt-5-mini models when no text config is provided
  • Automatic: No user intervention required
  • Non-breaking: Existing functionality preserved
  • Tested: Verified with both gpt-5-mini and gpt-4o-mini models

Additional context & links

Issue: gpt-5-mini Returns Empty output_text with Responses API

  • Models default to reasoning mode regardless of text configuration
  • This appears to be intentional model behavior, not a bug
  • The fix provides a workaround while maintaining model capabilities (an equivalent explicit call is sketched below)
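
Since the PR was ultimately closed (see the maintainer response below), a caller can achieve the same effect by passing the text configuration explicitly. This is a usage sketch against the openai-python client; the model string and prompt are placeholders.

```python
from openai import OpenAI

client = OpenAI()

# Request a plain-text output format explicitly so output_text is populated.
response = client.responses.create(
    model="gpt-5-mini",
    input="Summarize the Responses API in one sentence.",
    text={"format": {"type": "text"}},
)

print(response.output_text)
```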

Testing:

  • ✅ Helper function tests pass
  • ✅ API integration working correctly
  • ✅ No breaking changes introduced
  • ✅ Backward compatibility maintained

@Abhi-Balijepalli Abhi-Balijepalli requested a review from a team as a code owner August 11, 2025 04:29
@PierrunoYT

@Abhi-Balijepalli Can you check if it also fixes my issue?

@RobertCraigie
Collaborator

Thanks for the PR! This isn't something we want to patch over in the SDK by modifying the requests that are sent; please report this on the community forum (community.openai.com) to get this fixed in the API itself.
