
Conversation

@pxkundu (Contributor) commented Sep 14, 2025

Fix #377: Resolve OpenAI client streaming/non-streaming parser mixing

Problem

The OpenAI client gets "stuck" in streaming or non-streaming mode after switching between stream=True and stream=False calls. This happens because the self.response_parser instance variable is modified during each call and persists across subsequent calls, causing the wrong parser to be used.

Root Cause

The issue occurs in these lines:

  • self.response_parser = self.streaming_response_parser_sync (for streaming)
  • self.response_parser = self.non_streaming_response_parser (for non-streaming)

Because the parser is stored on the instance, the choice leaks across calls (see the sketch after this list):

  1. Call 1: generator(stream=True) → Sets self.response_parser = streaming_parser
  2. Call 2: generator(stream=False) → Sets self.response_parser = non_streaming_parser
  3. Call 3: generator(stream=True) → parse_chat_completion() still runs with the stale self.response_parser left over from the previous call
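
For illustration, a minimal sketch of the stateful pattern behind the bug (simplified pseudocode of the idea described above, not the actual client source):

```python
class StatefulClient:
    """Simplified sketch: the parser chosen for one call leaks into the next."""

    def streaming_response_parser_sync(self, completion):
        return [chunk for chunk in completion]  # consume a stream of chunks

    def non_streaming_response_parser(self, completion):
        return completion["choices"][0]["message"]["content"]  # read a full payload

    def call(self, stream: bool):
        # The choice made here outlives this call, because it is stored on
        # the instance instead of being derived from the response itself.
        if stream:
            self.response_parser = self.streaming_response_parser_sync
        else:
            self.response_parser = self.non_streaming_response_parser

    def parse_chat_completion(self, completion):
        # Uses whatever parser the last call stored, regardless of what
        # kind of completion object is actually passed in.
        return self.response_parser(completion)


client = StatefulClient()
client.call(stream=True)
client.call(stream=False)
# If parsing happens out of step with the calls, parse_chat_completion()
# applies the wrong parser to the completion it receives.
```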

Solution

  • Replace instance variable assignment with dynamic parser selection
  • Determine the correct parser from the completion type inside parse_chat_completion() (sketched below)
  • Add type-specific logic: Response → non-streaming, AsyncIterable → async streaming, Iterable → sync streaming
  • Exclude basic types (str, bytes, dict) from streaming detection to avoid false positives
  • Remove problematic self.response_parser assignments
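
A minimal sketch of that dynamic selection (the check against ChatCompletion and the name of the async parser attribute are assumptions; the merged method may differ in detail):

```python
from collections.abc import AsyncIterable, Iterable

from openai.types.chat import ChatCompletion


def parse_chat_completion(self, completion):
    # Choose the parser from the completion's type on every call instead of
    # reusing whatever a previous call stored on the instance.
    if isinstance(completion, ChatCompletion):
        return self.non_streaming_response_parser(completion)
    if isinstance(completion, AsyncIterable):
        return self.streaming_response_parser_async(completion)  # assumed attribute name
    if isinstance(completion, Iterable) and not isinstance(completion, (str, bytes, dict)):
        return self.streaming_response_parser_sync(completion)
    # Fallback: plain strings, dicts, and unknown types go through the
    # non-streaming parser rather than being misread as streams.
    return self.non_streaming_response_parser(completion)
```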

Benefits

  • 🔄 Dynamic Mode Switching: Clients can reliably switch between streaming/non-streaming
  • 🎯 Type-Based Selection: Parser chosen based on actual completion type, not previous calls
  • 🛡️ Robust Detection: Excludes basic iterables (strings, dicts) that aren't streaming responses
  • 🧹 Cleaner Logic: Eliminates confusing instance variable mutations

Testing

  • ✅ Verified dynamic parser selection for all completion types
  • ✅ Tested mode switching: streaming → non-streaming → streaming (see the example below)
  • ✅ Confirmed fallback behavior for unknown types
  • ✅ Ensured basic types (strings) use non-streaming parser
  • ✅ Validated backward compatibility with existing functionality
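
As one illustration of the switching check, a self-contained stub test in the same spirit (this mirrors the behavior described above; it is not the project's actual test suite):

```python
from collections.abc import Iterable


class StubClient:
    """Mirrors the type-based selection so mode switching can be exercised."""

    def parse_chat_completion(self, completion):
        if isinstance(completion, Iterable) and not isinstance(completion, (str, bytes, dict)):
            return "streaming"
        return "non-streaming"


def test_mode_switching():
    client = StubClient()
    chunks = iter(["delta-1", "delta-2"])                    # looks like a stream of chunks
    payload = {"choices": [{"message": {"content": "hi"}}]}  # looks like a full response

    assert client.parse_chat_completion(chunks) == "streaming"
    assert client.parse_chat_completion(payload) == "non-streaming"           # dict excluded from streaming
    assert client.parse_chat_completion(iter(["delta-3"])) == "streaming"     # switches back cleanly
    assert client.parse_chat_completion("plain text") == "non-streaming"      # str excluded too
```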

Code Changes

  1. Modified parse_chat_completion(): Dynamic parser selection based on completion type
  2. Updated sync call path: Removed self.response_parser assignment
  3. Updated async call path: Removed self.response_parser assignment
  4. Enhanced type detection: Added exclusions for basic types

Type of Change

  • Bug fix (non-breaking change which fixes an issue)
  • Improves reliability for production use cases

Impact

This fix resolves a critical issue that made the OpenAI client unreliable when users needed to dynamically switch between streaming and non-streaming modes in the same application session.

Fixes #377

…orm compatibility

- Replace subprocess.call(['wget', ...]) with urllib.request.urlretrieve()
- Fix dataset download failure on Windows and minimal Docker images
- Add improved error handling with specific HTTP status codes
- Ensure directory creation before download
- Maintain backward compatibility and all existing functionality

Resolves: 'FileNotFoundError: The system cannot find the file specified'
on Windows when downloading BigBenchHard datasets.
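
A minimal sketch of the urllib-based download described in the commit above (the function name, paths, and error messages are illustrative; the real change also preserves the existing call sites):

```python
import os
import urllib.request
from urllib.error import HTTPError, URLError


def download_file(url: str, target_path: str) -> None:
    # Create the destination directory up front so the write cannot fail
    # with a missing-path error on Windows or in minimal Docker images.
    os.makedirs(os.path.dirname(target_path) or ".", exist_ok=True)
    try:
        # Pure-Python download: no external wget binary required.
        urllib.request.urlretrieve(url, target_path)
    except HTTPError as e:
        raise RuntimeError(f"Download failed with HTTP status {e.code}: {url}") from e
    except URLError as e:
        raise RuntimeError(f"Could not reach {url}: {e.reason}") from e
```
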
…rser mixing

- Replace problematic instance variable assignment with dynamic parser selection
- Fix issue where self.response_parser persisted across calls causing mode confusion
- Add type-specific logic to distinguish Response, AsyncIterable, and Iterable objects
- Exclude basic types (str, bytes, dict) from streaming detection
- Ensure correct parser is always selected based on completion type

Resolves: OpenAI client getting 'stuck' in streaming or non-streaming mode
after switching between stream=True and stream=False calls.
- Fix bedrock client AWS credential import issue with lazy initialization
- Update OpenAI client tests to reflect dynamic parser selection behavior
- Remove dependency on response_parser instance variable in tests
- Ensure all tests pass with the new parser switching implementation

This resolves the CI test failures in PR SylphAI-Inc#446 while maintaining the fix for issue SylphAI-Inc#377.
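
For the bedrock change, a rough sketch of lazy client initialization (the class, attribute names, and boto3 service name here are assumptions, not the project's exact code):

```python
import boto3


class BedrockClientSketch:
    def __init__(self, region_name: str = "us-east-1"):
        # Do not resolve AWS credentials at import or construction time;
        # defer until the underlying client is actually needed.
        self._region_name = region_name
        self._client = None

    @property
    def client(self):
        if self._client is None:
            # Credentials are looked up here, on first use, so constructing
            # the wrapper no longer fails on machines without AWS credentials.
            self._client = boto3.client("bedrock-runtime", region_name=self._region_name)
        return self._client
```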

Development

Successfully merging this pull request may close these issues.

openai client non-streaming does not work after the client was set in streaming
