
Conversation

@yaojingguo (Contributor) commented Aug 6, 2025

In mainland China, people commonly use a VPN client to access Google. Such a client usually configures the macOS proxy server settings.

openai-python uses an httpx.Client with trust_env set to True, so the client picks up those proxy server settings.

Add httpx_trust_env to allow users to ignore proxy server settings.
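
For illustration, a minimal standalone httpx sketch of the behavior being toggled (not the lmms_eval code itself):

import httpx

# trust_env=True (the httpx default): the client reads HTTP_PROXY, HTTPS_PROXY,
# and ALL_PROXY, and on macOS also the system proxy settings, so traffic may be
# routed through a VPN-configured proxy.
proxied = httpx.Client(trust_env=True)

# trust_env=False: environment variables and macOS proxy settings are ignored,
# and requests go directly to the target host.
direct = httpx.Client(trust_env=False)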

Summary by CodeRabbit

  • New Features
    • Added an option to control whether environment proxy settings are used for API requests, allowing users to enable or disable this behavior as needed.


coderabbitai bot commented Aug 6, 2025

Walkthrough

A new boolean parameter, httpx_trust_env, has been added to the OpenAICompatible class constructor in openai_compatible.py. This parameter determines whether the internal httpx.Client used by the OpenAI and AzureOpenAI clients should respect environment proxy settings. The default is set to True, maintaining previous behavior unless explicitly overridden.

Changes

Cohort / File(s): OpenAICompatible class, HTTP client configuration (lmms_eval/models/simple/openai_compatible.py)
Change Summary: Added the httpx_trust_env parameter to the class constructor; conditionally creates and injects a custom httpx.Client with trust_env set according to the parameter; updates how the HTTP client is passed to the OpenAI/AzureOpenAI clients.
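
A rough sketch of that injection logic (simplified and illustrative; the helper name and signature below are not the actual constructor, only httpx_trust_env and the http_client argument come from the PR):

from openai import DefaultHttpxClient, OpenAI

def build_openai_client(api_key: str, httpx_trust_env: bool = True) -> OpenAI:
    # When httpx_trust_env is False, hand the SDK an httpx client that ignores
    # proxy environment variables and macOS proxy settings; otherwise pass None
    # so the SDK constructs its default client (the previous behavior).
    http_client = None if httpx_trust_env else DefaultHttpxClient(trust_env=False)
    return OpenAI(api_key=api_key, http_client=http_client)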

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant OpenAICompatible
    participant httpx.Client
    participant OpenAI/AzureOpenAI

    User->>OpenAICompatible: Instantiate (httpx_trust_env=True/False)
    OpenAICompatible->>OpenAICompatible: Check httpx_trust_env value
    alt httpx_trust_env is False
        OpenAICompatible->>httpx.Client: Create with trust_env=False
        OpenAICompatible->>OpenAI/AzureOpenAI: Pass custom http_client
    else httpx_trust_env is True
        OpenAICompatible->>OpenAI/AzureOpenAI: Use default http_client
    end

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~7 minutes

Suggested reviewers

  • Luodian

Poem

A toggle for proxies, a clever new switch,
Now OpenAICompatible can scratch every itch.
Trust the environment, or keep it at bay—
Just set the flag, and you’re on your way!
With code so precise, the review is a breeze,
The bunnies approve, with carrots and ease. 🥕



📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c05731a and 4a9f516.

📒 Files selected for processing (1)
  • lmms_eval/models/simple/openai_compatible.py (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • lmms_eval/models/simple/openai_compatible.py


@coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
lmms_eval/models/simple/openai_compatible.py (1)

42-52: Parameter implementation looks good with minor docstring format suggestion.

The new parameter follows coding guidelines with proper type hints, default value for backward compatibility, and comprehensive documentation.

Consider using standard docstring format for consistency:

-        """
-        :param httpx_trust_env: bool
-            httpx.Client used by openai-python has trust_env set to True by
-            default. A False value of this param constructs a httpx.Client with
-            trust_env set to False.  Such a httpx.Client ignores environment
-            variables (HTTP_PROXY, HTTPS_PROXY, ALL_PROXY) and macOS proxy
-            server settings.
-        """
+        """Initialize OpenAI compatible model.
+        
+        Args:
+            httpx_trust_env: Controls whether httpx.Client should trust environment
+                proxy settings. When False, creates a client that ignores environment
+                variables (HTTP_PROXY, HTTPS_PROXY, ALL_PROXY) and macOS proxy
+                server settings. Defaults to True for backward compatibility.
+        """
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0aaff1d and c05731a.

📒 Files selected for processing (1)
  • lmms_eval/models/simple/openai_compatible.py (3 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.py

📄 CodeRabbit Inference Engine (CLAUDE.md)

**/*.py: Type hints required for all code
Public APIs must have docstrings
Functions must be focused and small
Line length: 88 chars maximum
PEP 8 naming (snake_case for functions/variables)
Class names in PascalCase
Constants in UPPER_SNAKE_CASE
Document with docstrings
Use f-strings for formatting
Use early returns to avoid nested conditions
Use clear variable/function names (prefix handlers with "handle")
Use constants where possible instead of functions
Don't repeat yourself (DRY code)
Prefer functional, immutable approaches when not verbose
Define composing functions before their components
Mark issues in existing code with "TODO:" prefix
Use functional and stateless approaches where they improve clarity
Keep core logic clean and push implementation details to the edges
Ruff: Line length (88 chars), import sorting (I001), unused imports
Strings: use parentheses for line wrapping
Function calls: multi-line with proper indent
Imports: split into multiple lines
Explicit None checks for Optional
Type narrowing for strings
Ruff (Python) runs as pre-commit hook
Break strings with parentheses for line length
Multi-line function calls for line length
Split imports for line length
Add None checks for Optional types
Narrow string types
Document public APIs

Files:

  • lmms_eval/models/simple/openai_compatible.py
🔇 Additional comments (4)
lmms_eval/models/simple/openai_compatible.py (4)

24-24: LGTM!

The import of DefaultHttpxClient is correctly added and necessary for the new functionality.


76-76: LGTM!

The HTTP client creation logic correctly handles both cases:

  • When httpx_trust_env=True: Uses default behavior (http_client=None)
  • When httpx_trust_env=False: Creates custom client with trust_env=False

78-80: LGTM!

The HTTP client parameter is correctly passed to both OpenAI and AzureOpenAI constructors, enabling the proxy control functionality for both client types.


76-81: OpenAI Client http_client usage verified

Confirmed that the OpenAI Python library supports the http_client parameter on both OpenAI and AzureOpenAI constructors, and that DefaultHttpxClient is the library-provided httpx.Client. Your current implementation aligns with the documented usage—no changes required.
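
For reference, a minimal example of that documented pattern (the key, endpoint, and api_version values are placeholders):

from openai import AzureOpenAI, DefaultHttpxClient, OpenAI

# DefaultHttpxClient is the SDK's httpx.Client subclass; trust_env=False makes it
# ignore proxy environment variables and macOS proxy settings.
openai_client = OpenAI(
    api_key="sk-...",
    http_client=DefaultHttpxClient(trust_env=False),
)

azure_client = AzureOpenAI(
    api_key="...",
    api_version="2024-02-01",
    azure_endpoint="https://example.openai.azure.com",
    http_client=DefaultHttpxClient(trust_env=False),
)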

@kcz358 (Collaborator) commented Aug 7, 2025

Hi, I think you might also need to run pre-commit run --all-files to pass the lint check

@Luodian (Contributor) commented Aug 7, 2025

@yaojingguo Could you add some comments (similar to your messages in this PR) in the code to make people aware of the reason behind this change?

[screenshot attached: PixPin_2025-08-07_10-38-27]

@yaojingguo force-pushed the add-model-arg-for-trust-env branch from c05731a to 4a9f516 on August 7, 2025 06:09
@yaojingguo (Contributor, Author) commented

Updated as suggested. Please have a look.

@kcz358 @Luodian

@Luodian merged commit b4bc12a into EvolvingLMMs-Lab:main on Aug 12, 2025
2 checks passed