-
Notifications
You must be signed in to change notification settings - Fork 2.5k
fix: add proxy support for OpenAI-compatible providers #7574
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
- Add proxy configuration utility that respects HTTP_PROXY, HTTPS_PROXY, and NO_PROXY environment variables - Update BaseOpenAiCompatibleProvider to use proxy configuration when available - Update OpenAiHandler to use proxy configuration for all OpenAI client variants (standard, Azure, Azure AI Inference) - Add comprehensive tests for proxy configuration handling Fixes #7573 - API connection errors with self-hosted vLLM behind corporate proxy
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
I wrote this code and even I can see it's only half-baked.
Critical Issues
1. Incomplete proxy support across providers
The PR only adds proxy configuration to BaseOpenAiCompatibleProvider and OpenAiHandler, but several other providers that create OpenAI clients directly are missing proxy support:
ollama.ts(line 37-42) - Creates OpenAI client without proxy configlm-studio.ts(line 27-31) - Creates OpenAI client without proxy configopenai-native.ts(line 63) - Creates OpenAI client without proxy configopenrouter.ts(line 96) - Creates OpenAI client without proxy confighuggingface.ts(line 25-29) - Creates OpenAI client without proxy config
Without adding proxy support to these providers, users behind corporate proxies still won't be able to connect to these services.
2. Missing dependency declaration
The https-proxy-agent package is used via require() but I don't see it declared in package.json dependencies. This will cause runtime errors if the package isn't installed.
Important Suggestions
3. Dynamic require usage
Using require() for dynamic imports (line 11 in proxy-config.ts) could cause bundling issues. Consider using dynamic import() or ensuring the module is properly handled by the bundler.
4. Console.warn in production
The warning messages (lines 14 and 57 in proxy-config.ts) should use a proper logging mechanism rather than console.warn.
Minor Improvements
5. Test coverage for edge cases
While the tests are comprehensive, consider adding tests for:
- Proxy URLs with authentication (e.g.,
http://user:[email protected]:8080) - IPv6 addresses in NO_PROXY
- Multiple proxy URLs in sequence
6. Documentation
Consider adding JSDoc comments to explain the proxy configuration behavior and environment variable precedence.
Summary
This PR makes a good start at addressing the proxy issue, but needs to be extended to all providers that create OpenAI clients. The missing dependency also needs to be added to package.json.
| // Dynamic import to avoid bundling issues | ||
| let HttpsProxyAgent: any | ||
| try { | ||
| HttpsProxyAgent = require("https-proxy-agent").HttpsProxyAgent |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Is using require() intentional here? Dynamic requires can cause issues with bundlers. Consider using dynamic import or ensuring the bundler handles this correctly. Also, is https-proxy-agent listed in package.json dependencies?
| HttpsProxyAgent = require("https-proxy-agent").HttpsProxyAgent | ||
| } catch (error) { | ||
| // If the module is not available, return undefined | ||
| console.warn("https-proxy-agent module not available, proxy support disabled") |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Consider using a proper logging mechanism instead of console.warn. In production, these warnings might not be visible or could clutter logs.
|
Closing this PR for now. See #7573 (comment) for discussion on the root cause and next steps. We need more information to design a comprehensive solution that covers all providers properly. |
This PR attempts to address Issue #7573 where users behind corporate proxies experience connection errors with self-hosted vLLM OpenAI-compatible APIs after upgrading from version 3.20.3.
Problem
After version 3.20.3, the OpenAI SDK (v5+) requires explicit proxy configuration when working behind corporate proxies. The application was not passing proxy configuration to the OpenAI client, causing connection failures.
Solution
This PR adds comprehensive proxy support by:
1. New Proxy Configuration Utility
src/api/providers/utils/proxy-config.tsthat:HTTP_PROXY,HTTPS_PROXY,NO_PROXY)*.internal.com)https-proxy-agentmodule2. Integration with OpenAI Providers
BaseOpenAiCompatibleProviderto use proxy configurationOpenAiHandlerto apply proxy settings for all client types:3. Comprehensive Testing
Testing
Impact
This fix enables users behind corporate proxies to connect to their self-hosted vLLM and other OpenAI-compatible APIs, restoring functionality that was working in v3.20.3.
Fixes #7573
Feedback and guidance are welcome!
Important
Adds proxy support for OpenAI-compatible providers, restoring connectivity for users behind corporate proxies.
getProxyConfiginproxy-config.ts.BaseOpenAiCompatibleProviderandOpenAiHandler.getProxyConfigrespectsHTTP_PROXY,HTTPS_PROXY,NO_PROXYenvironment variables, supports case sensitivity, and handles invalid URLs.https-proxy-agentif available.proxy-config.spec.tsfor proxy configuration, NO_PROXY patterns, case sensitivity, and error handling.This description was created by
for dc42ec0. You can customize this summary. It will automatically update as commits are pushed.