[Question]: How to set HTTP proxy for calling AI providers? #6476
What features would you like to see added?
Does LibreChat support setting an HTTP proxy for outgoing API calls to AI providers?

More details
I'm attempting to configure LibreChat to route all API calls to AI models through a proxy server running on my host machine (macOS), and I've set the PROXY environment variable in my .env file. However, after restarting the Docker containers, the outgoing requests do not appear to go through the proxy. Please let me know if I am missing something.

Which components are impacted by your request?
General
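For illustration, the entry is along these lines; the address and port below are placeholders rather than the exact values from my setup (host.docker.internal is how containers reach the host on Docker Desktop for macOS):

```env
# Placeholder proxy endpoint — substitute the real host and port of the proxy
PROXY=http://host.docker.internal:8080
```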
Replies: 2 comments · 6 replies
Can you share which AI providers? Most are configured to use an HTTP proxy if one is set. If possible, please also share your librechat.yaml file.
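One quick sanity check, assuming the stock docker-compose.yml where the LibreChat service is named api and loads the .env file, is to confirm the variable actually reaches the container after editing it:

```sh
# Recreate the containers so changes to .env are picked up,
# then verify PROXY is visible inside the LibreChat container.
# ("api" is assumed to be the service name from the default compose file.)
docker compose up -d --force-recreate
docker compose exec api printenv PROXY
```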
Hello @danny-avila, I'm new to LibreChat. I haven't made any changes to the .yaml file, so it's still the default one. I would like to set an HTTP proxy for the following providers: OpenAI, Google, AWS Bedrock, and Anthropic. Could you please let me know if there's a global configuration option for this?
Hi @danny-avila,
I tested the <PROVIDER>_REVERSE_PROXY option for Google, OpenAI, and Anthropic, and it worked perfectly. Thank you for your kind support!
I was also wondering whether it would be possible to take this further: my goal is to monitor the AI usage of the enterprise's AD (Active Directory) users.
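For reference, the variables I set look like the following; the gateway address is a placeholder rather than the exact endpoint from my environment, and the paths may differ per provider:

```env
# Placeholder reverse-proxy/gateway endpoints — substitute the real gateway URL
OPENAI_REVERSE_PROXY=http://host.docker.internal:8080/v1
ANTHROPIC_REVERSE_PROXY=http://host.docker.internal:8080
GOOGLE_REVERSE_PROXY=http://host.docker.internal:8080
```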