If vLLM successfully returns text (for generative models) or hidden states (for pooling models), it indicates that your model is supported.

Otherwise, please refer to [Adding a New Model](#new-model) for instructions on how to implement your model in vLLM.

Alternatively, you can [open an issue on GitHub](https://github.com/vllm-project/vllm/issues/new/choose) to request vLLM support.
#### Using a proxy

Here are some tips for loading/downloading models from Hugging Face using a proxy:

- Set the proxy globally for your session (or set it in your profile file):

  ```shell
  export http_proxy=http://your.proxy.server:port
  export https_proxy=http://your.proxy.server:port
  ```

- Set the proxy for just the current command:

  ```shell
  https_proxy=http://your.proxy.server:port huggingface-cli download <model_name>

  # or use the vllm CLI directly
  https_proxy=http://your.proxy.server:port vllm serve <model_name> --disable-log-requests
  ```

- Set the proxy in the Python interpreter (an end-to-end sketch follows this list):

  ```python
  import os

  os.environ['http_proxy'] = 'http://your.proxy.server:port'
  os.environ['https_proxy'] = 'http://your.proxy.server:port'
  ```
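To put the last tip together end to end, here is a minimal sketch that sets the proxy variables before any download is triggered and then loads a model through vLLM's Python API. The proxy address and the model name are illustrative placeholders.

```python
import os

# Set the proxy first, before any Hugging Face download is triggered.
os.environ['http_proxy'] = 'http://your.proxy.server:port'
os.environ['https_proxy'] = 'http://your.proxy.server:port'

from vllm import LLM

# The weight download for this model now goes through the proxy.
llm = LLM(model='facebook/opt-125m')
outputs = llm.generate(['Hello, my name is'])
print(outputs[0].outputs[0].text)
```

The same applies to `vllm serve`: the server process inherits the proxy variables exported in your shell.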
### ModelScope
To use models from [ModelScope](https://www.modelscope.cn) instead of Hugging Face Hub, set an environment variable:
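For example, a minimal sketch assuming the `VLLM_USE_MODELSCOPE` variable (the shell equivalent is `export VLLM_USE_MODELSCOPE=True`); set it before vLLM is imported or started:

```python
import os

# Switch model downloads from the Hugging Face Hub to ModelScope.
os.environ['VLLM_USE_MODELSCOPE'] = 'True'
```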