
Commit 14b602a

reidliu41 authored and Yuqi Zhang committed
[doc] add download/list/delete HF model CLI usage (vllm-project#17940)
Signed-off-by: reidliu41 <[email protected]>
Co-authored-by: reidliu41 <[email protected]>
Signed-off-by: Yuqi Zhang <[email protected]>
1 parent b31f33e commit 14b602a

File tree

1 file changed: +60 -0 lines changed


docs/source/models/supported_models.md

Lines changed: 60 additions & 0 deletions
@@ -168,6 +168,66 @@ If vLLM successfully returns text (for generative models) or hidden states (for
Otherwise, please refer to [Adding a New Model](#new-model) for instructions on how to implement your model in vLLM.
Alternatively, you can [open an issue on GitHub](https://github.com/vllm-project/vllm/issues/new/choose) to request vLLM support.

#### Download a model

If you prefer, you can use the Hugging Face CLI to [download a model](https://huggingface.co/docs/huggingface_hub/guides/cli#huggingface-cli-download) or specific files from a model repository:

```console
# Download a model
huggingface-cli download HuggingFaceH4/zephyr-7b-beta

# Specify a custom cache directory
huggingface-cli download HuggingFaceH4/zephyr-7b-beta --cache-dir ./path/to/cache

# Download a specific file from a model repo
huggingface-cli download HuggingFaceH4/zephyr-7b-beta eval_results.json
```
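
If you prefer to script the download from Python, the same repository and file can be fetched with the `huggingface_hub` library. This is a minimal sketch, not part of the CLI guide itself; the repo ID, cache directory, and file name simply mirror the CLI examples above:

```python
# Minimal sketch using the huggingface_hub Python API.
# The repo ID, cache directory and file name mirror the CLI examples above.
from huggingface_hub import hf_hub_download, snapshot_download

# Download the whole model repository (omit cache_dir to use the default cache)
model_path = snapshot_download(
    repo_id="HuggingFaceH4/zephyr-7b-beta",
    cache_dir="./path/to/cache",
)

# Download a single file from the model repo
results_path = hf_hub_download(
    repo_id="HuggingFaceH4/zephyr-7b-beta",
    filename="eval_results.json",
)

print(model_path)
print(results_path)
```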

#### List the downloaded models

Use the Hugging Face CLI to [manage models](https://huggingface.co/docs/huggingface_hub/guides/manage-cache#scan-your-cache) stored in the local cache:

```console
# List cached models
huggingface-cli scan-cache

# Show detailed (verbose) output
huggingface-cli scan-cache -v

# Specify a custom cache directory
huggingface-cli scan-cache --dir ~/.cache/huggingface/hub
```
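
The same scan is available from Python through `scan_cache_dir()` in `huggingface_hub`. A minimal sketch that just prints each cached repo and its size on disk, using the attribute names from the `huggingface_hub` cache-management API:

```python
# Minimal sketch: inspect the local Hugging Face cache programmatically.
from huggingface_hub import scan_cache_dir

cache_info = scan_cache_dir()  # pass cache_dir=... to scan a custom location
for repo in cache_info.repos:
    print(f"{repo.repo_id}: {repo.size_on_disk_str}, {len(repo.revisions)} revision(s)")
print(f"Total size on disk: {cache_info.size_on_disk} bytes")
```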

#### Delete a cached model

Use the Hugging Face CLI to interactively [delete downloaded models](https://huggingface.co/docs/huggingface_hub/guides/manage-cache#clean-your-cache) from the cache:

```console
# The `delete-cache` command requires extra dependencies to work with the TUI.
# Please run `pip install huggingface_hub[cli]` to install them.

# Launch the interactive TUI to select models to delete
$ huggingface-cli delete-cache
? Select revisions to delete: 1 revisions selected counting for 438.9M.
  ○ None of the following (if selected, nothing will be deleted).
Model BAAI/bge-base-en-v1.5 (438.9M, used 1 week ago)
  ◉ a5beb1e3: main # modified 1 week ago

Model BAAI/bge-large-en-v1.5 (1.3G, used 1 week ago)
  ○ d4aa6901: main # modified 1 week ago

Model BAAI/bge-reranker-base (1.1G, used 4 weeks ago)
  ○ 2cfc18c9: main # modified 4 weeks ago

Press <space> to select, <enter> to validate and <ctrl+c> to quit without modification.

# Confirm the selection before the deletion runs
? Select revisions to delete: 1 revision(s) selected.
? 1 revisions selected counting for 438.9M. Confirm deletion ? Yes
Start deletion.
Done. Deleted 1 repo(s) and 0 revision(s) for a total of 438.9M.
```
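
For scripted, non-interactive cleanup, the cache can also be pruned through the `huggingface_hub` Python API instead of the TUI. A minimal sketch, assuming you have already looked up the revision hash you want to remove; the hash below is only illustrative, taken from the example output above, and in practice you would pass the full hash reported by `huggingface-cli scan-cache -v`:

```python
# Minimal sketch: delete a cached revision without the interactive TUI.
from huggingface_hub import scan_cache_dir

cache_info = scan_cache_dir()
# Replace with the full revision hash of the model you want to remove.
strategy = cache_info.delete_revisions("a5beb1e3")
print(f"Will free {strategy.expected_freed_size_str}")
strategy.execute()
```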

#### Using a proxy

Here are some tips for loading/downloading models from Hugging Face using a proxy:
