
Commit 52f93f2

fix readme
1 parent 8925da2 commit 52f93f2

File tree: README.md, common/arg.cpp

2 files changed: +4 -8 lines changed

README.md

Lines changed: 3 additions & 7 deletions

@@ -265,13 +265,9 @@ The [Hugging Face](https://huggingface.co) platform hosts a [number of LLMs](htt
 - [Trending](https://huggingface.co/models?library=gguf&sort=trending)
 - [LLaMA](https://huggingface.co/models?sort=trending&search=llama+gguf)
 
-You can either manually download the GGUF file or directly use any `llama.cpp`-compatible models from Hugging Face by using this CLI argument: `-hf <user>/<model>[:quant]`
-
-Altenatively, model can be fetched from [ModelScope](https://www.modelscope.cn) with CLI argument of `-ms <user>/<model>[:quant]`, for example, `llama-cli -ms Qwen/QwQ-32B-GGUF`. You may find models on ModelScope compatible with `llama.cpp` through:
-
-- [Trending] https://www.modelscope.cn/models?libraries=GGUF
-
-> You can change the download endpoint of ModelScope by using `MODELSCOPE_DOMAIN=xxx`(like MODELSCOPE_DOMAIN=www.modelscope.ai).
+LLAMA.CPP has supported a environment variable `HF_ENDPOINT`, you can set this to change the downloading url:
+- By default, HF_ENDPOINT=https://huggingface.co/
+- To use ModelScope, you can change to HF_ENDPOINT=https://www.modelscope.cn/
 
 After downloading a model, use the CLI tools to run it locally - see below.
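For context, here is a minimal C++ sketch of the behaviour the new README lines describe: read `HF_ENDPOINT` from the environment and fall back to the default Hugging Face URL when it is unset. This is illustrative only and is not the download code in llama.cpp; the helper name `resolve_endpoint` and the trailing-slash normalization are assumptions made for the example.

```cpp
#include <cstdlib>
#include <iostream>
#include <string>

// Hypothetical helper, not llama.cpp code: pick the download endpoint the way the
// README describes, preferring the HF_ENDPOINT environment variable when it is set.
static std::string resolve_endpoint() {
    const char * env = std::getenv("HF_ENDPOINT");         // e.g. https://www.modelscope.cn/
    std::string endpoint = env ? env : "https://huggingface.co/";
    if (!endpoint.empty() && endpoint.back() != '/') {
        endpoint += '/';                                    // normalize the trailing slash
    }
    return endpoint;
}

int main() {
    std::cout << "model downloads would use: " << resolve_endpoint() << "\n";
    return 0;
}
```

Running this with `HF_ENDPOINT=https://www.modelscope.cn/` set in the environment prints the ModelScope URL instead of the default.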
common/arg.cpp

Lines changed: 1 addition & 1 deletion

@@ -2377,7 +2377,7 @@ common_params_context common_params_parser_init(common_params & params, llama_ex
         ? std::string("model path from which to load base model")
         : string_format(
             "model path (default: `models/$filename` with filename from `--hf-file` "
-            "or `--model-url` if set, otherwise %s), or with a protocol: hf://model-id, ms://model-id", DEFAULT_MODEL_PATH
+            "or `--model-url` if set, otherwise %s), or with a protocol prefix: hf://model-id, ms://model-id", DEFAULT_MODEL_PATH
         ),
         [](common_params & params, const std::string & value) {
             params.model.path = value;
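For reference, a self-contained C++ sketch of the protocol-prefix idea the updated help string mentions (`hf://model-id`, `ms://model-id`): strip a known prefix from the model argument and keep the remainder as the model id. The parsing below, the function name `split_protocol`, and the sample plain path `models/my-model.gguf` are assumptions for illustration only; this is not the logic in common/arg.cpp.

```cpp
#include <iostream>
#include <string>
#include <utility>

// Illustrative only (not common/arg.cpp): split a model argument such as
// "hf://Qwen/QwQ-32B-GGUF" into a protocol prefix and the remaining model id.
static std::pair<std::string, std::string> split_protocol(const std::string & arg) {
    for (const char * prefix : {"hf://", "ms://"}) {
        const std::string p = prefix;
        if (arg.compare(0, p.size(), p) == 0) {             // arg starts with the prefix
            return {p, arg.substr(p.size())};
        }
    }
    return {"", arg};                                        // treat as a plain path
}

int main() {
    for (const char * arg : {"hf://Qwen/QwQ-32B-GGUF", "ms://Qwen/QwQ-32B-GGUF", "models/my-model.gguf"}) {
        const auto [proto, id] = split_protocol(arg);
        std::cout << arg << " -> protocol='" << proto << "' model='" << id << "'\n";
    }
    return 0;
}
```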
