Conversation

@danielholanda danielholanda commented Feb 2, 2026

Description

This PR:

  • Enables LLAMA_CURL=ON for both the Windows and Ubuntu builds
  • Windows: installs curl via vcpkg and includes libcurl.dll in the distribution
  • Ubuntu: installs the libcurl4-openssl-dev build dependency
  • Adds curl functionality tests using the -mu (model URL) flag to verify that direct HuggingFace model downloads work
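The per-platform setup above can be sketched roughly as follows. This is a minimal illustration, not taken from this PR's CI scripts; the vcpkg triplet, toolchain path, and CMake invocations are assumptions based on the standard vcpkg and llama.cpp CMake workflows.

```shell
# Windows (vcpkg): install curl, then point CMake at the vcpkg toolchain
# so find_package(CURL) resolves to the vcpkg-provided library.
vcpkg install curl:x64-windows
cmake -B build -DLLAMA_CURL=ON ^
  -DCMAKE_TOOLCHAIN_FILE=%VCPKG_ROOT%\scripts\buildsystems\vcpkg.cmake

# Ubuntu: install the libcurl development headers, then configure
# with curl support enabled.
sudo apt-get install -y libcurl4-openssl-dev
cmake -B build -DLLAMA_CURL=ON
cmake --build build --config Release
```

Shipping libcurl.dll alongside the Windows binaries is needed because the vcpkg build links curl dynamically by default.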

This allows users to download GGUF models directly from URLs (e.g., HuggingFace) using llama-cli -mu, without needing to pre-download files.

To test, run:

llama-cli.exe -mu "https://huggingface.co/unsloth/Qwen3-0.6B-GGUF/resolve/main/Qwen3-0.6B-Q4_0.gguf" -ngl 99 -p "Hello" -n 5 -st

Closes #6

@danielholanda danielholanda self-assigned this Feb 2, 2026
@danielholanda danielholanda marked this pull request as ready for review February 4, 2026 22:05
@danielholanda danielholanda mentioned this pull request Feb 4, 2026


Development

Successfully merging this pull request may close these issues.

Curl Support
