Releases: AD2605/llama.cpp

b5303 (07 May 12:16, commit bc4e112)

llama : deci : support ffn-free with attention (#13296)

b5283 (05 May 12:30, commit 5215b91)

clip : fix confusing naming of ffn_up and ffn_down (#13290)

* clip : fix confusing naming of ffn_up and ffn_down
* remove the ffn_i/o/g naming
* rename n_embd, n_ff
* small fix
* drop check on n_ff

b5259 (02 May 10:21, commit 2af6880)

llama-chat : reset glmedge chat template (#13253)

* reset glmedge chat template
* fix glmedge chat template

b5184 (24 Apr 16:27)

ggml : fix trailing whitespace (#0)

b5180 (24 Apr 14:13, commit 13b4548)

cmake : do not include ./src as public for libllama (#13062)

* cmake : do not include ./src as public for libllama
* cmake : rework tests
* llguidance : remove unicode include
* cmake : make c++17 private
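The change above narrows what libllama exposes to its consumers: its src/ directory and its C++17 requirement become private build details. As a hedged sketch of the PUBLIC-vs-PRIVATE pattern (the target name llama and the include/ vs src/ layout follow llama.cpp's conventions, but these exact lines are illustrative, not the actual diff):

```cmake
# Headers under src/ are implementation details: marking them PRIVATE means
# projects linking against libllama only see the public include/ directory.
target_include_directories(llama
    PUBLIC  ${CMAKE_CURRENT_SOURCE_DIR}/include
    PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src)

# Likewise, requiring C++17 PRIVATE-ly compiles libllama itself as C++17
# without forcing that standard onto downstream projects.
target_compile_features(llama PRIVATE cxx_std_17)
```

The practical effect is that downstream CMake projects no longer inherit internal include paths or a compiler-standard requirement they did not ask for.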

b5064 (07 Apr 12:45, commit bd3f59f)

cmake : enable curl by default (#12761)

* cmake : enable curl by default
* no curl if no examples
* fix build
* fix build-linux-cross
* add windows-setup-curl
* fix
* shell
* fix path
* fix windows-latest-cmake*
* run: include_directories
* LLAMA_RUN_EXTRA_LIBS
* sycl: no llama_curl
* no test-arg-parser on windows
* clarification
* try riscv64 / arm64
* windows: include libcurl inside release binary
* add msg
* fix mac / ios / android build
* will this fix xcode?
* try clearing the cache
* add bunch of licenses
* revert clear cache
* fix xcode
* fix xcode (2)
* fix typo
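Flipping curl on by default is, in CMake terms, a one-word change to an option's default, plus a guarded dependency lookup. A minimal sketch (the LLAMA_CURL flag matches the project's configure option; the target name common and the LLAMA_USE_CURL define are assumptions made for illustration):

```cmake
# Default ON: HTTP model downloads work out of the box.
# Users without libcurl can still configure with -DLLAMA_CURL=OFF.
option(LLAMA_CURL "llama: use libcurl to download models from a URL" ON)

if (LLAMA_CURL)
    find_package(CURL REQUIRED)
    # Link curl into the common helper library and expose a compile-time
    # switch so the download code path can be #ifdef'd on it.
    target_link_libraries(common PRIVATE ${CURL_LIBRARIES})
    target_compile_definitions(common PRIVATE LLAMA_USE_CURL)
endif()
```

The long bullet trail above reflects the cost of that default: once curl is required out of the box, every CI platform (Windows, macOS, iOS, Android, riscv64/arm64 cross builds, SYCL) needs libcurl provisioned or the option explicitly disabled.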