Releases: standby24x7/llama_fix.cpp

b3499

01 Aug 11:03
c8a0090

cann: support q8_0 for Ascend backend (#8805)

b3486

29 Jul 04:36
0832de7

[SYCL] add conv support (#8688)

b3432

22 Jul 04:51
45f2c19

flake.lock: Update (#8610)

b3416

19 Jul 04:39
a15ef8f

CUDA: fix partial offloading for ne0 % 256 != 0 (#8572)

b3398

15 Jul 15:45
8fac431

ggml : suppress unknown pragma 'GCC' on windows (#8460)

This commit adds a macro guard around `#pragma GCC` to avoid the following
warning on Windows:

```console
C:\llama.cpp\ggml\src\ggml-aarch64.c(17,9): warning C4068:
unknown pragma 'GCC' [C:\llama.cpp\build\ggml\src\ggml.vcxproj]
```
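
The guard simply hides the GCC-specific pragma from compilers that do not understand it. Below is a minimal sketch of the pattern, not the exact diff from #8460; it assumes the guard keys off `__GNUC__` (which MSVC does not define), and the suppressed warning name is illustrative only.

```c
/* pragma_guard.c - minimal sketch of a macro guard around a GCC pragma.
 * Assumption: guarding on __GNUC__ hides "#pragma GCC ..." from MSVC,
 * so cl.exe never emits warning C4068 (unknown pragma).
 * The suppressed warning (-Wunused-parameter) is illustrative only. */
#include <stdio.h>

#if defined(__GNUC__)
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wunused-parameter"
#endif

/* function whose unused parameter would otherwise warn under GCC -Wextra */
static int helper(int unused_arg) {
    return 7;
}

#if defined(__GNUC__)
#pragma GCC diagnostic pop
#endif

int main(void) {
    printf("helper() = %d\n", helper(0));
    return 0;
}
```

With this pattern the file builds cleanly under GCC and Clang (which honor the diagnostic pragmas) as well as MSVC (which never sees them).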

b3372

11 Jul 12:05
a977c11

gitignore : deprecated binaries

b3368

11 Jul 00:26
dd07a12

Name Migration: Build the deprecation-warning 'main' binary every tim…

b3358

10 Jul 05:01
a59f8fd

Server: Enable setting default sampling parameters via command-line (…

b3346

08 Jul 12:59
3f2d538

scripts : fix sync for sycl