Releases · Ivy233/llama.cpp
b5935
b5787
Add Conv2d for CPU (#14388)
* Conv2D: Add CPU version
* Half decent
* Tiled approach for F32
* remove file
* Fix tests
* Support F16 operations
* add assert about size
* Review: further formatting fixes, add assert and use CPU version of fp32->fp16
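For context on the release above, here is a minimal reference sketch of a direct (non-tiled) F32 Conv2D on the CPU. The function name, NCHW layout, and stride/padding handling are assumptions for illustration only, not the tiled ggml CPU kernel that the release adds.

```cpp
// Illustrative direct Conv2D over F32 data (assumed NCHW layout, single batch).
// Reference sketch only; the release's CPU kernel uses a tiled approach.
#include <cstdio>
#include <vector>

// out[oc][oy][ox] = sum over ic,ky,kx of in[ic][iy][ix] * w[oc][ic][ky][kx]
static void conv2d_f32(const float * in, const float * w, float * out,
                       int IC, int IH, int IW,   // input channels / height / width
                       int OC, int KH, int KW,   // output channels, kernel size
                       int stride, int pad) {
    const int OH = (IH + 2*pad - KH) / stride + 1;
    const int OW = (IW + 2*pad - KW) / stride + 1;
    for (int oc = 0; oc < OC; ++oc) {
        for (int oy = 0; oy < OH; ++oy) {
            for (int ox = 0; ox < OW; ++ox) {
                float acc = 0.0f;
                for (int ic = 0; ic < IC; ++ic) {
                    for (int ky = 0; ky < KH; ++ky) {
                        for (int kx = 0; kx < KW; ++kx) {
                            const int iy = oy*stride + ky - pad;
                            const int ix = ox*stride + kx - pad;
                            if (iy < 0 || iy >= IH || ix < 0 || ix >= IW) continue;
                            acc += in[(ic*IH + iy)*IW + ix] *
                                   w[((oc*IC + ic)*KH + ky)*KW + kx];
                        }
                    }
                }
                out[(oc*OH + oy)*OW + ox] = acc;
            }
        }
    }
}

int main() {
    // 1 input channel, 4x4 input, 1 output channel, 3x3 kernel, stride 1, pad 1
    std::vector<float> in(1*4*4, 1.0f), w(1*1*3*3, 1.0f), out(1*4*4, 0.0f);
    conv2d_f32(in.data(), w.data(), out.data(), 1, 4, 4, 1, 3, 3, /*stride=*/1, /*pad=*/1);
    printf("out[0][1][1] = %.1f\n", out[(0*4 + 1)*4 + 1]); // 9.0: full 3x3 window of ones
    return 0;
}
```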
b5538
cmake: Guard GGML_CPU_ALL_VARIANTS by architecture (#13890)
b5516
opencl: add new ops - `argsort`, `div`, `sub`, `addrows`, `sigmoid`, …
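As a reference for two of the ops listed in the release above, the sketch below shows the expected element-wise semantics of `sigmoid` and the per-row semantics of `argsort` in plain C++. It only illustrates what the ops compute; the release itself implements them as OpenCL kernels in the ggml backend.

```cpp
// Plain C++ reference for sigmoid and argsort semantics (illustration only).
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <numeric>
#include <vector>

// sigmoid(x) = 1 / (1 + e^-x), applied element-wise
static std::vector<float> sigmoid(const std::vector<float> & x) {
    std::vector<float> y(x.size());
    for (size_t i = 0; i < x.size(); ++i) {
        y[i] = 1.0f / (1.0f + std::exp(-x[i]));
    }
    return y;
}

// argsort: indices that would sort the row in ascending order
static std::vector<int> argsort(const std::vector<float> & x) {
    std::vector<int> idx(x.size());
    std::iota(idx.begin(), idx.end(), 0);
    std::sort(idx.begin(), idx.end(), [&](int a, int b) { return x[a] < x[b]; });
    return idx;
}

int main() {
    const std::vector<float> x = {0.5f, -1.0f, 2.0f};
    const auto s = sigmoid(x);
    const auto a = argsort(x);
    printf("sigmoid: %.3f %.3f %.3f\n", s[0], s[1], s[2]);
    printf("argsort: %d %d %d\n", a[0], a[1], a[2]); // 1 0 2
    return 0;
}
```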
b5350
mtmd : Use RMS norm for InternVL 3 38B and 78B mmproj (#13459)
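The release above switches the InternVL mmproj to RMS normalization. For reference, RMS norm scales each element by the reciprocal root-mean-square of the vector, y_i = x_i / sqrt(mean(x^2) + eps), usually followed by a learned per-channel weight. A plain C++ sketch (not the ggml kernel):

```cpp
// Reference RMS norm: y_i = x_i / sqrt(mean(x^2) + eps) * weight_i
#include <cmath>
#include <cstdio>
#include <vector>

static void rms_norm(const float * x, const float * weight, float * y, int n, float eps) {
    float sum_sq = 0.0f;
    for (int i = 0; i < n; ++i) sum_sq += x[i] * x[i];
    const float scale = 1.0f / std::sqrt(sum_sq / n + eps);
    for (int i = 0; i < n; ++i) y[i] = x[i] * scale * weight[i];
}

int main() {
    const std::vector<float> x = {1.0f, 2.0f, 3.0f, 4.0f};
    const std::vector<float> w(4, 1.0f); // identity weight for the example
    std::vector<float> y(4);
    rms_norm(x.data(), w.data(), y.data(), 4, 1e-6f);
    printf("%.3f %.3f %.3f %.3f\n", y[0], y[1], y[2], y[3]);
    return 0;
}
```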
b5307
docker : disable arm64 and intel images (#13356)
b5219
llama : llm_type order by size (#13177)
b5195
common : add common_remote_get_content (#13123)
* common : add common_remote_get_content
* support max size and timeout
* add tests
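The release above adds a helper for fetching remote content with a size cap and timeout. Below is a hypothetical sketch of that idea using libcurl directly; the helper name, signature, and parameters shown are assumptions for illustration and may not match the actual `common_remote_get_content` in `common/`.

```cpp
// Hypothetical sketch: fetch a URL with a timeout and a maximum response size.
// Illustration only; not the actual common_remote_get_content implementation.
#include <curl/curl.h>
#include <cstdio>
#include <string>

struct fetch_ctx {
    std::string data;
    size_t      max_size;
};

// write callback: abort the transfer once the size cap would be exceeded
static size_t on_write(char * ptr, size_t size, size_t nmemb, void * userdata) {
    auto * ctx = static_cast<fetch_ctx *>(userdata);
    const size_t n = size * nmemb;
    if (ctx->data.size() + n > ctx->max_size) {
        return 0; // returning a short count makes curl fail the transfer
    }
    ctx->data.append(ptr, n);
    return n;
}

static bool remote_get_content(const std::string & url, long timeout_ms,
                               size_t max_size, std::string & out) {
    CURL * curl = curl_easy_init();
    if (!curl) return false;
    fetch_ctx ctx{ "", max_size };
    curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
    curl_easy_setopt(curl, CURLOPT_TIMEOUT_MS, timeout_ms);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, on_write);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &ctx);
    const CURLcode res = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    if (res != CURLE_OK) return false;
    out = std::move(ctx.data);
    return true;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    std::string body;
    if (remote_get_content("https://example.com", /*timeout_ms=*/5000,
                           /*max_size=*/1 << 20, body)) {
        printf("fetched %zu bytes\n", body.size());
    }
    curl_global_cleanup();
    return 0;
}
```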
b5171
rpc : add command line option for number of threads for the CPU backe…
b5145
opencl: fix incorrect local_size index in profiling log (#12868)