
ggml-backend : add GGML_BACKEND_DEVICE_TYPE_IGPU device type (#15797) #13

Triggered via push on September 11, 2025 at 23:07
Status: Success
Total duration: 51m 48s
Artifacts: 13

release.yml

on: push
Matrix: ubuntu-22-cpu
Matrix: windows-cpu
Matrix: windows-cuda
Matrix: windows-hip
Matrix: windows

Annotations

11 warnings
windows (opencl-adreno, arm64, -G "Ninja Multi-Config" -D CMAKE_TOOLCHAIN_FILE=cmake/arm64-window...
Cache not found for keys: ccache-windows-latest-cmake-opencl-adreno-arm64-
macOS-arm64
Cache not found for keys: ccache-macOS-latest-cmake-arm64-
windows-cpu (arm64)
Cache not found for keys: ccache-windows-latest-cmake-cpu-arm64-
ubuntu-22-cpu (x64, ubuntu-22.04)
Cache not found for keys: ccache-ubuntu-cpu-cmake-
windows (vulkan, x64, -DGGML_VULKAN=ON, ggml-vulkan)
Cache not found for keys: ccache-windows-latest-cmake-vulkan-x64-
windows-cpu (x64)
Cache not found for keys: ccache-windows-latest-cmake-cpu-x64-
ubuntu-22-vulkan
Cache not found for keys: ccache-ubuntu-22-cmake-vulkan-
macOS-x64
Cache not found for keys: ccache-macOS-latest-cmake-x64-
windows-sycl
Cache not found for keys: ccache-windows-latest-cmake-sycl-
windows-cuda (12.4)
Cache not found for keys: ccache-windows-cuda-12.4-
windows-hip (radeon, gfx1100;gfx1101;gfx1102;gfx1030;gfx1031;gfx1032)
Cache not found for keys: ccache-windows-latest-cmake-hip-radeon-x64-

Artifacts

Produced during runtime
Name                                    Size     Digest
cudart-llama-bin-win-cuda-12.4-x64.zip  372 MB   sha256:7c48355945fa55b84d6f03c5e69c6ea275d7ee5bb7802e25bc1e6686153e3270
llama-b6451-xcframework                 84.3 MB  sha256:889b2970ce3aaa69f2edea08435855d733992987d4e8feb7038215ba4f0567b5
llama-bin-macos-arm64.zip               11.2 MB  sha256:6918cd03ce8abdd7e59563b9dedb608e891d7d2d89442b2c246c2b3c2584cd8c
llama-bin-macos-x64.zip                 29.2 MB  sha256:079aa3d965753270f74af68202fc1bc21d6f17fdfc68d3903e3b69c65a55e4af
llama-bin-ubuntu-vulkan-x64.zip         25.3 MB  sha256:7a0adce90178669c3bc6e9565ebd1a4ed0e0763b10c171c4b442733c36c6f09d
llama-bin-ubuntu-x64.zip                13.1 MB  sha256:488cbbb9544b24d9f3971976ccfa3740ca096f75ec5e49f3264a76db36d9db8b
llama-bin-win-cpu-arm64.zip             11.3 MB  sha256:ae517c1bfc3e9013f0cd274941bf76f225343ca6ef2d3daaeff001f929855b2c
llama-bin-win-cpu-x64.zip               14.3 MB  sha256:a763e8249a11172ee82f68a5c47be95b83cbc80d71ff5f8b491f9037bcaa9e7a
llama-bin-win-cuda-12.4-x64.zip         131 MB   sha256:3b3fdc2a10e0e88c304bb9a7693c4885c852458e19b8a06ff5e2f8cfa111bb0b
llama-bin-win-hip-radeon-x64.zip        264 MB   sha256:4aa0835ac5a20607196aa6015066c95dd3f94c70a2732beaf69251441abb8696
llama-bin-win-opencl-adreno-arm64.zip   110 KB   sha256:bec12d297701960699e4a60e10d13702e56544376825bc43c2cc17313d5cc708
llama-bin-win-sycl-x64.zip              80 MB    sha256:1689e512ddd7d9df882365ad02c187dce1d4dfa1286770581ff06699f0a79656
llama-bin-win-vulkan-x64.zip            11.4 MB  sha256:243e69c956b786a9a730694a558525ec76650122bf818b4ac1cecca6995d1ba7
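
Each artifact above is published with a SHA-256 digest, so a download can be checked before it is unpacked. Below is a minimal verification sketch in Python; the file name and expected digest are copied from the table above, but the local path is an assumption about where the archive was saved.

```python
import hashlib
from pathlib import Path

# Expected digest as listed for llama-bin-win-cpu-x64.zip in the table above.
EXPECTED = "a763e8249a11172ee82f68a5c47be95b83cbc80d71ff5f8b491f9037bcaa9e7a"

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large archives need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Assumed local path; adjust to wherever the artifact was downloaded.
    artifact = Path("llama-bin-win-cpu-x64.zip")
    actual = sha256_of(artifact)
    if actual == EXPECTED:
        print(f"{artifact.name}: digest matches")
    else:
        print(f"{artifact.name}: MISMATCH\n  expected {EXPECTED}\n  got      {actual}")
```

The same check applies to any of the other artifacts by swapping in its file name and the corresponding digest from the table.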