server : enable cache_prompt by default #16990
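For context, `cache_prompt` is a per-request field of llama-server's `/completion` endpoint that lets the server reuse the KV cache for a prompt prefix shared with the previous request; this PR flips its default to `true`. Below is a minimal client sketch, not taken from the PR, assuming a locally running llama-server on its default host and port; the prompt text and parameter values are illustrative only.

```python
# Sketch: send a completion request to a local llama-server and set
# cache_prompt explicitly. After this change the server treats
# cache_prompt as true by default, so the field could be omitted;
# it is spelled out here only to show what the new default covers.
import json
import urllib.request

# Assumption: llama-server is listening on its default address.
URL = "http://127.0.0.1:8080/completion"

payload = {
    "prompt": "Explain KV-cache reuse in one sentence.",
    "n_predict": 64,
    # Previously defaulted to false; repeated requests sharing a
    # prefix can now reuse the cached prompt without opting in.
    "cache_prompt": True,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["content"])
```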
build.yml
on: pull_request

Matrix jobs: windows-latest-cmake-cuda, windows-latest-cmake-hip-release, windows-latest-cmake, macOS-latest-swift, ubuntu-latest-cmake-sanitizer, windows-msys2

| Job | Duration |
|---|---|
| macOS-latest-cmake-arm64 | 11m 44s |
| macOS-latest-cmake-x64 | 5m 32s |
| ubuntu-focal-make | 3m 40s |
| ubuntu-latest-cmake | 2m 28s |
| macOS-latest-make | 3m 8s |
| macOS-latest-cmake | 11m 33s |
| ubuntu-focal-make-curl | 2m 37s |
| ubuntu-latest-cmake-rpc | 2m 30s |
| ubuntu-22-cmake-vulkan | 2m 48s |
| ubuntu-22-cmake-hip | 18m 11s |
| ubuntu-22-cmake-musa | 11m 39s |
| ubuntu-22-cmake-sycl | 4m 43s |
| ubuntu-22-cmake-sycl-fp16 | 4m 37s |
| macOS-latest-cmake-ios | 1m 29s |
| macOS-latest-cmake-tvos | 1m 21s |
| windows-latest-cmake-sycl | 12m 2s |
| windows-latest-cmake-hip | 30m 6s |
| ios-xcode-build | 2m 19s |
| android-build | 6m 9s |
| release | 0s |
Annotations
1 error: windows-latest-cmake (avx512-x64, -DGGML_NATIVE=OFF -DLLAMA_BUILD_SERVER=ON -DGGML_RPC=ON -DGGML_...): Process completed with exit code 1.
Artifacts
Produced during runtime

| Name | Size | Digest |
|---|---|---|
| llama-bin-win-sycl-x64.zip (Expired) | 89.1 MB | sha256:5a2a9b4eb8222bc0320b7bf41ad7b08284accd6910a1ef3bc192731880b56f75 |