
Commit dd3d5c6

ci : add T4 runner
1 parent 3848517

1 file changed

.github/workflows/build.yml

Lines changed: 46 additions & 4 deletions
@@ -1300,8 +1300,8 @@ jobs:
         run: |
           bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
 
-  ggml-ci-x64-nvidia-cuda:
-    runs-on: [self-hosted, Linux, X64, NVIDIA]
+  ggml-ci-x64-nvidia-v100-cuda:
+    runs-on: [self-hosted, Linux, X64, NVIDIA, V100]
 
     steps:
       - name: Clone
@@ -1314,8 +1314,8 @@ jobs:
           nvidia-smi
           GG_BUILD_CUDA=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
 
-  ggml-ci-x64-nvidia-vulkan:
-    runs-on: [self-hosted, Linux, X64, NVIDIA]
+  ggml-ci-x64-nvidia-v100-vulkan:
+    runs-on: [self-hosted, Linux, X64, NVIDIA, V100]
 
     steps:
       - name: Clone
@@ -1328,6 +1328,48 @@
           vulkaninfo
           GG_BUILD_VULKAN=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
 
+  ggml-ci-x64-nvidia-t4-cuda:
+    runs-on: [self-hosted, Linux, X64, NVIDIA, T4]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          nvidia-smi
+          GG_BUILD_CUDA=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-x64-nvidia-t4-vulkan:
+    runs-on: [self-hosted, Linux, X64, NVIDIA, T4]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          vulkaninfo
+          GG_BUILD_VULKAN=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-x64-nvidia-t4-vulkan-coopmat1:
+    runs-on: [self-hosted, Linux, X64, NVIDIA, T4]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          vulkaninfo
+          GG_BUILD_VULKAN=1 GGML_VK_DISABLE_COOPMAT2=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
   ggml-ci-x64-cpu-amx:
     runs-on: [self-hosted, Linux, X64, CPU, AMX]
 
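The three new ggml-ci-x64-nvidia-t4-* jobs mirror the existing V100 jobs: one CUDA run, one plain Vulkan run, and one Vulkan run with GGML_VK_DISABLE_COOPMAT2=1, which, going by the job name, is meant to exercise the coopmat1 path instead of coopmat2. Because runs-on lists several labels, each job is only picked up by a self-hosted runner that carries all of them, so these jobs route to the T4 machine while the renamed jobs stay on the V100 one. As a rough sketch (not part of this commit), a matching runner could be registered with the extra NVIDIA and T4 labels roughly like this; the URL, token, and runner name below are placeholders:

    ./config.sh \
      --url https://github.com/<org>/llama.cpp \
      --token <REGISTRATION_TOKEN> \
      --name <t4-runner-name> \
      --labels NVIDIA,T4

The self-hosted, Linux, and X64 labels are applied automatically by the Actions runner, so only the hardware-specific labels need to be passed explicitly.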