
Commit 3848517

ci : migrate ggml ci to self-hosted runners
1 parent f432d8d commit 3848517

File tree

2 files changed: +121 -11 lines changed


.github/workflows/build.yml

Lines changed: 120 additions & 0 deletions
@@ -1247,3 +1247,123 @@ jobs:
               -DGGML_CANN=on \
               -DSOC_TYPE=${{ matrix.device }}
           cmake --build build -j $(nproc)
+
+  ggml-ci-x64-cpu-low-perf:
+    runs-on: [self-hosted, Linux, X64, CPU, low-perf]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-arm64-cpu-low-perf:
+    runs-on: [self-hosted, Linux, ARM64, CPU, low-perf]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-x64-cpu-high-perf:
+    runs-on: [self-hosted, Linux, X64, CPU, high-perf]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-arm64-cpu-high-perf:
+    runs-on: [self-hosted, Linux, ARM64, CPU, high-perf]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-x64-nvidia-cuda:
+    runs-on: [self-hosted, Linux, X64, NVIDIA]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          nvidia-smi
+          GG_BUILD_CUDA=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-x64-nvidia-vulkan:
+    runs-on: [self-hosted, Linux, X64, NVIDIA]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          vulkaninfo
+          GG_BUILD_VULKAN=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-x64-cpu-amx:
+    runs-on: [self-hosted, Linux, X64, CPU, AMX]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+
+  ggml-ci-mac-metal:
+    runs-on: [self-hosted, macOS, ARM64]
+
+    steps:
+      - name: Clone
+        id: checkout
+        uses: actions/checkout@v4
+
+      - name: Test
+        id: ggml-ci
+        run: |
+          GG_BUILD_METAL=1 bash ./ci/run.sh ~/results/llama.cpp ~/mnt/llama.cpp
+
+  # TODO: install vulkan drivers
+  # ggml-ci-mac-vulkan:
+  #   runs-on: [self-hosted, macOS, ARM64]
+  #
+  #   steps:
+  #     - name: Clone
+  #       id: checkout
+  #       uses: actions/checkout@v4
+  #
+  #     - name: Test
+  #       id: ggml-ci
+  #       run: |
+  #         GG_BUILD_VULKAN=1 bash ./ci/run.sh ~/results/llama.cpp ~/mnt/llama.cpp
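
Note: each `runs-on` array above is matched against the labels attached to the self-hosted runners, so for example `ggml-ci-x64-cpu-low-perf` only lands on Linux x64 machines carrying the `CPU` and `low-perf` labels. Runner provisioning is not part of this commit; as a rough, illustrative sketch using the standard GitHub Actions runner package, registering such a machine could look roughly like this (the repository URL, token, and label set are placeholders):

    # Illustrative sketch only -- not part of this commit.
    # Assumes the official GitHub Actions self-hosted runner package has
    # already been downloaded and unpacked into ./actions-runner.
    cd actions-runner

    # Register the runner; the extra labels are what the workflow's
    # runs-on arrays (e.g. [self-hosted, Linux, X64, CPU, low-perf]) select on.
    # The self-hosted, Linux, and X64 labels are added automatically.
    ./config.sh --url https://github.com/OWNER/REPO \
                --token <registration-token> \
                --labels CPU,low-perf

    # Start listening for jobs
    ./run.sh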

ci/README.md

Lines changed: 1 addition & 11 deletions
@@ -1,16 +1,6 @@
 # CI
 
-In addition to [Github Actions](https://github.com/ggml-org/llama.cpp/actions) `llama.cpp` uses a custom CI framework:
-
-https://github.com/ggml-org/ci
-
-It monitors the `master` branch for new commits and runs the
-[ci/run.sh](https://github.com/ggml-org/llama.cpp/blob/master/ci/run.sh) script on dedicated cloud instances. This allows us
-to execute heavier workloads compared to just using Github Actions. Also with time, the cloud instances will be scaled
-to cover various hardware architectures, including GPU and Apple Silicon instances.
-
-Collaborators can optionally trigger the CI run by adding the `ggml-ci` keyword to their commit message.
-Only the branches of this repo are monitored for this keyword.
+This CI implements heavy workflows that run on self-hosted runners with various hardware configurations.
 
 It is a good practice, before publishing changes, to execute the full CI locally on your machine:
 
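
For reference, a local run of the same CI script might look like the sketch below; the `tmp` directories are arbitrary placeholders, and the `GG_BUILD_*` variables are the optional backend toggles used by the workflow jobs above:

    # minimal local CPU-only run; the two arguments are an output directory
    # for results and a scratch/mount directory (paths here are placeholders)
    mkdir -p tmp/results tmp/mnt
    bash ./ci/run.sh ./tmp/results ./tmp/mnt

    # the same run with a backend enabled, e.g. CUDA
    # (flag taken from the workflow jobs above)
    GG_BUILD_CUDA=1 bash ./ci/run.sh ./tmp/results ./tmp/mnt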
