
Commit dd8d9bd

Merge pull request Jacoo-Zhao#12 from Jacoo-Zhao/feature-cicd
Feature: CICD in task level
2 parents 0a91f68 + f424134 commit dd8d9bd

6 files changed (+45 additions, -30 deletions)

.github/workflows/pipeline.yaml

Lines changed: 15 additions & 8 deletions
@@ -1,13 +1,18 @@
-name: Run ClearML Pipeline
+name: Run ClearML Task

 on:
-  push:
-    branches: [ main ]
+  pull_request:
+    branches: [main]

 jobs:
   run_pipeline:
     runs-on: ubuntu-latest

+    env:  # Global environment variables for all steps
+      CLEARML_API_ACCESS_KEY: ${{ secrets.CLEARML_API_ACCESS_KEY }}
+      CLEARML_API_SECRET_KEY: ${{ secrets.CLEARML_API_SECRET_KEY }}
+      CLEARML_API_HOST: ${{ secrets.CLEARML_API_HOST }}
+
     steps:
       - uses: actions/checkout@v3

@@ -21,10 +26,12 @@ jobs:
           python -m pip install --upgrade pip
           pip install -r requirements.txt

+      - name: Debug ENV
+        run: |
+          echo "CLEARML_API_HOST=$CLEARML_API_HOST"
+          if [ -z "$CLEARML_API_HOST" ]; then echo "❌ HOST is empty!"; exit 1; fi
+          curl -I $CLEARML_API_HOST
+
       - name: Run pipeline
-        env:
-          CLEARML_API_ACCESS_KEY: ${{ secrets.CLEARML_API_ACCESS_KEY }}
-          CLEARML_API_SECRET_KEY: ${{ secrets.CLEARML_API_SECRET_KEY }}
-          CLEARML_API_HOST: ${{ secrets.CLEARML_API_HOST }}
         run: |
-          python main.py
+          python s1_dataset_artifact.py
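The new Debug ENV step fails the job fast when the `CLEARML_API_HOST` secret is missing, before any ClearML call can hang or error obscurely. A minimal standalone sketch of that check (the `require_env` helper name is illustrative, not part of the workflow):

```shell
#!/bin/sh
# Fail fast when a required environment variable is empty or unset,
# mirroring the workflow's Debug ENV step.
require_env() {
  name="$1"
  eval "value=\${$name}"          # indirect lookup of the named variable
  if [ -z "$value" ]; then
    echo "$name is empty!" >&2    # report to stderr, like the workflow's echo
    return 1                      # nonzero exit fails the CI step
  fi
  echo "$name=$value"
}
```

In the workflow this runs before `Run pipeline`, so a misconfigured repository secret surfaces as a clear one-line failure instead of a ClearML connection timeout.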

.gitignore

Lines changed: 2 additions & 1 deletion
@@ -1,2 +1,3 @@
 .DS_Store
-.idea
+.idea
+figs/

s1_dataset_artifact.py

Lines changed: 27 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,27 @@
1+
"s1_dataset_artifact.py"
2+
3+
from clearml import Task, StorageManager
4+
import os
5+
6+
# create an dataset experiment
7+
task = Task.init(project_name="AI_Studio_Demo", task_name="Pipeline step 1 dataset artifact")
8+
9+
# only create the task, we will actually execute it later
10+
# task.execute_remotely()
11+
12+
# Check if the local dataset file exists
13+
local_iris_csv_path = 'work_dataset/Iris.csv'
14+
if not os.path.exists(local_iris_csv_path):
15+
print(f"Local file '{local_iris_csv_path}' not found. Downloading...")
16+
local_iris_pkl = StorageManager.get_local_copy(
17+
remote_url='https://github.com/allegroai/events/raw/master/odsc20-east/generic/iris_dataset.pkl'
18+
)
19+
else:
20+
print(f"Using existing local file: '{local_iris_csv_path}'")
21+
22+
# Add and upload the dataset file
23+
task.upload_artifact('dataset', artifact_object=local_iris_csv_path)
24+
print('uploading artifacts in the background')
25+
26+
# we are done
27+
print('Done')
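The script's gating logic (use a cached local copy, otherwise fetch a remote one) can be sketched as a plain helper; `resolve_dataset` and the stand-in `fetch` callable below are illustrative, not part of the ClearML API:

```python
import os
import tempfile

def resolve_dataset(local_path, fetch):
    """Return local_path if it exists, otherwise call fetch() for a copy.

    fetch stands in for StorageManager.get_local_copy in the real script.
    """
    if os.path.exists(local_path):
        print(f"Using existing local file: '{local_path}'")
        return local_path
    print(f"Local file '{local_path}' not found. Downloading...")
    return fetch()

# usage: a temporary directory stands in for work_dataset/
with tempfile.TemporaryDirectory() as root:
    missing = os.path.join(root, "Iris.csv")
    got = resolve_dataset(missing, fetch=lambda: "/tmp/iris_dataset.pkl")
```

Note that the committed script uploads `local_iris_csv_path` even on the branch where the file was missing and only `local_iris_pkl` was downloaded; with a helper like this, the caller would upload the returned path instead, so both branches upload a file that actually exists.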
Lines changed: 1 addition & 1 deletion
@@ -112,6 +112,6 @@ def forward(self, x):
 disp.plot(cmap=plt.cm.Blues)

 plt.title('Confusion Matrix')
-plt.savefig('assets/confusion_matrix.png')
+plt.savefig('figs/confusion_matrix.png')

 print('Confusion matrix plotted and saved as confusion_matrix.png')
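The savefig target moves from `assets/` to the new `figs/` directory (which the `.gitignore` change excludes from version control). Matplotlib's `savefig` raises `FileNotFoundError` when the target directory does not exist, so creating it first is the usual guard; this stdlib-only sketch shows the pattern without requiring matplotlib (the `ensure_parent` helper name is illustrative):

```python
import os

def ensure_parent(path):
    """Create the directory for an output file if it does not exist yet."""
    parent = os.path.dirname(path)
    if parent:
        os.makedirs(parent, exist_ok=True)  # no-op when the dir already exists
    return path

# e.g. plt.savefig(ensure_parent('figs/confusion_matrix.png'))
```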

step1_dataset_artifact.py

Lines changed: 0 additions & 20 deletions
This file was deleted.

0 commit comments
