
Commit 48240cb

Merge remote-tracking branch 'origin/yogesh-xxx-initial-commit' into yogesh-xxx-initial-commit

# Conflicts:
#	src/gaiaflow/managers/utils.py

2 parents: 92b3d02 + 2868bb8

14 files changed: +191 -53 lines

.github/workflows/publish.yml

Lines changed: 1 addition & 1 deletion

@@ -36,7 +36,7 @@ jobs:

       - name: Install and Test
         run:
-          pixi run test
+          pixi run test --cov=gaiaflow --cov-report=xml

       - name: Upload coverage reports to Codecov
         uses: codecov/codecov-action@v4
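
For context on the new flags: `--cov=gaiaflow` asks pytest-cov to measure coverage of the `gaiaflow` package, and `--cov-report=xml` writes a `coverage.xml` file that `codecov/codecov-action@v4` picks up. A minimal sketch of what the CI step effectively runs, assuming the pixi `test` task wraps pytest (an assumption; the task definition lives in `pyproject.toml`). The same change is repeated in `unittest.yml` below, so one sketch covers both.

```python
# Hedged sketch: invoke pytest with the same coverage flags programmatically.
# Assumes pytest and pytest-cov are installed; "gaiaflow" is the measured
# package and coverage.xml is the report that Codecov ingests.
import sys

import pytest

exit_code = pytest.main(["--cov=gaiaflow", "--cov-report=xml"])
sys.exit(exit_code)
```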

.github/workflows/unittest.yml

Lines changed: 1 addition & 1 deletion

@@ -34,7 +34,7 @@ jobs:

       - name: Install and Test
         run:
-          pixi run test
+          pixi run test --cov=gaiaflow --cov-report=xml

       - name: Upload coverage reports to Codecov
         uses: codecov/codecov-action@v4

docs/dev_guide.md

Lines changed: 6 additions & 3 deletions

@@ -99,10 +99,13 @@ data, models, and logging** (metrics, params, and artifacts to MLflow).
 - Learn how to use MLflow for experiment tracking.
 - Perform inference on MLflow-logged models.
 - If your data comes from S3 (hosted by BC), it’s best to **download a small
-  sample** and upload it to your local S3 storage (MinIO) for development and testing.
+  sample** and upload it to your local S3 storage (MinIO) for development and testing.
+  _NOTE: This is recommended because there are egress costs (costs that occur
+  when data is pulled out of the AWS ecosystem) every time you pull the data._
 - For local datasets, upload them to MinIO as well. Later, when your workflow
   moves to production, ensure the data is available in S3.
-  `Talk to Tejas about uploading data to S3`
+  `aws s3 sync /path/to/source /path/to/target --delete`
+  Use `--delete` if you want the target to mirror the source exactly.
 ---

 #### 2. Refactor to Production Code

@@ -168,7 +171,7 @@ it inside a Docker environment:
   gaiaflow dev dockerize -p .
   ```

-- This builds an image for your package `Talk to Tejas`.
+- This builds an image for your package.
 - Use the generated image name in the `image` parameter of `create_task`.
 - Pass environment variables via the `env_vars` parameter.
 - Set `mode = "dev_docker"` and trigger your workflow again.
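
Alongside the `aws s3 sync` command added above, a Python sketch for pushing a local sample to MinIO using fsspec (already a gaiaflow dependency) with the s3fs backend. The endpoint URL, credentials, bucket, and paths are illustrative assumptions, not values from the repo:

```python
# Hypothetical sketch: copy a local sample dataset into MinIO via fsspec/s3fs.
# The endpoint and credentials below are common MinIO defaults (assumptions);
# replace them with the values from your local setup.
import fsspec

fs = fsspec.filesystem(
    "s3",
    key="minioadmin",      # assumed MinIO access key
    secret="minioadmin",   # assumed MinIO secret key
    client_kwargs={"endpoint_url": "http://localhost:9000"},
)
fs.put("data/sample/", "my-bucket/sample/", recursive=True)  # local -> MinIO
```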

docs/index.md

Lines changed: 2 additions & 1 deletion

@@ -205,7 +205,8 @@ gaiaflow prod-local create-secret -p . --name <your-secret-name> --data SOME_KEY
 - Runs tasks as **Kubernetes pods in Minikube**.
 - Requires `image`
 - You can pass in `env_vars` and/or `secrets` if any.
-- For creating secrets in your minikube cluster, (coming soon, `Talk to Tejas`)
+- For creating secrets in your minikube cluster _(coming soon)_

 Quick rule of thumb:
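
Since the gaiaflow command for minikube secrets is still marked coming soon, here is a hedged stopgap using the official `kubernetes` Python client; the secret name, namespace, and key/value below are placeholders:

```python
# Hypothetical stopgap: create a namespaced Secret in your minikube cluster
# with the official kubernetes Python client until the gaiaflow command lands.
from kubernetes import client, config

config.load_kube_config()  # uses your current (minikube) kube context
v1 = client.CoreV1Api()
v1.create_namespaced_secret(
    namespace="default",
    body=client.V1Secret(
        metadata=client.V1ObjectMeta(name="my-secret"),   # placeholder name
        string_data={"SOME_KEY": "some-value"},           # placeholder data
    ),
)
```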

docs/prod_guide.md

Lines changed: 11 additions & 11 deletions

@@ -142,21 +142,21 @@ You’ll need to configure the following secrets in your repository:

 ### 3. Task Secrets
 Any **secrets required by your tasks** must also be made available to the cluster running Airflow.
-For this step, `Talk to Tejas`
-
+_(coming soon)_
 ---

-### 4. GitHub Release Workflow
-When you create a **GitHub release**, the following steps are triggered automatically:
+### 4. GitHub Release
+
+To enable Airflow running in production to access your DAGs and your
+task packaged as an image, you must make a release following semantic versioning.
+
+You can create the release via the GitHub UI.
+
+For the release to work, you have to request the AWS access
+credentials from the infra team.

-- A Docker image is built and pushed to **AWS ECR**
-- Your DAGs are uploaded to **S3**
-- A **dispatch event** is sent to the CDR
-- CDR pulls DAGs from S3
-- Airflow reads DAGs from the CDR
+Add these as **GitHub repository secrets** so your CI can use them.

-For this to work, your team should provide you with an **AWS Role** that allows CI to push to ECR and S3.
-Add this role as a **GitHub repository secret** so your CI can use it.

 ---
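
The rewritten section points to the GitHub UI for releases; as a non-authoritative alternative, a release with a semantic-version tag can also be created through GitHub's REST API. `OWNER`, `REPO`, the tag, and the token are placeholders:

```python
# Hypothetical sketch: create a GitHub release (and its tag) via the REST API.
# Requires a token with repo scope; OWNER, REPO, and v1.2.3 are placeholders.
import requests

resp = requests.post(
    "https://api.github.com/repos/OWNER/REPO/releases",
    headers={
        "Authorization": "Bearer <token>",
        "Accept": "application/vnd.github+json",
    },
    json={"tag_name": "v1.2.3", "name": "v1.2.3"},  # semantic version tag
)
resp.raise_for_status()
```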

pixi.lock

Lines changed: 77 additions & 28 deletions
Some generated files are not rendered by default.

pyproject.toml

Lines changed: 1 addition & 0 deletions

@@ -48,6 +48,7 @@ mkdocs = ">=1.6.1,<2"
 mkdocs-material = ">=9.6.17,<10"
 mkdocstrings = ">=0.30.0,<0.31"
 mkdocstrings-python = ">=1.17.0,<2"
+pytest-cov = ">=6.2.1,<7"

 [project.scripts]
 gaiaflow = "gaiaflow.cli.cli:app"

src/gaiaflow/cli/commands/minikube.py

Lines changed: 5 additions & 0 deletions

@@ -4,6 +4,8 @@
 import fsspec
 import typer

+from gaiaflow.constants import DEFAULT_IMAGE_NAME
+
 app = typer.Typer()
 fs = fsspec.filesystem("file")

@@ -103,6 +105,8 @@ def restart(
     "minikube cluster.")
 def dockerize(
     project_path: Path = typer.Option(..., "--path", "-p", help="Path to your project"),
+    image_name: str = typer.Option(DEFAULT_IMAGE_NAME, "--image-name", "-i",
+                                   help=("Name of your image.")),
 ):
     imports = load_imports()
     gaiaflow_path, user_project_path = imports.create_gaiaflow_context_path(

@@ -117,6 +121,7 @@ def dockerize(
         user_project_path=user_project_path,
         action=imports.ExtendedAction.DOCKERIZE,
         local=False,
+        image_name=image_name
     )

src/gaiaflow/cli/commands/mlops.py

Lines changed: 4 additions & 1 deletion

@@ -5,7 +5,7 @@
 import fsspec
 import typer

-from gaiaflow.constants import Service
+from gaiaflow.constants import Service, DEFAULT_IMAGE_NAME

 app = typer.Typer()
 fs = fsspec.filesystem("file")

@@ -240,6 +240,8 @@ def cleanup(
 @app.command(help="Containerize your package into a docker image locally.")
 def dockerize(
     project_path: Path = typer.Option(..., "--path", "-p", help="Path to your project"),
+    image_name: str = typer.Option(DEFAULT_IMAGE_NAME, "--image-name", "-i",
+                                   help=("Name of your image.")),
 ):
     imports = load_imports()
     gaiaflow_path, user_project_path = imports.create_gaiaflow_context_path(

@@ -260,6 +262,7 @@ def dockerize(
         user_project_path=user_project_path,
         action=imports.ExtendedAction.DOCKERIZE,
         local=True,
+        image_name=image_name
     )


src/gaiaflow/constants.py

Lines changed: 2 additions & 0 deletions

@@ -82,3 +82,5 @@ class Service(str, Enum):
         "limit_gpu": "1",
     },
 }
+
+DEFAULT_IMAGE_NAME = "user-image:v1"
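
To illustrate the pattern the three Python diffs above introduce, a self-contained Typer toy showing how a `--image-name`/`-i` option with a constant default flows into a command. This is a standalone sketch, not gaiaflow's actual module:

```python
# Standalone sketch of the new option pattern (not gaiaflow's real code).
import typer

DEFAULT_IMAGE_NAME = "user-image:v1"  # mirrors src/gaiaflow/constants.py
app = typer.Typer()

@app.command()
def dockerize(
    image_name: str = typer.Option(
        DEFAULT_IMAGE_NAME, "--image-name", "-i", help="Name of your image."
    ),
):
    # The real command forwards image_name to the DOCKERIZE action;
    # here we only echo it.
    typer.echo(f"Would build image: {image_name}")

if __name__ == "__main__":
    app()
```

Invoked as `python sketch.py --image-name my-image:v2` (or `-i my-image:v2`); omitting the flag falls back to the default.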
