# CONTRIBUTING.md

Pull requests are the best way to propose changes to the notebooks repository:

- Run the [piplock-renewal.yaml](https://github.com/opendatahub-io/notebooks/blob/main/.github/workflows/piplock-renewal.yaml) workflow against your fork branch; check [here](https://github.com/opendatahub-io/notebooks/blob/main/README.md) for more info.
- Test the changes locally by manually running `make jupyter-${NOTEBOOK_NAME}-ubi8-python-3.8` from the terminal.

### Working with linters

- Run pre-commit before you commit, to lint the Python sources that have been placed under its management:

```
uv run pre-commit run --all-files
```

- If you like, you can install pre-commit to run automatically using `uv run pre-commit install`, as per its [install instructions](https://pre-commit.com/#3-install-the-git-hook-scripts).

### Some basic instructions on how to apply the new tests in [openshift-ci](https://github.com/openshift/release)

# docs/developer-guide.md

The following sections aim to provide a comprehensive guide for developers, enabling them to understand the project's architecture and contribute to its development.
## Getting Started

This project uses three branches for development: the **main** branch, which hosts the latest development, and **two additional branches for each release**.
These release branches follow a specific naming format: YYYYx, where "YYYY" represents the year and "x" is an increasing letter. They make it possible to keep working on minor updates and bug fixes for the supported versions (N & N-1) of each workbench.
## Architecture
The structure of the notebook's build chain is derived from the parent image. To better comprehend this concept, refer to the following graph.
## Workbench ImageStreams
ODH supports multiple out-of-the-box pre-built workbench images ([provided in this repository](https://github.com/opendatahub-io/notebooks)). For each of those workbench images, there is a dedicated ImageStream object definition. This ImageStream object references the actual image tag(s) and contains additional metadata that describe the workbench image.
### **Annotations**
Aside from the general ImageStream config values, there are additional annotations that can be provided in the workbench ImageStream definition. This additional data is leveraged further by the [odh-dashboard](https://github.com/opendatahub-io/odh-dashboard/).

### **ImageStream-specific annotations**

The following labels and annotations are specific to the particular workbench image. They are provided in their respective sections in the `metadata` section.
```yaml
metadata:
  labels:
    ...
  annotations:
    ...
```

### **Available labels**

- **`opendatahub.io/notebook-image:`** - a flag that determines whether the ImageStream references a workbench image that is meant to be shown in the UI
### **Available annotations**
- **`opendatahub.io/notebook-image-url:`** - a URL reference to the source of the particular workbench image
- **`opendatahub.io/notebook-image-name:`** - a desired display name string for the particular workbench image (used in the UI)
- **`opendatahub.io/notebook-image-desc:`** - a desired description string of the particular workbench image (used in the UI)
- **`opendatahub.io/notebook-image-order:`** - an index value for the particular workbench ImageStream (used by the UI to list available workbench images in a specific order)
- **`opendatahub.io/recommended-accelerators`** - a string that represents the list of recommended hardware accelerators for the particular workbench ImageStream (used in the UI)
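
Taken together, a `metadata` section using these labels and annotations might look like the following sketch; all names and values are illustrative, not taken from a real ImageStream:

```yaml
metadata:
  name: example-workbench          # hypothetical ImageStream name
  labels:
    opendatahub.io/notebook-image: "true"   # show this image in the UI
  annotations:
    opendatahub.io/notebook-image-url: "https://github.com/opendatahub-io/notebooks"
    opendatahub.io/notebook-image-name: "Example Workbench"
    opendatahub.io/notebook-image-desc: "A workbench image used for illustration only"
    opendatahub.io/notebook-image-order: "1"
    opendatahub.io/recommended-accelerators: '["nvidia.com/gpu"]'
```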
### **Tag-specific annotations**
One ImageStream can reference multiple image tags. The following annotations are specific to a particular workbench image tag and are provided in its `annotations:` section.
```yaml
spec:
  ...
  name: image-repository/tag
  name: tag-name
```

### **Available per-tag annotations**

- **`opendatahub.io/notebook-software:`** - a string that represents the technology stack included within the workbench image. Each technology in the list is described by its name and the version used (e.g. `'[{"name":"CUDA","version":"11.8"},{"name":"Python","version":"v3.9"}]'`)
- **`opendatahub.io/notebook-python-dependencies:`** - a string that represents the list of Python libraries included within the workbench image. Each library is described by its name and currently used version (e.g. `'[{"name":"Numpy","version":"1.24"},{"name":"Pandas","version":"1.5"}]'`)
- **`openshift.io/imported-from:`** - a reference to the image repository where the workbench image was obtained (e.g. `quay.io/repository/opendatahub/workbench-images`)
- **`opendatahub.io/workbench-image-recommended:`** - a flag that allows the ImageStream tag to be marked as Recommended (used by the UI to distinguish which tags are recommended for use, e.g., when the workbench image offers multiple tags to choose from)
- **`opendatahub.io/image-tag-outdated:`** - a flag that determines whether the image stream will be hidden from the list of available image versions in the workbench spawner dialog. Workbenches previously started with this image will continue to function.
- **`opendatahub.io/notebook-build-commit:`** - the commit hash of the notebook image build that was used to create the image. This is shown in the Dashboard web UI starting with RHOAI 2.22.
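
As a sketch, a single tag carrying these annotations might look as follows; the tag name, versions, and repository reference are illustrative only:

```yaml
spec:
  tags:
    - name: example-tag                       # hypothetical tag name
      annotations:
        opendatahub.io/notebook-software: '[{"name":"Python","version":"v3.11"}]'
        opendatahub.io/notebook-python-dependencies: '[{"name":"JupyterLab","version":"4.2"}]'
        openshift.io/imported-from: "quay.io/opendatahub/workbench-images"
        opendatahub.io/workbench-image-recommended: "true"
      from:
        kind: DockerImage
        name: quay.io/opendatahub/workbench-images:example-tag
```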

### **ImageStream definitions for the supported out-of-the-box images in ODH**

The ImageStream definitions of the out-of-the-box workbench images for ODH can be found [here](https://github.com/opendatahub-io/notebooks/tree/main/manifests).

### **Example ImageStream object definition**

An exemplary, non-functioning ImageStream object definition that uses all the aforementioned annotations is provided below.

The OpenShift CI is also configured to run the unit and integration tests:

```
tests:
- as: notebooks-e2e-tests
  steps:
    test:
    - as: ${NOTEBOOK_IMAGE_NAME}-e2e-tests
```
This GitHub action is configured to be triggered on a daily basis and synchronizes the selected projects from their upstream repositories to their downstream counterparts.
### **Digest Updater workflow on the manifests** [[Link]](https://github.com/opendatahub-io/odh-manifests/blob/master/.github/workflows/notebooks-digest-updater-upstream.yaml)
This GitHub action is designed to be triggered on a weekly basis, specifically every Friday at 12:00 AM UTC. Its primary purpose is to automate the process of updating the SHA digest of the notebooks. It achieves this by fetching the new SHA values from the quay.io registry and updating the [param.env](https://github.com/opendatahub-io/odh-manifests/blob/master/notebook-images/base/params.env) file, which is hosted on the odh-manifest repository. By automatically updating the SHA digest, this action ensures that the notebooks remain synchronized with the latest changes.
### **Digest Updater workflow on the live-builder** [[Link]](https://gitlab.cee.redhat.com/data-hub/rhods-live-builder/-/blob/main/.gitlab/notebook-sha-digest-updater.yml)
This GitHub action works with the same logic as the above and is designed to be ...
This GitHub action is designed to be triggered on a weekly basis. It generates a summary of security vulnerabilities reported by [Quay](https://quay.io/) for the latest images built for different versions of notebooks.

---

This Workbench image installs JupyterLab and the ODH-Elyra extension.
The main difference between the [upstream Elyra](https://github.com/elyra-ai/elyra) and the [ODH-Elyra fork](https://github.com/opendatahub-io/elyra) is that the fork implements Argo Pipelines support, which is required for executing pipelines in OpenDataHub/OpenShift AI.
Specifically, the fork already includes the changes from [elyra-ai/elyra #3273](https://github.com/elyra-ai/elyra/pull/3273), which is still pending upstream.
### Design
The workbench is based on a Source-to-Image (S2I) UBI9 Python 3.11 image.
This means that, besides having Python 3.11 installed, it also has the following:
* Python virtual environment at `/opt/app-root` is activated by default
* `HOME` directory is set to `/opt/app-root/src`
* port 8888 is `EXPOSE`D by default
These characteristics are required for OpenDataHub workbenches to function.
#### Integration with OpenDataHub Notebook Controller and Notebook Dashboard
##### OpenDataHub Dashboard
Dashboard automatically populates an environment variable named `NOTEBOOK_ARGS` when starting a container from this image.
This variable contains configuration that is necessary for integrating with the Dashboard, such as launching the Workbench and logging off.
Furthermore, when configuring a workbench, the default Persistent Volume Claim (PVC) is created and a volume is mounted at `/opt/app-root/src` in the workbench container.
This means that changing the user's `HOME` directory from the expected default is inadvisable.
It further means that whatever the original content of `/opt/app-root/src` in the image may be, it will be shadowed by the PVC.
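
A minimal sketch of how these pieces might appear in the resulting workbench pod spec; the real objects are generated by the platform, and all names here are hypothetical:

```yaml
containers:
  - name: workbench
    env:
      - name: NOTEBOOK_ARGS            # populated automatically by the Dashboard
        value: "..."
    volumeMounts:
      - name: workbench-storage
        mountPath: /opt/app-root/src   # the PVC shadows the image content here
volumes:
  - name: workbench-storage
    persistentVolumeClaim:
      claimName: example-workbench-pvc # hypothetical PVC name
```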
##### OpenDataHub Notebook Controller
During the Notebook Custom Resource (CR) creation, the mutating webhook in Notebook Controller is triggered.
This webhook is responsible for configuring the OAuth Proxy, certificate bundles, pipeline runtime, runtime images, and more.
It also creates a Service and an OpenShift Route to make the Workbench reachable from outside the cluster.
**OAuth Proxy** is configured to connect to port 8888 of the workbench container (discussed above) and listen for incoming connections on port 8443.
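
Sketched as an oauth-proxy sidecar configuration; the flag names follow OpenShift oauth-proxy conventions, and the exact set of flags the controller injects is not reproduced here:

```yaml
containers:
  - name: oauth-proxy
    args:
      - --https-address=:8443              # listen for incoming connections
      - --upstream=http://localhost:8888   # forward to the workbench container
```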
**Certificate bundles** are added as a file-mounted configmap at `/etc/pki/tls/custom-certs/ca-bundle.crt`.
This is a nonstandard location, so it is necessary to also add environment variables that instruct various software to reference this bundle during operation.
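
For example, environment variables along these lines could point common tooling at the custom bundle. Which variables the webhook actually sets is not specified here; the names below are well-known conventions of pip, Requests, and OpenSSL-based tools:

```yaml
env:
  - name: PIP_CERT                     # used by pip
    value: /etc/pki/tls/custom-certs/ca-bundle.crt
  - name: REQUESTS_CA_BUNDLE           # used by the Python Requests library
    value: /etc/pki/tls/custom-certs/ca-bundle.crt
  - name: SSL_CERT_FILE                # used by OpenSSL-based tools
    value: /etc/pki/tls/custom-certs/ca-bundle.crt
```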
**Pipeline runtime configuration** is obtained from a Data Science Pipeline Application (DSPA) CR.
The DSPA CR is first located in the same project where the workbench is being started, a secret with the connection data is created, and then this secret is mounted.
The secret is mounted under `/opt/app-root/runtimes/`.
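
A sketch of the resulting secret mount; the secret and volume names are hypothetical:

```yaml
containers:
  - name: workbench
    volumeMounts:
      - name: runtime-config
        mountPath: /opt/app-root/runtimes/
volumes:
  - name: runtime-config
    secret:
      secretName: example-dspa-connection  # hypothetical secret created from the DSPA
```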
Open the `Settings > Workbench images` page in OpenDataHub Dashboard.
Click on the `Import new image` button and add the image you have just pushed.
The `Image location` field should be set to `quay.io/your-username/jupyterlab-with-elyra:latest`, or wherever the image is pushed and available for the cluster to pull.
Values of other fields do not matter for functionality, but they let you keep better track of previously imported images.
There is a special ODH Dashboard feature that alerts you when you are using a workbench image that lists the `elyra` instead of `odh-elyra` package.
This code will have to be updated when `elyra` also gains support for Argo Pipelines, but for now it does the job.