
Commit 831c1e6

Merge pull request opendatahub-io#62 from gmfrasca/update-manifests-readme
chore: Update ODH Manifests README
2 parents a351755 + ab2b029 commit 831c1e6

File tree: 1 file changed (+26 −22 lines)

manifests/opendatahub/README.md

Lines changed: 26 additions & 22 deletions
@@ -1,6 +1,6 @@
-# ML Pipelines
+# Data Science Pipelines

-ML Pipelines is the Open Data Hub's pipeline solution for data scientists. It is built on top of the upstream [Kubeflow Pipelines](https://github.com/kubeflow/pipelines) and [kfp-tekton](https://github.com/kubeflow/kfp-tekton) projects. The Open Data Hub community has a [fork](https://github.com/opendatahub-io/data-science-pipelines) of this upstream under the Open Data Hub org.
+Data Science Pipelines is the Open Data Hub's pipeline solution for data scientists. It is built on top of the upstream [Kubeflow Pipelines](https://github.com/kubeflow/pipelines) and [kfp-tekton](https://github.com/kubeflow/kfp-tekton) projects. The Open Data Hub community has a [fork](https://github.com/opendatahub-io/data-science-pipelines) of this upstream under the Open Data Hub org.


## Installation
@@ -10,60 +10,64 @@ ML Pipelines is the Open Data Hub's pipeline solution for data scientists. It is
1. The cluster needs to be OpenShift 4.9 or higher
2. OpenShift Pipelines 1.7.2 or higher needs to be installed on the cluster
3. The Open Data Hub operator needs to be installed
-4. The default installation namespace for ML Pipelines is `odh-applications`. This namespace will need to be created. In case you wish to install in a custom location, create it and update the kfdef as documented below.
+4. The default installation namespace for Data Science Pipelines is `odh-applications`. This namespace will need to be created. In case you wish to install in a custom location, create it and update the kfdef as documented below.
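Prerequisites 2 and 3 are normally satisfied by installing the corresponding operators via OperatorHub/OLM. The sketch below only illustrates that route for OpenShift Pipelines: the package, channel, and catalog names are assumptions to verify against your cluster's OperatorHub, and the Open Data Hub operator is installed the same way under its own package name.

```yaml
# Hedged sketch: install the OpenShift Pipelines operator via an OLM
# Subscription. Package/channel/catalog names are assumptions -- confirm them
# in OperatorHub before applying.
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: openshift-pipelines-operator
  namespace: openshift-operators
spec:
  name: openshift-pipelines-operator-rh      # assumed package name
  channel: latest                             # pick a channel providing >= 1.7.2
  source: redhat-operators
  sourceNamespace: openshift-marketplace
  installPlanApproval: Automatic
```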

### Installation Steps

1. Ensure that the prerequisites are met.
-2. Apply the kfdef at [kfctl_openshift_ml-pipelines.yaml](https://github.com/opendatahub-io/odh-manifests/blob/master/kfdef/kfctl_openshift_ml-pipelines.yaml). You may need to update the `namespace` field under `metadata` in case you want to deploy in a namespace that isn't `odh-applications`.
-3. To find the url for ML pipelines, you can run the following command.
+2. Apply the kfdef at [kfctl_openshift_ds-pipelines.yaml](https://github.com/opendatahub-io/odh-manifests/blob/master/kfdef/kfctl_openshift_ds-pipelines.yaml). You may need to update the `namespace` field under `metadata` in case you want to deploy in a namespace that isn't `odh-applications`.
+3. To find the url for Data Science pipelines, you can run the following command.
```bash
-$ oc get route -n <kfdef_namespace> ml-pipeline-ui -o jsonpath='{.spec.host}'
+$ oc get route -n <kfdef_namespace> ds-pipeline-ui -o jsonpath='{.spec.host}'
```
The value of `<kfdef_namespace>` should match the namespace field of the kfdef that you applied.
4. Alternatively, you can access the route via the console. To do so:
    1. Go to `<kfdef_namespace>`
    2. Click on `Networking` in the sidebar on the left side.
    3. Click on `Routes`. It will take you to a new page in the console.
-    4. Click the url under the `Location` column for the row item matching `ml-pipeline-ui`
+    4. Click the url under the `Location` column for the row item matching `ds-pipeline-ui`

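To make prerequisite 4 and installation step 2 concrete, the excerpt below sketches what the kfdef can look like when it is pointed at a custom namespace. Only the `metadata.namespace` edit is the point being illustrated; the application name, manifest path, and repo URI are assumptions, not the verbatim contents of kfctl_openshift_ds-pipelines.yaml.

```yaml
# Illustrative kfdef excerpt -- field values other than metadata.namespace are
# assumptions rather than the actual file contents.
apiVersion: kfdef.apps.kubeflow.org/v1
kind: KfDef
metadata:
  name: data-science-pipelines
  namespace: my-dsp-namespace          # default is odh-applications; create the namespace first
spec:
  applications:
    - name: data-science-pipelines
      kustomizeConfig:
        repoRef:
          name: manifests
          path: manifests/opendatahub  # assumed path to these manifests
  repos:
    - name: manifests
      uri: https://github.com/opendatahub-io/odh-manifests/tarball/master
```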
## Directory Structure
### Base

-This directory contains artifacts for deploying all backend components of ML Pipelines. This deployment currently includes the kfp-tekton backend as well as a Minio deployment to act as an object store. The Minio deployment will be moved to an overlay at some point in the near future.
+This directory contains artifacts for deploying all backend components of Data Science Pipelines. This deployment currently includes the kfp-tekton backend as well as a Minio deployment to act as an object store. The Minio deployment will be moved to an overlay at some point in the near future.

### Overlays

-1. metadata-store-mysql: This overlay contains artifacts for deploying a MySQL database. MySQL is currently the only supported backend for ML Pipelines, so if you don't have an existing MySQL database deployed, this overlay needs to be applied.
-2. metadata-store-postgresql: This overlay contains artifacts for deploying a PostgreSQL database. ML Pipelines does not currently support PostgreSQL as a backend, so deploying this overlay will not actually modify ML Pipelines behaviour.
-3. ml-pipeline-ui: This overlay contains deployment artifacts for the ML Pipelines UI. Deploying ML Pipelines without this overlay will result in only the backend artifacts being created.
-4. object-store-minio: This overlay contains artifacts for deploying Minio as the Object Store to store Pipelines artifacts.
+1. metadata-store-mariadb: This overlay contains artifacts for deploying a MariaDB database. MySQL-based databases are currently the only supported backend for Data Science Pipelines, so if you don't have an existing MySQL database deployed, this overlay can be applied to satisfy the requirement.
+2. metadata-store-mysql: This overlay contains artifacts for deploying a MySQL database. MySQL-based databases are currently the only supported backend for Data Science Pipelines, so if you don't have an existing MySQL database deployed, this overlay can be applied to satisfy the requirement.
+3. metadata-store-postgresql: This overlay contains artifacts for deploying a PostgreSQL database. Data Science Pipelines does not currently support PostgreSQL as a backend, so deploying this overlay will not actually modify Data Science Pipelines behaviour.
+4. ds-pipeline-ui: This overlay contains deployment artifacts for the Data Science Pipelines UI. Deploying Data Science Pipelines without this overlay will result in only the backend artifacts being created.
+5. object-store-minio: This overlay contains artifacts for deploying Minio as the Object Store to store Pipelines artifacts.
+6. default-configs: This overlay creates ConfigMaps and Secrets with default values for a deployment with both a local MySQL database and Minio object store. *Note*: Using this overlay allows for a simple and quick setup, but it also marks the configs as managed objects when used with the ODH Operator, which will reconcile away any post-deployment changes; this behaviour cannot be overridden.
+7. integration-odhdashboard: Adds resources required to integrate the Data Science Pipelines application into the ODH Dashboard UI, such as documentation and application launcher tiles.
+8. component-mlmd: Adds the ML-Metadata component, which provides artifact lineage tracking in the UI.
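As a rough sketch (not part of this change), overlays such as the ones listed above are enabled by naming them under `kustomizeConfig.overlays` for the application entry in the kfdef; the repo name and manifest path shown are assumptions.

```yaml
# Sketch: selecting a subset of the overlays described above. The repoRef
# values are assumptions; the overlay names come from the list above.
applications:
  - name: data-science-pipelines
    kustomizeConfig:
      repoRef:
        name: manifests
        path: manifests/opendatahub    # assumed
      overlays:
        - metadata-store-mysql         # satisfies the MySQL-backend requirement
        - ds-pipeline-ui               # deploy the UI as well as the backend
        - object-store-minio           # local Minio object store
        - default-configs              # quick-start configs (see the note above)
```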

### Prometheus

-This directory contains the service monitor definition for ML Pipelines. It is always deployed by base, so this will eventually be moved into the base directory itself.
+This directory contains the service monitor definition for Data Science Pipelines. It is always deployed by base, so this will eventually be moved into the base directory itself.

## Parameters

-You can customize the ML Pipelines deployment by injecting custom parameters to change the default deployment. The following parameters can be used:
+You can customize the Data Science Pipelines deployment by injecting custom parameters to change the default deployment. The following parameters can be used:

-* **pipeline_install_configuration**: The ConfigMap name that contains the values to install the ML Pipelines environment. This parameter defaults to `pipeline-install-config` and you can find an example in the [repository](./base/configmaps/pipeline-install-config.yaml).
-* **ml_pipelines_configuration**: The ConfigMap name that contains the values to integrate ML Pipelines with the underlying components (Database and Object Store). This parameter defaults to `kfp-tekton-config` and you can find an example in the [repository](./base/configmaps/kfp-tekton-config.yaml).
-* **database_secret**: The secret that contains the credentials for the ML Pipelines Database. It defaults to `mysql-secret` if using the `metadata-store-mysql` overlay or `postgresql-secret` if using the `metadata-store-postgresql` overlay.
-* **ml_pipelines_ui_configuration**: The ConfigMap that contains the values to customize the UI. It defaults to `ml-pipeline-ui-configmap`.
+* **pipeline_install_configuration**: The ConfigMap name that contains the values to install the Data Science Pipelines environment. This parameter defaults to `pipeline-install-config` and you can find an example in the [repository](./base/configmaps/pipeline-install-config.yaml).
+* **ds_pipelines_configuration**: The ConfigMap name that contains the values to integrate Data Science Pipelines with the underlying components (Database and Object Store). This parameter defaults to `kfp-tekton-config` and you can find an example in the [repository](./base/configmaps/kfp-tekton-config.yaml).
+* **database_secret**: The secret that contains the credentials for the Data Science Pipelines Database. It defaults to `mysql-secret` if using the `metadata-store-mysql` overlay or `postgresql-secret` if using the `metadata-store-postgresql` overlay.
+* **ds_pipelines_ui_configuration**: The ConfigMap that contains the values to customize the UI. It defaults to `ds-pipeline-ui-configmap`.
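A hedged sketch of how these parameters might be overridden through `kustomizeConfig.parameters` in the kfdef; the ConfigMap and Secret names used as values are hypothetical.

```yaml
# Sketch: overriding the parameters listed above. The values are hypothetical
# examples, not defaults shipped with the manifests.
applications:
  - name: data-science-pipelines
    kustomizeConfig:
      parameters:
        - name: pipeline_install_configuration
          value: my-pipeline-install-config
        - name: ds_pipelines_configuration
          value: my-kfp-tekton-config
        - name: database_secret
          value: my-mysql-secret
        - name: ds_pipelines_ui_configuration
          value: my-ds-pipeline-ui-configmap
```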

## Configuration

* It is possible to configure what S3 storage is being used by Pipeline Runs. Detailed instructions on how to configure this will be added once Minio is moved to an overlay.

## Usage

-### These instructions will be updated once ML Pipelines has a tile available in odh-dashboard
+### These instructions will be updated once Data Science Pipelines has a tile available in odh-dashboard

-1. Go to the ml-pipelines-ui route.
+1. Go to the ds-pipelines-ui route.
2. Click on `Pipelines` on the left side.
3. There will be a `[Demo] flip-coin` Pipeline already available. Click on it.
4. Click on the blue `Create run` button towards the top of the screen.
@@ -73,6 +77,6 @@ You can customize the ML Pipelines deployment by injecting custom parameters to
8. Once the Pipeline is done running, you can see a graph of all the pods that were created as well as the paths that were followed.
9. For further verification, you can view all the pods that were created as part of the Pipeline Run in the `<kfdef_namespace>`. They will all show up as `Completed`.

-## ML Pipelines Architecture
+## Data Science Pipelines Architecture

-A complete architecture can be found at [ODH ML Pipelines Architecture and Design](https://docs.google.com/document/d/1o-JS1uZKLZsMY3D16kl5KBdyBb-aV-kyD_XycdJOYpM/edit#heading=h.3aocw3evrps0). This document will be moved to GitHub once the corresponding ML Ops SIG repos are created.
+A complete architecture can be found at [ODH Data Science Pipelines Architecture and Design](https://docs.google.com/document/d/1o-JS1uZKLZsMY3D16kl5KBdyBb-aV-kyD_XycdJOYpM/edit#heading=h.3aocw3evrps0). This document will be moved to GitHub once the corresponding ML Ops SIG repos are created.
