Commit 6846773

Merge pull request opendatahub-io#113 from HumairAK/add_kfdef
Add a kfdef that deploys latest dspo.
2 parents: 86f66b2 + 27ef776

2 files changed: +64 −0


kfdef/README.md

Lines changed: 33 additions & 0 deletions
# Deploy latest DSPO via ODH

To deploy the latest DSPO, using the changes in this repository, via Open Data Hub (ODH), follow these steps.
## Prerequisites

1. An OpenShift cluster, version 4.10 or higher.
2. You must be logged into the cluster as a [cluster admin] via the [oc client].
3. The cluster must have OpenShift Pipelines 1.9 or higher installed. Instructions are [here][OCP Pipelines Operator].
4. The Open Data Hub operator must be installed. You can install it via [OperatorHub][installodh].
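The version requirement in step 1 can also be checked mechanically. A minimal sketch, assuming the cluster version string has already been obtained (the hardcoded `4.12` below is only an example value; in practice it would come from `oc version`), using `sort -V` for semantic version comparison:

```shell
# Minimum OpenShift version required by this guide.
min_version="4.10"

# Example value; in practice, parse this from `oc version` output.
cluster_version="4.12"

# `sort -V` orders version strings semantically; if the minimum sorts
# first, the cluster version is new enough.
if [ "$(printf '%s\n' "$min_version" "$cluster_version" | sort -V | head -n1)" = "$min_version" ]; then
  echo "cluster version $cluster_version meets the $min_version minimum"
else
  echo "cluster version $cluster_version is too old" >&2
fi
```

Note that a plain string comparison would get this wrong (`"4.9" > "4.10"` lexically), which is why `sort -V` is used.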
## Deploy Kfdef

Clone this repository, then run the following commands:

```bash
# Create the namespace if it does not already exist
oc new-project odh-applications

# Apply the kfdef
oc apply -f https://raw.githubusercontent.com/opendatahub-io/data-science-pipelines-operator/main/kfdef/kfdef.yaml -n odh-applications
```
Once done, follow the steps outlined [here][dspa] to get started with deploying your own
`DataSciencePipelinesApplication` with the latest changes found within this repository.

[kfdef]: https://github.com/opendatahub-io/data-science-pipelines-operator/blob/main/kfdef/kfdef.yaml
[cluster admin]: https://docs.openshift.com/container-platform/4.12/authentication/using-rbac.html#creating-cluster-admin_using-rbac
[oc client]: https://mirror.openshift.com/pub/openshift-v4/x86_64/clients/ocp/latest/openshift-client-linux.tar.gz
[OCP Pipelines Operator]: https://docs.openshift.com/container-platform/4.12/cicd/pipelines/installing-pipelines.html#op-installing-pipelines-operator-in-web-console_installing-pipelines
[installodh]: https://opendatahub.io/docs/getting-started/quick-installation.html
[dspa]: https://github.com/opendatahub-io/data-science-pipelines-operator#deploy-dsp-instance
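For orientation, a `DataSciencePipelinesApplication` instance is itself a small YAML custom resource applied to the same namespace. The fragment below is only an illustrative sketch — the `apiVersion` and field names are assumptions based on the DSPO repository at the time of writing, so defer to the [dspa] documentation for the authoritative schema:

```yaml
# Illustrative only: apiVersion and spec fields are assumptions;
# consult the DSPO repository for the current CRD schema.
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: sample
  namespace: odh-applications
spec:
  apiServer:
    deploy: true
  database:
    mariaDB:
      deploy: true
```

Such a resource would be applied with `oc apply -f <file> -n odh-applications`, after which the operator reconciles the pipeline stack for that instance.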

kfdef/kfdef.yaml

Lines changed: 31 additions & 0 deletions
```yaml
apiVersion: kfdef.apps.kubeflow.org/v1
kind: KfDef
metadata:
  name: data-science-pipelines-operator
spec:
  applications:
  - kustomizeConfig:
      parameters:
      - name: IMAGES_APISERVER
        value: quay.io/opendatahub/ds-pipelines-api-server:main
      - name: IMAGES_ARTIFACT
        value: quay.io/opendatahub/ds-pipelines-artifact-manager:main
      - name: IMAGES_PERSISTENTAGENT
        value: quay.io/opendatahub/ds-pipelines-persistenceagent:main
      - name: IMAGES_SCHEDULEDWORKFLOW
        value: quay.io/opendatahub/ds-pipelines-scheduledworkflow:main
      - name: IMAGES_CACHE
        value: registry.access.redhat.com/ubi8/ubi-minimal:8.7
      - name: IMAGES_DSPO
        value: quay.io/opendatahub/data-science-pipelines-operator:main
      - name: IMAGES_MOVERESULTSIMAGE
        value: registry.access.redhat.com/ubi8/ubi-micro:8.7
      - name: IMAGES_MARIADB
        value: registry.redhat.io/rhel8/mariadb-103:1-188
      repoRef:
        name: manifests
        path: config
    name: data-science-pipelines-operator
  repos:
  - name: manifests
    uri: "https://github.com/opendatahub-io/data-science-pipelines-operator/tarball/main"
```
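Each `IMAGES_*` parameter above overrides the corresponding image reference in the kustomize manifests, which is how this kfdef keeps every component tracking its `main` build. To pin a component to a fixed build instead, one could edit the relevant entry before applying — for example (the tag below is hypothetical, not a published release):

```yaml
# Hypothetical pin: replace the floating `main` tag with a fixed one.
- name: IMAGES_DSPO
  value: quay.io/opendatahub/data-science-pipelines-operator:v1.0.0
```

Because `main` is a floating tag, re-applying the unmodified kfdef later may pull newer images; pinning trades freshness for reproducibility.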
