Commit 5c2d542

RHOAI-26067 DSP reqs for FIPS (#803)
1 parent bdc482d commit 5c2d542

File tree

2 files changed: +6 −0 lines changed

modules/defining-a-pipeline.adoc

Lines changed: 5 additions & 0 deletions
@@ -6,6 +6,11 @@
 [role='_abstract']
 The Kubeflow Pipelines SDK enables you to define end-to-end machine learning and data pipelines. Use the latest Kubeflow Pipelines 2.0 SDK to build your data science pipeline in Python code. After you have built your pipeline, use the SDK to compile it into an Intermediate Representation (IR) YAML file. You can then import the YAML file into the {productname-short} dashboard, where you can configure its execution settings.

+[IMPORTANT]
+====
+If you are using {productname-short} on a cluster running in FIPS mode, any custom container images for data science pipelines must be based on UBI 9 or RHEL 9. This ensures compatibility with FIPS-approved pipeline components and prevents errors related to mismatched OpenSSL or GNU C Library (glibc) versions.
+====
+
 ifdef::upstream[]
 You can also use the Elyra JupyterLab extension to create and run data science pipelines within JupyterLab. For more information about the Elyra JupyterLab extension, see link:https://elyra.readthedocs.io/en/stable/getting_started/overview.html[Elyra Documentation].
 endif::[]

modules/installing-the-odh-operator-v2.adoc

Lines changed: 1 addition & 0 deletions
@@ -26,6 +26,7 @@ Argo Workflows resources that are created by {productname-short} have the follow
 app.opendatahub.io/data-science-pipelines-operator: 'true'
 ----
 ====
+* If you are using {productname-short} on a cluster running in FIPS mode, any custom container images for data science pipelines must be based on UBI 9 or RHEL 9. This ensures compatibility with FIPS-approved pipeline components and prevents errors related to mismatched OpenSSL or GNU C Library (glibc) versions.

 .Procedure
 . Log in to your OpenShift Container Platform cluster as a user with `cluster-admin` privileges. If you are performing a developer installation on link:http://try.openshift.com[try.openshift.com], you can log in as the `kubeadmin` user.
