modules/openshift-ai-connector-for-rhdh/proc-setting-up-openshift-ai-connector-for-rhdh-with-rhoai.adoc
8 additions & 5 deletions
@@ -7,7 +7,7 @@ The installation of the {openshift-ai-connector-name} requires manual updates to
.{rhoai-short} Prerequisites
-* To have Model Cards from the Model Catalog imported as Tech Docs, you must use {rhoai-short} version 2.25 or later, and the Model Catalog dashboard and a Model Registry need to be enabled (they are both off by default).
+* To have Model Cards from the Model Catalog imported as TechDocs, you must use {rhoai-short} version 2.25 or later, and the *Model Catalog* dashboard and a Model Registry need to be enabled (they are both disabled by default).
* If you employed Model Catalog at earlier versions of {rhoai-short}, Tech Doc propagation does not work for any models you registered into the Model Registry while at those earlier versions; only models registered into Model Registry from a 2.25 (or later) Model Catalog have their Model Cards transferred to {product-very-short} as TechDocs.
* For the rest of the features, version 2.20 or later suffices. Enabling Model Registry and its associated dashboard allows for a user experience that more directly allows for customizing AI Model metadata.
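Editor's note on the prerequisite above: enabling a Model Registry in {rhoai-short} is typically done through the `DataScienceCluster` custom resource. The sketch below is a hedged illustration only; field values depend on your installed {rhoai-short} version, and the *Model Catalog* dashboard toggle is configured separately through the dashboard settings, so it is not shown here.

[source,yaml]
----
# Sketch only: enable the Model Registry component in the DataScienceCluster CR.
# Verify the field names against your installed RHOAI version before applying.
# The Model Catalog dashboard toggle is managed separately and is not shown here.
apiVersion: datasciencecluster.opendatahub.io/v1
kind: DataScienceCluster
metadata:
  name: default-dsc   # placeholder; use the name of your existing DataScienceCluster
spec:
  components:
    modelregistry:
      managementState: Managed
----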
-. Add Connector sidecar containers to the {product-very-short} Pod.
-The system relies on three sidecar containers (Model Catalog Bridge) running alongside the `backstage-backend` container. These sidecar containers must be added to your {product-very-short} deployment specification, referencing the `rhdh-rhoai-bridge-token` Secret:
-** `location`: Provides the REST API for RHDH plugins to fetch model metadata.
+. Add `Connector` sidecar containers to the {product-very-short} Pod.
+** If {product-very-short} was installed using the Operator, modify your {product-very-short} custom resource (CR) instance.
+** If {product-very-short} was installed using the Helm charts, modify the *Deployment* specification.
+
+. The system relies on three sidecar containers (Model Catalog Bridge) running alongside the `backstage-backend` container. Add these sidecar containers to your configuration referencing the `rhdh-rhoai-bridge-token` Secret:
+** `location`: Provides the REST API for {product-very-short} plugins to fetch model metadata.
 ** `storage-rest`: Maintains a cache of AI Model metadata in a ConfigMap called `bac-import-model`.
-** `rhoai-normalizer`: Acts as a Kubernetes controller and RHOAI client, normalizing RHOAI metadata for the connector. The following code block is an example:
+** `rhoai-normalizer`: Acts as a Kubernetes controller and {rhoai-short} client, normalizing {rhoai-short} metadata for the connector. The following code block is an example:
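Editor's note: the example code block referenced on the last added line is not included in this diff hunk. Purely as a hedged sketch of what the three Model Catalog Bridge sidecars might look like alongside the `backstage-backend` container, the addition could take roughly this shape; the image references are placeholders, and any ports, environment variables, or volume mounts from the real example are omitted.

[source,yaml]
----
# Hypothetical sketch only; image references below are placeholders, not actual values.
spec:
  containers:
  - name: backstage-backend
    # ... existing backstage-backend container definition ...
  - name: location
    image: <model-catalog-bridge-location-image>
    envFrom:
    - secretRef:
        name: rhdh-rhoai-bridge-token
  - name: storage-rest
    image: <model-catalog-bridge-storage-rest-image>
    envFrom:
    - secretRef:
        name: rhdh-rhoai-bridge-token
  - name: rhoai-normalizer
    image: <model-catalog-bridge-rhoai-normalizer-image>
    envFrom:
    - secretRef:
        name: rhdh-rhoai-bridge-token
----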
modules/openshift-ai-connector-for-rhdh/proc-troubleshooting-connector-functionality.adoc
5 additions & 0 deletions
@@ -42,6 +42,11 @@ To enable debug logging, set the `LOG_LEVEL` environment variable to `debug` on
The Model Catalog Bridge sidecars manage the data fetching and storage:
+[IMPORTANT]
+====
+{openshift-ai-connector-name} collects feedback from users who engage with the feedback feature. If a user submits feedback, the feedback score (thumbs up or down), text feedback (if entered), the user query, and the LLM provider response are stored locally in the file system of the Pod. {company-name} does not have access to the collected feedback data.
+====
+
. Check Cached Data (ConfigMap): The processed AI Model metadata is stored in a `ConfigMap`.
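Editor's note: as a hedged illustration of the troubleshooting steps in this hunk, inspecting the cached metadata and raising the log level could look roughly like the commands below. The namespace, deployment name, and target container are assumptions, not values from the module; only the `bac-import-model` ConfigMap name and the `LOG_LEVEL` variable come from the text above.

[source,terminal]
----
# Inspect the cached AI Model metadata (adjust the namespace to your RHDH install).
oc get configmap bac-import-model -n <rhdh-namespace> -o yaml

# Enable debug logging on one of the Model Catalog Bridge sidecars
# (deployment and container names here are placeholders).
oc set env deployment/<rhdh-deployment> -c rhoai-normalizer LOG_LEVEL=debug -n <rhdh-namespace>
----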
modules/openshift-ai-connector-for-rhdh/ref-enrich-ai-model-metadata.adoc
1 addition & 1 deletion
@@ -18,7 +18,7 @@ While {rhoai-short} provides essential data, an AI platform engineer can enrich
|`TechDocs`
|TechDocs
-|URL pointing to a Git repository that follows {product-very-short} TechDocs conventions for the Model Card.
+|URL pointing to a Git repository that follows {product-very-short} TechDocs conventions for the Model Card. Use this setting only if the *Model Card to TechDocs* mapping is not active.
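Editor's note on the `TechDocs` property above: a repository that follows {product-very-short} TechDocs conventions is, at minimum, an `mkdocs`-based documentation tree with the `techdocs-core` plugin, with the Model Card content under `docs/`. The following is a minimal sketch; the site name and navigation entries are placeholders.

[source,yaml]
----
# mkdocs.yml at the repository root (sketch; values are placeholders).
# The Model Card content itself would live in docs/index.md.
site_name: example-model-card
plugins:
  - techdocs-core
nav:
  - Model Card: index.md
----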