
Commit 18d2009

Incorporated Judy's comments and Ben's
1 parent 4d65c06 commit 18d2009

5 files changed (+15, −18 lines changed)


modules/openshift-ai-connector-for-rhdh/proc-populating-the-api-definition-tab.adoc

Lines changed: 3 additions & 3 deletions
@@ -1,9 +1,9 @@
 :_mod-docs-content-type: PROCEDURE
 
 [id="proc-populating-the-api-definition-tab_{context}"]
-= Populating the API Definition tab
+= Populating the API Definition tab in {product-very-short} API entities
 
-The AI platform engineer must follow these steps to provide this valuable information because {rhoai-short} does not expose the OpenAPI specification by default.
+Since {rhoai-short} does not expose the OpenAPI specification by default, the AI platform engineer can take the following steps to provide this valuable information:
 
 .Procedure
 
@@ -24,4 +24,4 @@ We recommend using *Model Version* instead of *Registered Model* to maintain sta
 
 .. In the **Properties** section, set a key/value pair where the key is `API Spec` and the value is the entire JSON content from the `open-api.json` file.
 
-. Propagation: The {openshift-ai-connector-name} periodically polls the {rhoai-short} Model Registry, propagates this JSON, and renders the interactive API documentation in the {product-very-short} API Entity *Definition* tab.
+. Propagation: The {openshift-ai-connector-name} periodically polls the {rhoai-short} Model Registry, propagates this JSON, and renders the interactive API documentation in the *Definition* tab of the {product-very-short} API entity.
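The procedure above pastes the full contents of `open-api.json` into a single `API Spec` property value. A minimal sketch of preparing that value with `jq` (the sample document below is a placeholder, not a real model spec):

```shell
# Sketch only; the sample OpenAPI document is a placeholder for illustration.
cat > open-api.json <<'EOF'
{
  "openapi": "3.0.0",
  "info": { "title": "model-inference-api", "version": "1.0.0" },
  "paths": {}
}
EOF

# Compact the spec to a single line, ready to paste as the `API Spec` value
# in the Properties section of the Model Registry UI.
API_SPEC_VALUE=$(jq -c . open-api.json)
echo "$API_SPEC_VALUE"
```

Compacting with `jq -c` also fails fast if the file is not valid JSON, which catches copy/paste errors before the value reaches the registry.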

modules/openshift-ai-connector-for-rhdh/proc-troubleshooting-connector-functionality.adoc

Lines changed: 3 additions & 8 deletions
@@ -3,7 +3,7 @@
 [id="proc-troubleshooting-connector-functionality_{context}"]
 = Troubleshooting Connector functionality
 
-The connector system consists of the two dynamic plugins and the three model catalog bridge sidecar containers. Generally speaking, the logs collected must be provided to {company-name} Support for analysis.
+The connector system consists of the two dynamic plugins and the three {openshift-ai-connector-name-short} sidecar containers. Generally speaking, the logs collected must be provided to {company-name} Support for analysis.
 
 The actual contents of the diagnostic data are not part of any product guaranteed specification, and can change at any time.
 
@@ -40,14 +40,9 @@ View the {openshift-ai-connector-name} plugins in the `backstage-backend` contain
 
 To enable debug logging, set the `LOG_LEVEL` environment variable to `debug` on the `backstage-backend` container. For more information, see {monitoring-and-logging-book-link}[{monitoring-and-logging-book-title}].
 
-== Inspecting the Model Catalog Bridge
+== Inspecting the {openshift-ai-connector-name-short}
 
-The Model Catalog Bridge sidecars manage the data fetching and storage:
-
-[IMPORTANT]
-====
-{openshift-ai-connector-name} collects feedback from users who engage with the feedback feature. If a user submits feedback, the feedback score (thumbs up or down), text feedback (if entered), the user query, and the LLM provider response are stored locally in the file system of the Pod. {company-name} does not have access to the collected feedback data.
-====
+The {openshift-ai-connector-name-short} sidecars manage the data fetching and storage:
 
 . Check Cached Data (ConfigMap): The processed AI Model metadata is stored in a `ConfigMap`.
 +
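Assuming standard `oc` tooling and illustrative resource names (a `backstage-backend` deployment in an `rhdh` namespace; actual names vary per install), the debug-logging and cached-data checks described in this module might look like:

```shell
# Sketch only; deployment, namespace, and ConfigMap names are assumptions.

# Enable debug logging on the backstage-backend container:
oc set env deployment/backstage-backend LOG_LEVEL=debug -n rhdh

# List ConfigMaps to locate the one holding the cached AI Model metadata:
oc get configmaps -n rhdh

# Dump a candidate ConfigMap to inspect the processed model metadata:
oc get configmap <configmap-name> -n rhdh -o yaml
```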

modules/openshift-ai-connector-for-rhdh/ref-enrich-ai-model-metadata.adoc

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@
 [id="ref-enrich-ai-model-metadata_{context}"]
 = Enrich AI model metadata for enhanced {product} experience
 
-While {rhoai-short} provides essential data, an AI platform engineer using {rhoai-short} can enrich the {backstage}/{product-very-short} experience by adding custom properties to the `ModelVersion` or `RegisteredModel` (or annotations to the `KServe InferenceService` if the model registry is not used) so that the {openshift-ai-connector-name} can add the information to the {product-very-short} entities it creates. For more details, see {rhoai-docs-link}/working_with_model_registries/index#editing-model-version-metadata-in-a-model-registry_model-registry[Editing model version metadata in a model registry].
+While {rhoai-short} provides essential data, an AI platform engineer using {rhoai-short} can enrich the {backstage}/{product-very-short} experience by adding `custom properties` to the `ModelVersion` or `RegisteredModel` (or annotations to the `KServe InferenceService` if the model registry is not used) so that the {openshift-ai-connector-name} can add the information to the {product-very-short} entities it creates. For more details, see {rhoai-docs-link}/working_with_model_registries/index#editing-model-version-metadata-in-a-model-registry_model-registry[Editing model version metadata in a model registry].
 
 |===
 |Property Key |Entity Field Populated |Description

modules/openshift-ai-connector-for-rhdh/ref-model-to-entity-mapping.adoc

Lines changed: 2 additions & 0 deletions
@@ -3,6 +3,8 @@
 [id="ref-model-to-entity-mapping_{context}"]
 = Model-to-Entity mapping
 
+The {openshift-ai-connector-name} integrates with the {openshift-ai-connector-name-short}, the model catalog, and KServe-based Model Deployments (InferenceServices). This integration automatically converts your AI/ML artifacts into familiar {backstage} entities, simplifying management and providing a unified view of your models.
+
 This offering interfaces with the {openshift-ai-connector-name-short}, model catalog, and KServe-based Model Deployments (InferenceServices) to create familiar {backstage} entities.
 
 |===

modules/openshift-ai-connector-for-rhdh/ref-openshift-ai-model-registry-and-model-catalog-queries.adoc

Lines changed: 6 additions & 6 deletions
@@ -5,42 +5,42 @@
 
 To access the same {rhoai-short} data as the connector, use `curl` to query the {rhoai-short} model registry and model catalog APIs, ensuring the `ServiceAccount` token has correct access control:
 
-* Example: Fetch registered models
+* Example showing how to fetch registered models
 +
 [source,bash]
 ----
 curl -k -H "Authorization: Bearer $TOKEN" $RHOAI_MODEL_REGISTRY_URL/api/model_registry/v1alpha3/registered_models | jq
 ----
 
-* Example: Fetch model versions
+* Example showing how to fetch model versions
 +
 [source,bash]
 ----
 curl -k -H "Authorization: Bearer $TOKEN" $RHOAI_MODEL_REGISTRY_URL/api/model_registry/v1alpha3/model_versions | jq
 ----
 
-* Example: Fetch model artifacts
+* Example showing how to fetch model artifacts
 +
 [source,bash]
 ----
 curl -k -H "Authorization: Bearer $TOKEN" $RHOAI_MODEL_REGISTRY_URL/api/model_registry/v1alpha3/model_artifacts | jq
 ----
 
-* Example: Fetch inference services
+* Example showing how to fetch inference services
 +
 [source,bash]
 ----
 curl -k -H "Authorization: Bearer $TOKEN" $RHOAI_MODEL_REGISTRY_URL/api/model_registry/v1alpha3/inference_services | jq
 ----
 
-* Example: Fetch serving environments
+* Example showing how to fetch serving environments
 +
 [source,bash]
 ----
 curl -k -H "Authorization: Bearer $TOKEN" $RHOAI_MODEL_REGISTRY_URL/api/model_registry/v1alpha3/serving_environments | jq
 ----
 
-* Example: Fetch catalog sources
+* Example showing how to fetch catalog sources
 +
 [source,bash]
 ----
