
Commit be10050

RHAI-ENG-306-modify-docs-on-deploying-llamastackdistribution-instance - modified distribution image name in example yaml and modified prereq
1 parent 46e0e7e commit be10050

2 files changed: +2 lines, -10 lines

modules/deploying-a-llama-model-with-kserve.adoc

Lines changed: 1 addition & 1 deletion

@@ -10,7 +10,7 @@ To use Llama Stack and retrieval-augmented generation (RAG) workloads in {produc
 
 * You have logged in to {productname-long}.
 * You have cluster administrator privileges for your {openshift-platform} cluster.
-* You have installed the Llama Stack Operator.
+* You have activated the Llama Stack Operator.
 ifdef::upstream[]
 For more information, see link:{odhdocshome}/working-with-rag/#installing-the-llama-stack-operator_rag[Installing the Llama Stack Operator].
 endif::[]
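
The updated prerequisite refers to activating the Llama Stack Operator rather than installing it separately. As a non-authoritative sketch of what activation usually involves (the `llamastackoperator` component key and the resource name below are assumptions, not taken from this commit), an administrator enables the component in the `DataScienceCluster` custom resource:

----
# Hypothetical sketch: activate the Llama Stack component through the
# DataScienceCluster CR. The component key "llamastackoperator" is an
# assumption; verify the exact key against your cluster's CRD.
apiVersion: datasciencecluster.opendatahub.io/v1
kind: DataScienceCluster
metadata:
  name: default-dsc               # assumed instance name
spec:
  components:
    llamastackoperator:
      managementState: Managed    # "Managed" activates the component
----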

modules/deploying-a-llamastackdistribution-instance.adoc

Lines changed: 1 addition & 9 deletions

@@ -6,13 +6,6 @@
 [role='_abstract']
 You can integrate LlamaStack and its retrieval-augmented generation (RAG) capabilities with your deployed Llama 3.2 model served by vLLM. This integration enables you to build intelligent applications that combine large language models (LLMs) with real-time data retrieval, providing more accurate and contextually relevant responses for your AI workloads.
 
-When you create a `LlamaStackDistribution` custom resource (CR), specify the Llama Stack image `quay.io/opendatahub/llama-stack:odh` in the `spec.server.distribution.image` field. The image is hosted on link:https://quay.io[Quay.io], a secure registry that provides vulnerability scanning, role-based access control, and globally distributed content delivery. Using this {org-name}-validated image ensures that your deployment automatically receives the latest security patches and compatibility updates. For more information about working with Quay.io, see link:https://docs.redhat.com/en/documentation/red_hat_quay/3/html/about_quay_io/quayio-overview[Quay.io overview].
-
-[IMPORTANT]
-====
-The Llama Stack image is hosted on link:https://quay.io[Quay.io] only during the Developer Preview phase of the Llama Stack integration with {productname-short}. When the Llama Stack integration reaches general availability, the image will be available on link:https://registry.redhat.io[registry.redhat.io].
-====
-
 ifdef::self-managed[]
 ifdef::disconnected[]
 If your cluster cannot pull images directly from public registries, first mirror the image to your local registry. For more information, see link:https://docs.redhat.com/en/documentation/openshift_container_platform/{ocp-latest-version}/html/disconnected_environments/mirroring-in-disconnected-environments#mirroring-images-disconnected-install[Mirroring images for disconnected installation] in the OpenShift documentation.

@@ -42,7 +35,6 @@ endif::[]
 
 .Procedure
 
-
 . Open a new terminal window.
 .. Log in to your {openshift-platform} cluster from the CLI:
 .. In the upper-right corner of the OpenShift web console, click your user name and select *Copy login command*.

@@ -119,7 +111,7 @@ spec:
       name: llama-stack
       port: 8321
     distribution:
-      image: quay.io/opendatahub/llama-stack:odh
+      image: rh-dev
     storage:
       size: "5Gi"
 ----
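
For context, the changed `image` value lives in the `spec.server.distribution` block of the `LlamaStackDistribution` custom resource. The following minimal sketch shows where the new `rh-dev` value fits; only the fields that appear in the diff (`name`, `port`, `distribution.image`, `storage.size`) come from this commit, while the `apiVersion`, metadata name, and `replicas` are assumptions for illustration:

----
# Hypothetical minimal LlamaStackDistribution CR around the changed field.
apiVersion: llamastack.io/v1alpha1    # assumed API group/version
kind: LlamaStackDistribution
metadata:
  name: example-llama-stack           # assumed name
spec:
  replicas: 1                         # assumed
  server:
    containerSpec:
      name: llama-stack
      port: 8321
    distribution:
      image: rh-dev                   # value introduced by this commit
    storage:
      size: "5Gi"
----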
