Commit 98e88bc

chore: separate yaml files for rhoai demo

1 parent aa0d6e0

File tree

3 files changed: +14 −16 lines changed

demos/llama-stack-openshift/README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -23,7 +23,7 @@ oc create secret generic llama-stack-inference-model-secret \

 ## Setup Deployment files
 ### Configuring the `kubeflow-ragas-config` ConfigMap
-Update the [kubeflow-ragas-config](deployment/llama-stack-distribution.yaml) with the following data:
+Update the [kubeflow-ragas-config](deployment/kubeflow-ragas-config.yaml) with the following data:
 ``` bash
 # See project README for more details
 EMBEDDING_MODEL=all-MiniLM-L6-v2
@@ -49,7 +49,7 @@ kubectl create secret generic kubeflow-pipelines-token \
 ```

 ## Deploy Llama Stack on OpenShift
-You can now deploy the configuration files and the Llama Stack distribution with `oc apply -f deployment/llama-stack-distribution.yaml`
+You can now deploy the configuration files and the Llama Stack distribution with `oc apply -f deployment/kubeflow-ragas-config.yaml` and `oc apply -f deployment/llama-stack-distribution.yaml`

 You should now have a Llama Stack server on OpenShift with the remote ragas eval provider configured.
 You can now follow the [remote_demo.ipynb](../../demos/remote_demo.ipynb) demo but ensure you are running it in a Data Science workbench and use the `LLAMA_STACK_URL` defined earlier. Alternatively you can run it locally if you create a Route.
````
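Before running the two `oc apply` commands the updated README prescribes, it can help to confirm that no `<...>` placeholder values are still sitting in the ConfigMap. A minimal sketch of such a pre-flight check (a hypothetical helper, not part of this commit):

```python
import re

def find_placeholders(yaml_text: str) -> list[str]:
    """Return any unfilled <...> placeholder values left in a config file."""
    return re.findall(r"<[^<>\n]+>", yaml_text)

# Example: one key still holds a placeholder, the other is filled in.
config = '''
data:
  KUBEFLOW_LLAMA_STACK_URL: "<your-llama-stack-url>"
  KUBEFLOW_BASE_IMAGE: "quay.io/diegosquayorg/my-ragas-provider-image:latest"
'''

leftover = find_placeholders(config)
if leftover:
    print("fill these before `oc apply`:", leftover)
```

Running the check against `deployment/kubeflow-ragas-config.yaml` before applying it would flag the three placeholder keys the file ships with.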
demos/llama-stack-openshift/deployment/kubeflow-ragas-config.yaml

Lines changed: 12 additions & 0 deletions

```diff
@@ -0,0 +1,12 @@
+apiVersion: v1
+kind: ConfigMap
+metadata:
+  name: kubeflow-ragas-config
+data:
+  EMBEDDING_MODEL: "all-MiniLM-L6-v2"
+  KUBEFLOW_LLAMA_STACK_URL: "<your-llama-stack-url>"
+  KUBEFLOW_PIPELINES_ENDPOINT: "<your-kfp-endpoint>"
+  KUBEFLOW_NAMESPACE: "<your-namespace>"
+  KUBEFLOW_BASE_IMAGE: "quay.io/diegosquayorg/my-ragas-provider-image:latest"
+  KUBEFLOW_RESULTS_S3_PREFIX: "s3://my-bucket/ragas-results"
+  KUBEFLOW_S3_CREDENTIALS_SECRET_NAME: "<secret-name>"
```
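With the ConfigMap now in its own file, a quick sanity check that it still carries every key the README's `bash` listing expects can catch drift between the two. A hedged sketch in plain Python (simple line parsing rather than a YAML library, to stay dependency-free; the key list comes from the diff above):

```python
REQUIRED_KEYS = {
    "EMBEDDING_MODEL",
    "KUBEFLOW_LLAMA_STACK_URL",
    "KUBEFLOW_PIPELINES_ENDPOINT",
    "KUBEFLOW_NAMESPACE",
    "KUBEFLOW_BASE_IMAGE",
    "KUBEFLOW_RESULTS_S3_PREFIX",
    "KUBEFLOW_S3_CREDENTIALS_SECRET_NAME",
}

def configmap_keys(text: str) -> set[str]:
    """Collect 'KEY: value' keys from the data: section of a simple ConfigMap.

    Assumes two-space indentation and no nested values, as in the file above.
    """
    keys = set()
    in_data = False
    for line in text.splitlines():
        if line.strip() == "data:":
            in_data = True
            continue
        if in_data and line.startswith("  ") and ":" in line:
            keys.add(line.split(":", 1)[0].strip())
        elif in_data and line and not line.startswith(" "):
            in_data = False  # left the data: block
    return keys

def missing_keys(text: str) -> set[str]:
    return REQUIRED_KEYS - configmap_keys(text)
```

For the file in this commit, `missing_keys` should come back empty; if a key is later dropped from the YAML but still read by the provider, the check names it.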

demos/llama-stack-openshift/deployment/llama-stack-distribution.yaml

Lines changed: 0 additions & 14 deletions

```diff
@@ -1,17 +1,3 @@
----
-apiVersion: v1
-kind: ConfigMap
-metadata:
-  name: kubeflow-ragas-config
-data:
-  EMBEDDING_MODEL: "all-MiniLM-L6-v2"
-  KUBEFLOW_LLAMA_STACK_URL: "<your-llama-stack-url>"
-  KUBEFLOW_PIPELINES_ENDPOINT: "<your-kfp-endpoint>"
-  KUBEFLOW_NAMESPACE: "<your-namespace>"
-  KUBEFLOW_BASE_IMAGE: "quay.io/diegosquayorg/my-ragas-provider-image:latest"
-  KUBEFLOW_RESULTS_S3_PREFIX: "s3://my-bucket/ragas-results"
-  KUBEFLOW_S3_CREDENTIALS_SECRET_NAME: "<secret-name>"
----
 apiVersion: llamastack.io/v1alpha1
 kind: LlamaStackDistribution
 metadata:
```
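The commit's whole point is splitting one multi-document YAML file into separate per-resource files. For anyone repeating that split elsewhere, the mechanical part can be sketched as splitting on standalone `---` document separators (an illustrative helper, not tooling from this repo; it would misfire on a `---` inside a quoted scalar):

```python
def split_yaml_docs(text: str) -> list[str]:
    """Split a multi-document YAML string on standalone '---' separator lines."""
    docs, current = [], []
    for line in text.splitlines():
        if line.strip() == "---":
            chunk = "\n".join(current).strip()
            if chunk:  # skip empty documents between separators
                docs.append(chunk)
            current = []
        else:
            current.append(line)
    chunk = "\n".join(current).strip()
    if chunk:
        docs.append(chunk)
    return docs

# Each returned document can then be written to its own file,
# as this commit does for the ConfigMap and the LlamaStackDistribution.
```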
