
Commit 47034aa

RHDHBUGS-2051: Adding a new topic on adding a new llm provider (#1403)
* First draft
* Included content for adding a new llm provider
* Added k8s secret
* Incorporated Jordan's comment
* Incorporated Judy's comments
1 parent c315e33 commit 47034aa

File tree

4 files changed: +73 -0 lines changed


artifacts/attributes.adoc

Lines changed: 1 addition & 0 deletions

@@ -116,6 +116,7 @@
 :customizing-book-link: {product-docs-link}/html-single/customizing_red_hat_developer_hub/index
 :customizing-book-title: Customizing {product}
 :default-helm-chart-values-link: link:https://github.com/redhat-developer/rhdh-chart/blob/release-{product-version}/charts/backstage/values.yaml
+:developer-lightspeed-link: {product-docs-link}/html-single/interacting_with_red_hat_developer_lightspeed_for_red_hat_developer_hub/index
 :discover-category-link: {product-docs-link}/#Discover
 :dynamic-plugins-default-yaml-link: link:https://github.com/redhat-developer/rhdh/blob/release-{product-version}/dynamic-plugins.default.yaml
 :dynamic-plugins-reference-book-link: {product-docs-link}/html-single/dynamic_plugins_reference/index

assemblies/assembly-customizing-developer-lightspeed.adoc

Lines changed: 2 additions & 0 deletions

@@ -13,3 +13,5 @@ include::modules/developer-lightspeed/proc-gathering-feedback.adoc[leveloffset=+1]
 include::modules/developer-lightspeed/proc-updating-the-system-prompt.adoc[leveloffset=+1]
 
 include::modules/developer-lightspeed/proc-customizing-the-chat-history-storage.adoc[leveloffset=+1]
+
+include::modules/developer-lightspeed/proc-changing-your-llm-provider.adoc[leveloffset=+1]
modules/developer-lightspeed/proc-changing-your-llm-provider.adoc

Lines changed: 63 additions & 0 deletions

@@ -0,0 +1,63 @@
:_mod-docs-content-type: PROCEDURE

[id="proc-changing-your-llm-provider_{context}"]
= Changing your LLM provider in {ls-short}

{ls-short} operates on a link:{developer-lightspeed-link}#con-about-bring-your-own-model_appendix-about-user-data-security[_Bring Your Own Model_] approach, meaning you must provide and configure access to your preferred Large Language Model (LLM) provider for the service to function. The Road-Core Service (RCS) acts as an intermediary layer that handles the configuration and setup of these LLM providers.

[IMPORTANT]
====
The LLM provider configuration section includes a mandatory dummy provider block. Due to limitations of Road Core, this dummy provider must remain present when working with {ls-short}. This block is typically marked with the comments `# Start: Do not remove this block` and `# End: Do not remove this block`, and must not be removed from the configuration file.
====

.Prerequisites

* The file that contains your API token is mounted into the RCS container so that its path is accessible to the RCS.

.Procedure

You can define additional LLM providers using either of the following methods:

* Recommended: In your Developer Lightspeed plugin configuration (the `lightspeed` section within the `lightspeed-app-config.yaml` file), define the new provider or providers under the `lightspeed.servers` key as shown in the following code:
+
[source,yaml]
----
lightspeed:
  servers:
    - id: _<my_new_provider>_
      url: _<my_new_url>_
      token: _<my_new_token>_
----
** Optional: You can set the `id`, `url`, and `token` values in a Kubernetes Secret and reference them as environment variables using the `envFrom` section:
+
[source,yaml]
----
containers:
  - name: my-container
    image: my-image
    envFrom:
      - secretRef:
          name: my-secret
----
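+
A minimal sketch of a Secret that could back the `secretRef` shown above; the Secret name and key names are assumptions, not values from this commit. With `envFrom`, each key in the Secret becomes an environment variable, which the `lightspeed-app-config.yaml` file can then reference through `${...}` substitution:
+
[source,yaml]
----
apiVersion: v1
kind: Secret
metadata:
  name: my-secret                      # must match the secretRef name above
type: Opaque
stringData:
  MY_PROVIDER_ID: my-new-provider      # hypothetical key names
  MY_PROVIDER_URL: https://api.example.com/v1
  MY_PROVIDER_TOKEN: example-token     # replace with your real token
----
+
With such a Secret in place, the server entry could use, for example, `token: ${MY_PROVIDER_TOKEN}`.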

* You can add new LLM providers by updating the `rcsconfig.yaml` file.
.. In the `llm_providers` section within your `rcsconfig.yaml` file, add your new provider configuration below the mandatory dummy provider block as shown in the following code:
+
[source,yaml]
----
llm_providers:
  # Start: Do not remove this block
  - name: dummy
    type: openai
    url: https://dummy.com
    models:
      - name: dummymodel
  # End: Do not remove this block
  - name: _<my_new_provider>_
    type: openai
    url: _<my_provider_url>_
    credentials_path: path/to/token
    disable_model_check: true
----
.. If you define a new provider in `rcsconfig.yaml`, you must configure the following critical parameters:
** `credentials_path`: Specifies the path to a `.txt` file that contains your API token. This file must be mounted and accessible by the RCS container; see the mount sketch after this list.
** `disable_model_check`: Set this field to `true` to allow the RCS to locate models through the `/v1/models` endpoint of the provider. When you set this field to `true`, you avoid the need to define model names explicitly in the configuration.
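
The following sketch, which is not part of this commit, shows one hypothetical way to satisfy the mount prerequisite: a Kubernetes Secret holding the token file is mounted into the RCS container, and `credentials_path` points at the mounted file. The container, volume, Secret, and path names are assumptions for illustration.

[source,yaml]
----
# Illustrative only: all names and paths are assumptions.
containers:
  - name: road-core-service
    image: my-rcs-image
    volumeMounts:
      - name: llm-token
        mountPath: /etc/llm-credentials
        readOnly: true
volumes:
  - name: llm-token
    secret:
      secretName: my-llm-token-secret   # Secret containing a key named token.txt
----

With such a mount, `rcsconfig.yaml` would set `credentials_path: /etc/llm-credentials/token.txt`.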

modules/developer-lightspeed/proc-installing-and-configuring-lightspeed.adoc

Lines changed: 7 additions & 0 deletions

@@ -28,11 +28,13 @@ metadata:
 data:
   rcsconfig.yaml: |
     llm_providers:
+      # Start: Do not remove this block
       - name: dummy
         type: openai
         url: https://dummy.com
         models:
           - name: dummymodel
+      # End: Do not remove this block
     ols_config:
       user_data_collection:
         log_level: "DEBUG"
@@ -66,6 +68,11 @@ data:
         user_agent: "example-user-agent"
         ingress_url: "https://example.ingress.com/upload"
 ----
+
+[IMPORTANT]
+====
+Do not remove the dummy provider block in the `llm_providers` section. This block is required when working with {ls-short} because of limitations in Road Core. If you decide to use an alternative LLM provider, see link:{developer-lightspeed-link}#proc-changing-your-llm-provider[Changing your LLM provider].
+====
 .. Optional: Configure the number of workers that scale the REST API by setting the `ols_config.max_workers` parameter in the {rcs-short} ConfigMap, as shown in the following example.
 +
 [source,yaml]
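
The hunk ends before the example body. For illustration only, a setting of the kind this step describes might look like the following; the value is an assumption, not taken from this commit.

[source,yaml]
----
ols_config:
  max_workers: 4   # illustrative value
----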
