
Commit d87e17e

Manual cherrypick for PR-1403 (#1407)
* First draft
* Fixing merge conflicts
* Staging files
1 parent ff2bc88 commit d87e17e

File tree

4 files changed: +64 -1 lines changed


artifacts/attributes.adoc

Lines changed: 1 addition & 0 deletions
@@ -115,6 +115,7 @@
 :control-access-category-link: {product-docs-link}/#Control access
 :customizing-book-link: {product-docs-link}/html-single/customizing_red_hat_developer_hub/index
 :customizing-book-title: Customizing {product}
+:developer-lightspeed-link: {product-docs-link}/html-single/interacting_with_red_hat_developer_lightspeed_for_red_hat_developer_hub/index
 :discover-category-link: {product-docs-link}/#Discover
 :dynamic-plugins-reference-book-link: {product-docs-link}/html-single/dynamic_plugins_reference/index
 :dynamic-plugins-reference-book-title: Dynamic plugins reference

assemblies/assembly-customizing-developer-lightspeed.adoc

Lines changed: 3 additions & 1 deletion
@@ -11,4 +11,6 @@ include::modules/developer-lightspeed/proc-gathering-feedback.adoc[leveloffset=+
 
 include::modules/developer-lightspeed/proc-updating-the-system-prompt.adoc[leveloffset=+1]
 
-include::modules/developer-lightspeed/proc-customizing-the-chat-history-storage.adoc[leveloffset=+1]
+include::modules/developer-lightspeed/proc-customizing-the-chat-history-storage.adoc[leveloffset=+1]
+
+include::modules/developer-lightspeed/proc-changing-your-llm-provider.adoc[leveloffset=+1]

modules/developer-lightspeed/proc-changing-your-llm-provider.adoc

Lines changed: 53 additions & 0 deletions

@@ -0,0 +1,53 @@
+:_mod-docs-content-type: PROCEDURE
+
+[id="proc-changing-your-llm-provider_{context}"]
+= Changing your LLM provider in {ls-short}
+
+{ls-short} operates on a link:{developer-lightspeed-link}#con-about-bring-your-own-model_appendix-about-user-data-security[_Bring Your Own Model_] approach, meaning you must provide and configure access to your preferred Large Language Model (LLM) provider for the service to function. The Road-Core Service (RCS) acts as an intermediary layer that handles the configuration and setup of these LLM providers.
+
+[IMPORTANT]
+====
+The LLM provider configuration section includes a mandatory dummy provider block. Due to limitations of Road Core, this dummy provider must remain present when working with {ls-short}. The block is marked with the comments `# Start: Do not remove this block` and `# End: Do not remove this block` and must not be removed from the configuration file.
+====
+
+.Prerequisites
+
+* The file that contains your API token is mounted into the RCS container, so that the path to the file is accessible by RCS.
+
+.Procedure
+
+You can define additional LLM providers by using one of two methods.
+
+* Recommended: In your Developer Lightspeed plugin configuration (for example, the `lightspeed` section within the `lightspeed-app-config.yaml` file), define the new provider or providers under the `lightspeed.servers` key, as shown in the following example:
++
+[source,yaml]
+----
+lightspeed:
+  servers:
+    - id: my-new-provider
+      url: my-new-url
+      token: my-new-token
+----
+
+* Alternatively, you can add new LLM providers by updating the `rcsconfig.yaml` file.
+.. In the `llm_providers` section within your `rcsconfig.yaml` file, add your new provider configuration below the mandatory dummy provider block, as shown in the following example:
++
+[source,yaml]
+----
+llm_providers:
+  # Start: Do not remove this block
+  - name: dummy
+    type: openai
+    url: https://dummy.com
+    models:
+      - name: dummymodel
+  # End: Do not remove this block
+  - name: my-new-provider
+    type: openai
+    url: my-provider-url
+    credentials_path: path/to/token
+    disable_model_check: true
+----
+.. When you define a new provider in `rcsconfig.yaml`, you must configure the following parameters:
+** `credentials_path`: Specifies the path to a `.txt` file that contains your API token. The file must be mounted into the RCS container and accessible at this path, as sketched after this procedure.
+** `disable_model_check`: Set this field to `true` so that RCS discovers models through the provider's `/v1/models` endpoint and you do not need to define model names explicitly in the configuration.
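
The `credentials_path` requirement means the token file must reach the RCS container's filesystem. The following is a minimal sketch of one way to do that with a Kubernetes Secret and a volume mount; the Secret name `llm-provider-token`, the Deployment and container names, the image reference, and the mount path are illustrative assumptions, not values from this commit.

[source,yaml]
----
# Hypothetical example: mount an API token Secret into the RCS container.
# All names, the image reference, and the mount path are illustrative.
apiVersion: v1
kind: Secret
metadata:
  name: llm-provider-token
stringData:
  apitoken.txt: "<your-provider-api-token>"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: lightspeed-rcs
spec:
  selector:
    matchLabels:
      app: lightspeed-rcs
  template:
    metadata:
      labels:
        app: lightspeed-rcs
    spec:
      containers:
        - name: road-core-service
          image: registry.example.com/road-core-service:latest  # illustrative image reference
          volumeMounts:
            - name: llm-token
              mountPath: /etc/llm-token  # the token file appears at /etc/llm-token/apitoken.txt
              readOnly: true
      volumes:
        - name: llm-token
          secret:
            secretName: llm-provider-token
----

With a mount like this, the `credentials_path` value in `rcsconfig.yaml` (shown as `path/to/token` in the example above) would point at the mounted file, for example `/etc/llm-token/apitoken.txt`.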

modules/developer-lightspeed/proc-installing-and-configuring-lightspeed.adoc

Lines changed: 7 additions & 0 deletions
@@ -29,11 +29,13 @@ metadata:
 data:
   rcsconfig.yaml: |
     llm_providers:
+      # Start: Do not remove this block
       - name: dummy
         type: openai
         url: https://dummy.com
         models:
           - name: dummymodel
+      # End: Do not remove this block
     ols_config:
       user_data_collection:
         log_level: "DEBUG"
@@ -67,6 +69,11 @@ data:
       user_agent: "example-user-agent"
       ingress_url: "https://example.ingress.com/upload"
 ----
+
+[IMPORTANT]
+====
+Do not remove the dummy provider block in the `llm_providers` section. This block is required when working with {ls-short} because of limitations in Road Core. If you decide to use an alternative LLM provider, see link:{developer-lightspeed-link}#proc-changing-your-llm-provider[Changing your LLM provider].
+====
 .. Optional: Configure the number of workers that scale the REST API by setting the `ols_config.max_workers` parameter in the {rcs-short} ConfigMap, as shown in the following example.
 +
 [source,yaml]
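----
# Illustrative sketch only (the original example body is not part of this hunk);
# assuming ols_config.max_workers takes an integer count of REST API worker processes.
ols_config:
  max_workers: 4
----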
