
Commit 06faa95

Included content for adding a new llm provider
1 parent 2e2c92c commit 06faa95

4 files changed: +63 -1 lines changed

artifacts/attributes.adoc

Lines changed: 1 addition & 0 deletions
@@ -116,6 +116,7 @@
 :customizing-book-link: {product-docs-link}/html-single/customizing_red_hat_developer_hub/index
 :customizing-book-title: Customizing {product}
 :default-helm-chart-values-link: link:https://github.com/redhat-developer/rhdh-chart/blob/release-{product-version}/charts/backstage/values.yaml
+:developer-lightspeed-link: {product-docs-link}/html-single/interacting_with_red_hat_developer_lightspeed_for_red_hat_developer_hub/index
 :discover-category-link: {product-docs-link}/#Discover
 :dynamic-plugins-default-yaml-link: link:https://github.com/redhat-developer/rhdh/blob/release-{product-version}/dynamic-plugins.default.yaml
 :dynamic-plugins-reference-book-link: {product-docs-link}/html-single/dynamic_plugins_reference/index

assemblies/assembly-customizing-developer-lightspeed.adoc

Lines changed: 2 additions & 0 deletions
@@ -13,3 +13,5 @@ include::modules/developer-lightspeed/proc-gathering-feedback.adoc[leveloffset=+1]
 include::modules/developer-lightspeed/proc-updating-the-system-prompt.adoc[leveloffset=+1]
 
 include::modules/developer-lightspeed/proc-customizing-the-chat-history-storage.adoc[leveloffset=+1]
+
+include::modules/developer-lightspeed/proc-changing-your-llm-provider.adoc[leveloffset=+1]
modules/developer-lightspeed/proc-changing-your-llm-provider.adoc

Lines changed: 53 additions & 0 deletions

@@ -0,0 +1,53 @@
:_mod-docs-content-type: PROCEDURE

[id="proc-changing-your-llm-provider_{context}"]
= Changing your LLM provider in {ls-short}

{ls-short} operates on a link:{developer-lightspeed-link}#con-about-bring-your-own-model_appendix-about-user-data-security[_Bring Your Own Model_] approach, which means that you must provide and configure access to your preferred Large Language Model (LLM) provider for the service to function. The Road-Core Service (RCS) acts as an intermediary layer that handles the configuration and setup of these LLM providers.
7+
8+
[IMPORTANT]
9+
====
10+
The LLM provider configuration section includes a mandatory dummy provider block. Due to limitations of Road Core, this dummy provider must remain present when working with Lightspeed. This block is typically marked with comments (# Start: Do not remove this block and # End: Do not remove this block) and must not be removed from the configuration file.
11+
====
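
For reference, the dummy provider block appears in the `llm_providers` section of the `rcsconfig.yaml` file as follows:

[source,yaml]
----
llm_providers:
  # Start: Do not remove this block
  - name: dummy
    type: openai
    url: https://dummy.com
    models:
      - name: dummymodel
  # End: Do not remove this block
----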

.Prerequisites
* The file that contains your API token must be mounted into the RCS container so that the path to the file is accessible from within the container.

.Procedure

You can define additional LLM providers by using one of the following two methods:

* Recommended: In your {ls-short} plugin configuration (for example, the `lightspeed` section within the `lightspeed-app-config.yaml` file), define the new provider or providers under the `lightspeed.servers` key, as shown in the following code:
+
[source,yaml]
----
lightspeed:
  servers:
    - id: my-new-provider
      url: my-new-url
      token: my-new-token
----

* Alternatively, add new LLM providers by updating the `rcsconfig.yaml` file.
.. In the `llm_providers` section of your `rcsconfig.yaml` file, add your new provider configuration below the mandatory dummy provider block, as shown in the following code:
+
[source,yaml]
----
llm_providers:
  # Start: Do not remove this block
  - name: dummy
    type: openai
    url: https://dummy.com
    models:
      - name: dummymodel
  # End: Do not remove this block
  - name: my-new-provider
    type: openai
    url: my-provider-url
    credentials_path: path/to/token
    disable_model_check: true
----
.. When you define a new provider in `rcsconfig.yaml`, you must configure the following parameters:
** `credentials_path`: Specifies the path to a `.txt` file that contains your API token. This file must be mounted into the RCS container and accessible from inside it, as shown in the sketch after this list.
** `disable_model_check`: Set this field to `true` so that the RCS locates models through the `/v1/models` endpoint of the provider. When you set this field to `true`, you do not need to define model names explicitly in the configuration.
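
The following is a minimal, hypothetical sketch of mounting the token file into the RCS container by using a Kubernetes Secret. The Secret name, key, container name, and mount path are placeholder values; adapt them to your deployment.

[source,yaml]
----
# Hypothetical example: all names and paths are placeholders.
apiVersion: v1
kind: Secret
metadata:
  name: my-llm-token
stringData:
  token.txt: <your-api-token>   # plain-text API token
---
# In the Deployment that runs the RCS container, mount the Secret so that
# the token file is readable at the path you set in credentials_path,
# for example /etc/llm/token.txt.
spec:
  template:
    spec:
      volumes:
        - name: llm-token
          secret:
            secretName: my-llm-token
      containers:
        - name: road-core-service   # placeholder container name
          volumeMounts:
            - name: llm-token
              mountPath: /etc/llm
              readOnly: true
----

With a layout similar to this sketch, you would set `credentials_path: /etc/llm/token.txt` in your `rcsconfig.yaml` file.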

modules/developer-lightspeed/proc-installing-and-configuring-lightspeed.adoc

Lines changed: 7 additions & 1 deletion
@@ -28,12 +28,13 @@ metadata:
 data:
   rcsconfig.yaml: |
     llm_providers:
+      # Start: Do not remove this block
       - name: dummy
         type: openai
-        # Do not remove the following url link even if you are not using the custom llm.
         url: https://dummy.com
         models:
           - name: dummymodel
+      # End: Do not remove this block
     ols_config:
       user_data_collection:
         log_level: "DEBUG"
@@ -67,6 +68,11 @@ data:
       user_agent: "example-user-agent"
       ingress_url: "https://example.ingress.com/upload"
 ----
++
+[IMPORTANT]
+====
+Do not remove the dummy provider block in the `llm_providers` section. This block is required because of limitations discovered in Road Core when you work with {ls-short}. If you decide to use an alternative LLM provider, see link:{developer-lightspeed-link}#proc-changing-your-llm-provider[Changing your LLM provider].
+====
 .. Optional: Configure the number of workers that scale the REST API by setting the `ols_config.max_workers` parameter in the {rcs-short} ConfigMap, as shown in the following example.
 +
 [source,yaml]
