1 change: 1 addition & 0 deletions artifacts/attributes.adoc
@@ -116,6 +116,7 @@
:customizing-book-link: {product-docs-link}/html-single/customizing_red_hat_developer_hub/index
:customizing-book-title: Customizing {product}
:default-helm-chart-values-link: link:https://github.com/redhat-developer/rhdh-chart/blob/release-{product-version}/charts/backstage/values.yaml
:developer-lightspeed-link: {product-docs-link}/html-single/interacting_with_red_hat_developer_lightspeed_for_red_hat_developer_hub/index
:discover-category-link: {product-docs-link}/#Discover
:dynamic-plugins-default-yaml-link: link:https://github.com/redhat-developer/rhdh/blob/release-{product-version}/dynamic-plugins.default.yaml
:dynamic-plugins-reference-book-link: {product-docs-link}/html-single/dynamic_plugins_reference/index
2 changes: 2 additions & 0 deletions assemblies/assembly-customizing-developer-lightspeed.adoc
@@ -13,3 +13,5 @@ include::modules/developer-lightspeed/proc-gathering-feedback.adoc[leveloffset=+1]
include::modules/developer-lightspeed/proc-updating-the-system-prompt.adoc[leveloffset=+1]

include::modules/developer-lightspeed/proc-customizing-the-chat-history-storage.adoc[leveloffset=+1]

include::modules/developer-lightspeed/proc-changing-your-llm-provider.adoc[leveloffset=+1]
73 changes: 73 additions & 0 deletions modules/developer-lightspeed/proc-changing-your-llm-provider.adoc
@@ -0,0 +1,73 @@
:_mod-docs-content-type: PROCEDURE

[id="proc-changing-your-llm-provider_{context}"]
= Changing your LLM provider in {ls-short}

{ls-short} operates on a link:{developer-lightspeed-link}#con-about-bring-your-own-model_appendix-about-user-data-security[_Bring Your Own Model_] approach, meaning you must provide and configure access to your preferred Large Language Model (LLM) provider for the service to function. The Road-Core Service (RCS) acts as an intermediary layer that handles the configuration and setup of these LLM providers.

[IMPORTANT]
====
The LLM provider configuration section includes a mandatory dummy provider block. Due to limitations of Road Core, this dummy provider must remain present when working with {ls-short}. The block is marked with the comments `# Start: Do not remove this block` and `# End: Do not remove this block`, and must not be removed from the configuration file.
====

.Prerequisites

* The file that contains your API token is mounted to the RCS container, so that the path to the file is accessible from within the container.

.Procedure

You can define additional LLM providers by using either of the following methods:

* Recommended: In your Developer Lightspeed plugin configuration (the `lightspeed` section within the `lightspeed-app-config.yaml` file), define one or more new providers under the `lightspeed.servers` key as shown in the following code:
+
[source,yaml]
----
lightspeed:
  servers:
    - id: _<my_new_provider>_
      url: _<my_new_url>_
      token: _<my_new_token>_
----
** Alternatively, you can set the `id`, `url`, and `token` values in a Kubernetes Secret and reference them using environment variables in your application configuration as shown in the following code:
+
[source,yaml]
----
env:
  - name: _<my_new_url>_
    valueFrom:
      secretKeyRef:
        name: my-secret
        key: _<my_new_url>_
  - name: _<my_new_provider>_
    valueFrom:
      secretKeyRef:
        name: my-secret
        key: _<my_new_provider>_
  - name: _<my_new_token>_
    valueFrom:
      secretKeyRef:
        name: my-secret
        key: _<my_new_token>_
----
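+
A Kubernetes Secret like the following can back those environment variables. This is a minimal sketch: the Secret name `my-secret` matches the example above, and the keys and values are placeholders that you must replace with your provider details:
+
[source,yaml]
----
apiVersion: v1
kind: Secret
metadata:
  name: my-secret
type: Opaque
stringData:
  # Replace the placeholder keys and values with your provider details
  _<my_new_provider>_: my-provider
  _<my_new_url>_: https://api.example.com/v1
  _<my_new_token>_: my-api-token
----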
* You can add new LLM providers by updating the `rcsconfig.yaml` file.
.. In the `llm_providers` section within your `rcsconfig.yaml` file, add your new provider configuration below the mandatory dummy provider block as shown in the following code:
+
[source,yaml]
----
llm_providers:
  # Start: Do not remove this block
  - name: dummy
    type: openai
    url: https://dummy.com
    models:
      - name: dummymodel
  # End: Do not remove this block
  - name: _<my_new_provider>_
    type: openai
    url: _<my_provider_url>_
    credentials_path: path/to/token
    disable_model_check: true
----
.. If you need to define a new provider in `rcsconfig.yaml`, you must configure the following critical parameters:
** `credentials_path`: Specifies the path to a `.txt` file that contains your API token. This file must be mounted and accessible by the RCS container.
** `disable_model_check`: Set this field to `true` to allow the RCS to locate models through the `/v1/models` endpoint of the provider. When you set this field to `true`, you avoid the need to define model names explicitly in the configuration.
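For example, the token file referenced by `credentials_path` can be mounted into the RCS container from a Kubernetes Secret. The following sketch uses illustrative names (a Secret named `llm-token`, a container named `road-core`, and a mount path of `/rcs/secrets`) that you must adapt to your deployment:

[source,yaml]
----
# Illustrative names: adjust the Secret, container, and mount path to your deployment
volumes:
  - name: llm-token
    secret:
      secretName: llm-token
containers:
  - name: road-core
    volumeMounts:
      - name: llm-token
        mountPath: /rcs/secrets
        readOnly: true
# In rcsconfig.yaml, credentials_path would then point to a file under /rcs/secrets
----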
@@ -28,11 +28,13 @@ metadata:
data:
  rcsconfig.yaml: |
    llm_providers:
      # Start: Do not remove this block
      - name: dummy
        type: openai
        url: https://dummy.com
        models:
          - name: dummymodel
      # End: Do not remove this block
    ols_config:
      user_data_collection:
        log_level: "DEBUG"
@@ -66,6 +68,11 @@ data:
        user_agent: "example-user-agent"
        ingress_url: "https://example.ingress.com/upload"
----
+
[IMPORTANT]
====
Do not remove the dummy provider block in the `llm_providers` section. This block is required because of limitations in Road Core. If you want to use an alternative LLM provider, see link:{developer-lightspeed-link}#proc-changing-your-llm-provider[Changing your LLM provider].
====
.. Optional: Configure the number of workers that scale the REST API by setting the `ols_config.max_workers` parameter in the {rcs-short} ConfigMap, as shown in the following example.
+
[source,yaml]