diff --git a/artifacts/attributes.adoc b/artifacts/attributes.adoc
index 610db55b76..dc7bb61deb 100644
--- a/artifacts/attributes.adoc
+++ b/artifacts/attributes.adoc
@@ -115,6 +115,7 @@
 :control-access-category-link: {product-docs-link}/#Control access
 :customizing-book-link: {product-docs-link}/html-single/customizing_red_hat_developer_hub/index
 :customizing-book-title: Customizing {product}
+:developer-lightspeed-link: {product-docs-link}/html-single/interacting_with_red_hat_developer_lightspeed_for_red_hat_developer_hub/index
 :discover-category-link: {product-docs-link}/#Discover
 :dynamic-plugins-reference-book-link: {product-docs-link}/html-single/dynamic_plugins_reference/index
 :dynamic-plugins-reference-book-title: Dynamic plugins reference
diff --git a/assemblies/assembly-customizing-developer-lightspeed.adoc b/assemblies/assembly-customizing-developer-lightspeed.adoc
index 04cfbb7695..d75dd0cf9a 100644
--- a/assemblies/assembly-customizing-developer-lightspeed.adoc
+++ b/assemblies/assembly-customizing-developer-lightspeed.adoc
@@ -11,4 +11,6 @@ include::modules/developer-lightspeed/proc-gathering-feedback.adoc[leveloffset=+1]
 
 include::modules/developer-lightspeed/proc-updating-the-system-prompt.adoc[leveloffset=+1]
 
-include::modules/developer-lightspeed/proc-customizing-the-chat-history-storage.adoc[leveloffset=+1]
\ No newline at end of file
+include::modules/developer-lightspeed/proc-customizing-the-chat-history-storage.adoc[leveloffset=+1]
+
+include::modules/developer-lightspeed/proc-changing-your-llm-provider.adoc[leveloffset=+1]
diff --git a/modules/developer-lightspeed/proc-changing-your-llm-provider.adoc b/modules/developer-lightspeed/proc-changing-your-llm-provider.adoc
new file mode 100644
index 0000000000..c22c984e89
--- /dev/null
+++ b/modules/developer-lightspeed/proc-changing-your-llm-provider.adoc
@@ -0,0 +1,53 @@
+:_mod-docs-content-type: PROCEDURE
+
+[id="proc-changing-your-llm-provider_{context}"]
+= Changing your LLM provider in {ls-short}
+
+{ls-short} operates on a link:{developer-lightspeed-link}#con-about-bring-your-own-model_appendix-about-user-data-security[_Bring Your Own Model_] approach: you must provide and configure access to your preferred Large Language Model (LLM) provider for the service to function. The Road-Core Service (RCS) acts as an intermediary layer that handles the configuration and setup of these LLM providers.
+
+[IMPORTANT]
+====
+The LLM provider configuration section includes a mandatory dummy provider block. Because of limitations in Road Core, this dummy provider must remain present when working with {ls-short}. The block is marked with the comments `# Start: Do not remove this block` and `# End: Do not remove this block`, and you must not remove it from the configuration file.
+====
+
+.Prerequisites
+
+* The file that contains your API token must be mounted to the RCS container so that its path is accessible from inside the container.
+
+.Procedure
+
+You can define additional LLM providers by using one of the following methods:
+
+* Recommended: In your {ls-short} plugin configuration (for example, the `lightspeed` section within the `lightspeed-app-config.yaml` file), define the new provider or providers under the `lightspeed.servers` key as shown in the following example:
++
+[source,yaml]
+----
+lightspeed:
+  servers:
+    - id: my-new-provider
+      url: my-new-url
+      token: my-new-token
+----
+
+* Alternatively, you can add new LLM providers by updating the `rcsconfig.yaml` file:
+.. In the `llm_providers` section within your `rcsconfig.yaml` file, add your new provider configuration below the mandatory dummy provider block as shown in the following example:
++
+[source,yaml]
+----
+llm_providers:
+  # Start: Do not remove this block
+  - name: dummy
+    type: openai
+    url: https://dummy.com
+    models:
+      - name: dummymodel
+  # End: Do not remove this block
+  - name: my-new-provider
+    type: openai
+    url: my-provider-url
+    credentials_path: path/to/token
+    disable_model_check: true
+----
+.. When you define a new provider in `rcsconfig.yaml`, you must configure the following parameters:
+** `credentials_path`: Specifies the path to a `.txt` file that contains your API token. This file must be mounted and accessible by the RCS container.
+** `disable_model_check`: Set this field to `true` so that the RCS locates models through the `/v1/models` endpoint of the provider. When this field is `true`, you do not need to define model names explicitly in the configuration.
\ No newline at end of file
diff --git a/modules/developer-lightspeed/proc-installing-and-configuring-lightspeed.adoc b/modules/developer-lightspeed/proc-installing-and-configuring-lightspeed.adoc
index da419123a9..7cd2fa35e5 100644
--- a/modules/developer-lightspeed/proc-installing-and-configuring-lightspeed.adoc
+++ b/modules/developer-lightspeed/proc-installing-and-configuring-lightspeed.adoc
@@ -29,11 +29,13 @@ metadata:
 data:
   rcsconfig.yaml: |
     llm_providers:
+      # Start: Do not remove this block
       - name: dummy
         type: openai
         url: https://dummy.com
         models:
           - name: dummymodel
+      # End: Do not remove this block
     ols_config:
       user_data_collection:
         log_level: "DEBUG"
@@ -67,6 +69,11 @@
       user_agent: "example-user-agent"
       ingress_url: "https://example.ingress.com/upload"
 ----
++
+[IMPORTANT]
+====
+Do not remove the marked block in the `llm_providers` section. This block is required when you work with {ls-short} because of limitations in Road Core. If you want to use an alternative LLM provider, see link:{developer-lightspeed-link}#proc-changing-your-llm-provider[Changing your LLM provider].
+====
 .. Optional: Configure the number of workers that scale the REST API by specifying the following example to the `ols_config.max_workers` parameter in the {rcs-short} ConfigMap.
 +
 [source,yaml]