Commit fcd0997

OLS-1628: Remove TP language from OLS docs
1 parent a246253 commit fcd0997

4 files changed: +31, -20 lines


about/ols-about-openshift-lightspeed.adoc

Lines changed: 7 additions & 1 deletion
@@ -11,7 +11,13 @@ The following topics provide an overview of {ols-official} and discuss functiona
 include::modules/ols-openshift-lightspeed-overview.adoc[leveloffset=+1]
 include::modules/ols-about-product-coverage.adoc[leveloffset=+2]
 include::modules/ols-openshift-requirements.adoc[leveloffset=+1]
-include::modules/ols-large-language-model-overview.adoc[leveloffset=+1]
+
+[role="_additional-resources"]
+.Additional resources
+
+* link:https://docs.redhat.com/en/documentation/openshift_container_platform/4.17/html/support/remote-health-monitoring-with-connected-clusters#about-remote-health-monitoring[About remote health monitoring]
+
+include::modules/ols-large-language-model-requirements.adoc[leveloffset=+1]
 //Xavier wanted to remove vLLM until further testing is performed.
 //include::modules/ols-about-openshift-ai-vllm.adoc[leveloffset=+2]
 include::modules/ols-supported-platforms.adoc[leveloffset=+1]

modules/ols-about-data-use.adoc

Lines changed: 3 additions & 3 deletions
@@ -5,8 +5,8 @@

 {ols-official} is a virtual assistant you interact with using natural language. Using the {ols-long} interface, you send chat messages that {ols-long} transforms and sends to the Large Language Model (LLM) provider you have configured for your environment. These messages can contain information about your cluster, cluster resources, or other aspects of your environment.

-The {ols-long} {ols-release} release has limited capabilities to filter or redact the information you provide to the LLM. Do not enter information into the {ols-long} interface that you do not want to send to the LLM provider.
+The {ols-long} service has limited capabilities to filter or redact the information you provide to the LLM. Do not enter information into the {ols-long} interface that you do not want to send to the LLM provider.

-By using the {ols-long} as part of the {ols-release} release, you agree that Red Hat may use all of the messages that you exchange with the LLM provider for any purpose. The transcript recording data uses the Red Hat Insights system’s back-end, and is subject to the same access restrictions and other security policies.
+By sending transcripts or feedback to Red{nbsp}Hat you agree that Red{nbsp}Hat can use the data for quality assurance purposes. The transcript recording data uses the back-end of the Red{nbsp}Hat{nbsp}Insights system, and is subject to the same access restrictions and other security policies.

-You may email mailto:[email protected][Red Hat] and request that your data be deleted at the end of the {ols-release} release period.
+You can email mailto:[email protected][Red Hat] and request that your data be deleted.

modules/ols-large-language-model-overview.adoc renamed to modules/ols-large-language-model-requirements.adoc

Lines changed: 10 additions & 13 deletions
@@ -3,41 +3,38 @@
 // * about/ols-about-openshift-lightspeed.adoc

 :_mod-docs-content-type: CONCEPT
-[id="ols-large-language-model-overview"]
-= Large Language Model (LLM) overview
-:context: ols-large-language-model-overview
+[id="ols-large-language-model-requirements"]
+= Large Language Model (LLM) requirements
+:context: ols-large-language-model-requirements

 A large language model (LLM) is a type of machine learning model that can interpret and generate human-like language. When an LLM is used with a virtual assistant the LLM can interpret questions accurately and provide helpful answers in a conversational manner.

-As part of the {ols-release} release, {ols-long} can rely on the following Software as a Service (SaaS) LLM providers:
+The {ols-long} service must have access to an LLM provider. The service does not provide an LLM for you, so the LLM must be configured prior to installing the {ols-long} Operator.
+
+The {ols-long} service can rely on the following Software as a Service (SaaS) LLM providers:

 * OpenAI

 * {azure-openai}

 * {watsonx}

-[NOTE]
-====
-Many self-hosted or self-managed model servers claim API compatibility with OpenAI. It is possible to configure the {ols-long} OpenAI provider to point to an API-compatible model server. If the model server is truly API-compatible, especially with respect to authentication, then it may work. These configurations have not been tested by Red Hat, and issues related to their use are outside the scope of {ols-release} support.
-====
-
-For {ols-long} configurations with {rhoai} or {rhelai}, you must host your own LLM provider rather than use a SaaS LLM provider.
+If you want to self-host a model, you can use {rhoai} or {rhelai} as your model provider.

 [id="ibm-watsonx_{context}"]
 == {watsonx}

-To use {watsonx} with {ols-official}, you need an account with link:https://www.ibm.com/products/watsonx-ai[IBM Cloud's watsonx].
+To use {watsonx} with {ols-official}, you need an account with link:https://www.ibm.com/products/watsonx-ai[IBM Cloud watsonx]. For more information, see the link:https://dataplatform.cloud.ibm.com/docs/content/wsj/getting-started/welcome-main.html?context=wx[Documentation for IBM watsonx as a Service].

 [id="open-ai_{context}"]
 == Open AI

-To use {openai} with {ols-official}, you need access to the {openai} link:https://openai.com/api/[API platform].
+To use {openai} with {ols-official}, you need access to the {openai} link:https://openai.com/api/[API platform]. For more information, see the link:https://platform.openai.com/docs/overview[OpenAI developer platform] documentation.

 [id="azure-open-ai_{context}"]
 == {azure-openai}

-To use {azure-official} with {ols-official}, you need access to {azure-openai}.
+To use {azure-official} with {ols-official}, you need access to link:https://azure.microsoft.com/en-us/[{azure-openai}]. For more information, see the link:https://learn.microsoft.com/en-us/azure/ai-services/openai/[Azure OpenAI documentation].

 [id="rhelai_{context}"]
 == {rhelai}
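The requirements above say the service needs a pre-configured LLM connection, which is supplied through the {ols-long} Operator's configuration resource. The following is a non-authoritative sketch of what an `OLSConfig` resource pointing the service at an OpenAI provider might look like; the `apiVersion`, field names, secret name, and model name are illustrative assumptions, so consult the Operator's `OLSConfig` reference for the exact schema:

```yaml
# Hypothetical sketch of an OLSConfig custom resource; field names and
# values are assumptions, not the definitive schema.
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
metadata:
  name: cluster
spec:
  llm:
    providers:
    - name: openai
      type: openai
      credentialsSecretRef:
        name: openai-api-keys   # assumed Secret holding the API token
      models:
      - name: gpt-4o-mini       # example model name
  ols:
    defaultProvider: openai
    defaultModel: gpt-4o-mini
```

In this sketch the API credential lives in a separate Secret referenced by name, which keeps the token out of the configuration resource itself.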
modules/ols-openshift-requirements.adoc

Lines changed: 11 additions & 3 deletions
@@ -1,8 +1,16 @@
+// Module included in the following assemblies:
+// * lightspeed-docs-main/about/ols-about-openshift-lightspeed.adoc
+
 :_mod-docs-content-type: CONCEPT
-[id="ols-openshift-requirements"]
+[id="ols-openshift-requirements_context"]
 = OpenShift Requirements
-:context: ols-openshift-requirements

 {ols-long} requires {ocp-product-title} 4.15 or later running on x86 hardware. Any installation type or deployment architecture is supported so long as the cluster is 4.15+ and x86-based.

-For the {ols-long} {ols-release} release, the cluster you use must be connected to the Internet and it must have telemetry enabled. Telemetry is enabled by default. If you are using a standard installation process for {ocp-short-name} confirm that it does not disable telemetry.
+Telemetry is enabled on {ocp-product-title} clusters by default.
+
+* If the cluster has telemetry enabled, the {ols-long} service sends conversations and feedback to Red{nbsp}Hat by default.
+
+* If the cluster has telemetry disabled, the {ols-long} service does not send conversations and feedback to Red{nbsp}Hat.
+
+* If the cluster has telemetry enabled, and you do not want the {ols-long} service to send conversations and feedback to Red{nbsp}Hat, you must disable telemetry.
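Per the remote health monitoring documentation referenced above, opting out of telemetry on OpenShift is done by removing the `cloud.openshift.com` auth entry from the cluster's global pull secret. As a minimal sketch (assuming the secret's `.dockerconfigjson` payload has already been extracted with `oc`, which is not shown here), the JSON edit itself looks like this:

```python
import json

def disable_telemetry(dockerconfigjson: str) -> str:
    """Remove the cloud.openshift.com auth entry from a pull-secret payload.

    Writing the edited payload back to the pull secret is what disables
    telemetry; this function only performs the JSON edit.
    """
    config = json.loads(dockerconfigjson)
    config.get("auths", {}).pop("cloud.openshift.com", None)
    return json.dumps(config)

# Example payload with a telemetry entry and an unrelated registry entry.
secret = json.dumps({
    "auths": {
        "cloud.openshift.com": {"auth": "dGVsZW1ldHJ5"},
        "quay.io": {"auth": "cmVnaXN0cnk="},
    }
})
print(disable_telemetry(secret))
```

The extraction and write-back steps use `oc` against the `pull-secret` Secret in the `openshift-config` namespace; see the linked remote health monitoring documentation for the exact commands and their cluster-wide consequences.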
