// * about/ols-about-openshift-lightspeed.adoc

:_mod-docs-content-type: CONCEPT
[id="ols-large-language-model-requirements"]
= Large Language Model (LLM) requirements
:context: ols-large-language-model-requirements

A large language model (LLM) is a type of machine learning model that can interpret and generate human-like language. When an LLM is used with a virtual assistant, the LLM can interpret questions accurately and provide helpful answers in a conversational manner.

The {ols-long} service must have access to an LLM provider. The service does not provide an LLM for you, so the LLM must be configured before you install the {ols-long} Operator.

The {ols-long} service can rely on the following Software as a Service (SaaS) LLM providers:

* OpenAI

* {azure-openai}

* {watsonx}

If you want to self-host a model, you can use {rhoai} or {rhelai} as your model provider.

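Whichever provider you choose, the Operator reads the provider details from a custom resource after installation. The following sketch is illustrative only: the `apiVersion`, field names, model name, and the `openai-api-keys` secret name are assumptions, and you should check them against the {ols-long} Operator reference documentation for your release.

[source,yaml]
----
# Illustrative OLSConfig sketch (field names and values are assumptions).
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
metadata:
  name: cluster
spec:
  llm:
    providers:
    - name: openai                # hypothetical provider entry
      type: openai
      credentialsSecretRef:
        name: openai-api-keys     # secret holding the provider API token (assumed name)
      models:
      - name: gpt-4o              # example model identifier
----

The credentials secret referenced here must exist before the service can reach the provider, which is why the LLM provider has to be set up before you install the Operator.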
[id="ibm-watsonx_{context}"]
== {watsonx}

To use {watsonx} with {ols-official}, you need an account with link:https://www.ibm.com/products/watsonx-ai[IBM Cloud watsonx]. For more information, see the link:https://dataplatform.cloud.ibm.com/docs/content/wsj/getting-started/welcome-main.html?context=wx[Documentation for IBM watsonx as a Service].

[id="open-ai_{context}"]
== OpenAI

To use {openai} with {ols-official}, you need access to the {openai} link:https://openai.com/api/[API platform]. For more information, see the link:https://platform.openai.com/docs/overview[OpenAI developer platform] documentation.

[id="azure-open-ai_{context}"]
== {azure-openai}

To use {azure-official} with {ols-official}, you need access to link:https://azure.microsoft.com/en-us/[{azure-openai}]. For more information, see the link:https://learn.microsoft.com/en-us/azure/ai-services/openai/[Azure OpenAI documentation].

[id="rhelai_{context}"]
== {rhelai}