
Commit eff6ff7
Author: Prabha Kylasamiyer Sundara Rajan
MTA-5378 - LLM Configurations for Developer Lightspeed
1 parent 0de6fce commit eff6ff7
20 files changed: +757 −0 lines changed
Lines changed: 27 additions & 0 deletions
@@ -0,0 +1,27 @@
:_newdoc-version: 2.18.3
:_template-generated: 2025-04-08

ifdef::context[:parent-context-of-configuring-openshift-ai: {context}]

:_mod-docs-content-type: ASSEMBLY

ifndef::context[]
[id="configuring-openshift-ai"]
endif::[]
ifdef::context[]
[id="configuring-openshift-ai_{context}"]
endif::[]
= Configuring {ocp-short} AI
:context: configuring-openshift-ai

abc

include::topics/developer-lightspeed/proc_creating-datascience-cluster.adoc[leveloffset=+1]

include::topics/developer-lightspeed/proc_configuring-llm-serving-runtime.adoc[leveloffset=+1]

include::topics/developer-lightspeed/proc_creating-accelerator-profile.adoc[leveloffset=+1]


ifdef::parent-context-of-configuring-openshift-ai[:context: {parent-context-of-configuring-openshift-ai}]
ifndef::parent-context-of-configuring-openshift-ai[:!context:]
Lines changed: 39 additions & 0 deletions
@@ -0,0 +1,39 @@
:_newdoc-version: 2.18.3
:_template-generated: 2025-04-08

ifdef::context[:parent-context-of-configuring-llm: {context}]

:_mod-docs-content-type: ASSEMBLY

ifndef::context[]
[id="configuring-llm"]
endif::[]
ifdef::context[]
[id="configuring-llm_{context}"]
endif::[]
= Configuring large language models for analysis
:context: configuring-llm

{mta-dl-plugin} works with large language models (LLMs) that run in different environments to support analyzing Java applications in a wide range of scenarios. You can choose an LLM from well-known providers, a local model that you run from Ollama or Podman Desktop, or an OpenAI API-compatible model that is available as a model-as-a-service deployment.

The result of an analysis performed by {mta-dl-plugin} depends on the parameter configuration of the LLM that you choose. To use {mta-dl-plugin} for analysis, you must deploy your LLM and then configure the mandatory settings (for example, the API key and secret) and other parameters for your LLM.

You can run an LLM from the following providers:

* OpenAI
* Azure OpenAI
* Google Gemini
* Amazon Bedrock
* DeepSeek
* OpenShift AI

include::topics/developer-lightspeed/con_model-as-a-service.adoc[leveloffset=+1]

include::assembly_maas-oc-install-config.adoc[leveloffset=+2]

include::assembly_configuring-openshift-ai.adoc[leveloffset=+2]

include::topics/developer-lightspeed/proc_configuring-llm-podman-desktop.adoc[leveloffset=+1]

ifdef::parent-context-of-configuring-llm[:context: {parent-context-of-configuring-llm}]
ifndef::parent-context-of-configuring-llm[:!context:]
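The mandatory settings mentioned in this assembly vary by provider. As an illustrative sketch only (the file layout and key names below are assumptions for this example, not the documented schema from this commit), a provider configuration for an OpenAI model might look like:

```yaml
# Hypothetical provider-settings sketch. The keys "models", "provider",
# and "args" are assumptions for illustration; consult the product's
# provider settings reference for the actual schema.
models:
  my-openai-model:
    provider: "ChatOpenAI"      # provider implementation to use
    args:
      model: "gpt-4o"           # model identifier at the provider
environment:
  OPENAI_API_KEY: "<your-api-key>"   # mandatory secret for this provider
```

A configuration like this pairs the chosen provider with its mandatory credentials, which is the step the paragraph above requires before analysis can run.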
Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
:_newdoc-version: 2.18.3
:_template-generated: 2025-04-08

ifdef::context[:parent-context-of-maas-oc-install-config: {context}]

:_mod-docs-content-type: ASSEMBLY

ifndef::context[]
[id="maas-oc-install-config"]
endif::[]
ifdef::context[]
[id="maas-oc-install-config_{context}"]
endif::[]
= Installing and configuring an {ocp-short} cluster
:context: maas-oc-install-config

abc

include::topics/developer-lightspeed/proc_install-oc-cluster.adoc[leveloffset=+1]

include::topics/developer-lightspeed/proc_creating-identity-provider.adoc[leveloffset=+1]

include::topics/developer-lightspeed/proc_configuring-operators.adoc[leveloffset=+1]

include::topics/developer-lightspeed/proc_creating-gpu-machine-set.adoc[leveloffset=+1]

include::topics/developer-lightspeed/proc_configuring-node-auto-scaling.adoc[leveloffset=+1]

ifdef::parent-context-of-maas-oc-install-config[:context: {parent-context-of-maas-oc-install-config}]
ifndef::parent-context-of-maas-oc-install-config[:!context:]
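The GPU machine set and node auto-scaling procedures included above typically revolve around OpenShift autoscaler resources. As a hedged sketch (the resource name, namespace conventions, and replica bounds are placeholders, and the exact steps belong to the included procedures), a MachineAutoscaler that scales a GPU machine set might look like:

```yaml
# Illustrative sketch of an OpenShift MachineAutoscaler that targets a
# GPU MachineSet; the machine set name and replica bounds are placeholders.
apiVersion: autoscaling.openshift.io/v1beta1
kind: MachineAutoscaler
metadata:
  name: gpu-machine-autoscaler
  namespace: openshift-machine-api
spec:
  minReplicas: 1            # keep at least one GPU node available
  maxReplicas: 2            # cap GPU node count to control cost
  scaleTargetRef:
    apiVersion: machine.openshift.io/v1beta1
    kind: MachineSet
    name: <gpu-machine-set-name>   # placeholder: your GPU MachineSet
```

A MachineAutoscaler only scales its target machine set; cluster-wide limits come from a separate ClusterAutoscaler resource, which the node auto-scaling procedure would configure.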
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
../../topics/
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
../../assemblies/
Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
<title>Developer Lightspeed Guide</title>
<productname>{DocInfoProductName}</productname>
<productnumber>{DocInfoProductNumber}</productnumber>
<subtitle>Using the {ProductName} Developer Lightspeed to modernize your applications</subtitle>
<abstract>
<para>You can use {ProductFullName} Developer Lightspeed for application modernization in your organization by running Artificial Intelligence-driven static code analysis for Java applications.</para>
</abstract>
<authorgroup>
<orgname>Red Hat Customer Content Services</orgname>
</authorgroup>
<xi:include href="Common_Content/Legal_Notice.xml" xmlns:xi="http://www.w3.org/2001/XInclude" />
Lines changed: 28 additions & 0 deletions
@@ -0,0 +1,28 @@
:mta:
include::topics/templates/document-attributes.adoc[]
:_mod-docs-content-type: ASSEMBLY
[id="mta-developer-lightspeed"]
= MTA Developer Lightspeed Guide

:toc:
:toclevels: 4
:numbered:
:imagesdir: topics/images
:context: mta-developer-lightspeed
:mta-developer-lightspeed:

//Inclusive language statement
include::topics/making-open-source-more-inclusive.adoc[]

include::assemblies/developer-lightspeed-guide/assembly_configuring_llm.adoc[leveloffset=+1]

:!mta-developer-lightspeed:
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
../topics/
Lines changed: 19 additions & 0 deletions
@@ -0,0 +1,19 @@
:_newdoc-version: 2.15.0
:_template-generated: 2024-2-21

:_mod-docs-content-type: CONCEPT

[id="model-as-a-service_{context}"]
= Deploying an LLM as a scalable service

[role="_abstract"]
{mta-dl-plugin} also supports large language models (LLMs) that are deployed as a scalable service on {ocp-full} clusters. These deployments, called model-as-a-service (MaaS), provide you with greater control to optimize resources such as compute, cluster nodes, and auto-scaling Graphics Processing Units (GPUs), while enabling you to leverage artificial intelligence to perform operations at a large scale.

The workflow for configuring an LLM on {ocp-short} AI can be broadly divided into the following parts:

* Installing and configuring resources: from creating an {ocp} cluster to configuring node auto-scaling
* Configuring OpenShift AI: from creating a data science project to creating an accelerator profile
* Deploying the LLM: from uploading a model to deploying the model
* Preparing the LLM for analysis: from downloading the CA certificates to updating the `provider.settings` file
* Configuring monitoring and alerting for the storage resource: creating a ConfigMap for monitoring storage and an alerting configuration file
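The monitoring part of the workflow above relies on the cluster's monitoring stack. As a hedged illustration (the exact procedure belongs to the included topics; this is only one common prerequisite on OpenShift), user-workload monitoring is enabled through the standard `cluster-monitoring-config` ConfigMap:

```yaml
# Illustrative sketch: enabling user-workload monitoring on OpenShift,
# a typical prerequisite before monitoring storage used by model serving.
# The assembly's own procedure may differ in detail.
apiVersion: v1
kind: ConfigMap
metadata:
  name: cluster-monitoring-config
  namespace: openshift-monitoring
data:
  config.yaml: |
    enableUserWorkload: true
```

With user-workload monitoring enabled, alerting rules for the storage resource can then be defined in a separate alerting configuration, as the last workflow item describes.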
