
Commit de0b8b8

Merge pull request #170 from redhat-developer-demos/quarkus-ai-update
Maintenance of some of the Quarkus AI stuff
2 parents c83e3d3 + 013de37

File tree

7 files changed (+81 / -24 lines)

documentation/modules/ROOT/nav.adoc

Lines changed: 1 addition & 0 deletions
@@ -23,6 +23,7 @@
** xref:16_kafka-and-streams.adoc[Apache Kafka with Reactive Streams]

* AI
+** xref:17_ai_intro.adoc[AI with Quarkus]
** xref:17_prompts.adoc[Working with prompts]
** xref:18_chains_memory.adoc[Chains and Memory]
** xref:19_agents_tools.adoc[Agents/Tools]

documentation/modules/ROOT/pages/17_ai_intro.adoc

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
+= Quarkus and AI
+
+AI is becoming an intrinsic part of software development. It can help us write, test, and debug code. We can also infuse AI models directly into our applications. Conversely, we can create functions/tools that AI agents can call to augment their capabilities and knowledge.
+
+Quarkus supports a few different ways to work with AI, mainly leveraging the LangChain4j extension. There are also other extensions, such as the Quarkus MCP server, which allows you to serve tools to be consumed by AI agents.
+
+In this chapter, we'll explore how to work with AI models. We'll cover:
+* Prompting AI models in your applications
+* Preserving state between calls
+* Creating Tools for use by AI Agents
+* Embedding Documents that can be queried by LLMs
+* Building a chatbot
+* Working with local models (using Podman Desktop AI Lab)
+

documentation/modules/ROOT/pages/17_prompts.adoc

Lines changed: 16 additions & 5 deletions
@@ -4,11 +4,17 @@

The Quarkus LangChain4j extension seamlessly integrates Large Language Models (LLMs) into Quarkus applications. LLMs are AI-based systems designed to understand, generate, and manipulate human language, showcasing advanced natural language processing capabilities. Thanks to this extension, we can enable the harnessing of LLM capabilities for the development of more intelligent applications.

-In this first chapter, we'll explore the simplest of interactions with an LLM: Prompting. It essentially means just asking questions to an LLM and receiving an answer in natural language from a given Model, such as OpenAI, Mistral, Hugging Face, Ollama, etc.
+In this first chapter, we'll explore the simplest of interactions with an LLM: prompting. It essentially means asking questions of an LLM and receiving an answer in natural language from a given model, such as ChatGPT, Granite, Mistral, etc.


== Creating a Quarkus & LangChain4j Application

+We're going to use the langchain4j-openai extension for our first interaction with models.
+The openai extension supports models that expose the open-source OpenAI API specification.
+Several models and model providers expose this API specification. If you want to use
+a different API spec, you can likely find a supported extension in the https://docs.quarkiverse.io/quarkus-langchain4j/dev/llms.html[Quarkus documentation].
+
[tabs%sync]
====
@@ -18,7 +24,7 @@ Maven::
[.console-input]
[source,bash,subs="+macros,+attributes"]
----
-mvn "io.quarkus.platform:quarkus-maven-plugin:create" -DprojectGroupId="com.redhat.developers" -DprojectArtifactId="{project-ai-name}" -DprojectVersion="1.0-SNAPSHOT" -Dextensions=rest,langchain4j-core,langchain4j-openai
+mvn "io.quarkus.platform:quarkus-maven-plugin:create" -DprojectGroupId="com.redhat.developers" -DprojectArtifactId="{project-ai-name}" -DprojectVersion="1.0-SNAPSHOT" -Dextensions=rest,langchain4j-openai
cd {project-ai-name}
----
--
@@ -29,7 +35,7 @@ Quarkus CLI::
[.console-input]
[source,bash,subs="+macros,+attributes"]
----
-quarkus create app -x rest -x langchain4j-openai -x langchain4j-core com.redhat.developers:{project-ai-name}:1.0-SNAPSHOT
+quarkus create app -x rest -x langchain4j-openai com.redhat.developers:{project-ai-name}:1.0-SNAPSHOT
cd {project-ai-name}
----
--
@@ -44,9 +50,13 @@ LangChain4j provides you a proxy to connect your application to OpenAI by just a
[.console-input]
[source,properties]
----
+# Free demo key for basic usage of OpenAI ChatGPT
quarkus.langchain4j.openai.api-key=demo
+# Change this URL to the model provider of your choice
+quarkus.langchain4j.openai.base-url=https://api.openai.com/v1
----

+
== Create the AI service

First we need to create an interface for our AI service.
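
The interface itself isn't included in this diff excerpt. Purely as an illustrative sketch of the pattern this step describes (the method name and signature are assumptions, not necessarily the tutorial's actual file), a quarkus-langchain4j AI service looks roughly like this:

[source,java]
----
package com.redhat.developers;

import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

// Quarkus generates an implementation of this interface and routes each call
// to the model configured in application.properties.
@RegisterAiService
public interface Assistant {

    // The argument is sent as the user message; the return value is the model's answer.
    String chat(@UserMessage String question);
}
----
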
@@ -68,7 +78,7 @@ public interface Assistant {

== Create the prompt-based resource

-Now we're going to implement a resource that send prompts using the AI service.
+Now we're going to implement a resource that sends prompts using the AI service.

Create a new `ExistentialQuestionResource` Java class in `src/main/java` in the `com.redhat.developers` package with the following contents:

@@ -136,7 +146,8 @@ You can also run the following command:
curl -w '\n' localhost:8080/earth/flat
----

-An example of output (it can vary on each prompt execution):
+An example of the output you might see (yours will likely be slightly different,
+depending on the response from the non-deterministic LLM):

[.console-output]
[source,text]
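
The `ExistentialQuestionResource` class referenced above is not shown in this diff. As an illustrative sketch only (the path comes from the `curl` command above; the prompt text, method name, and injection style are assumptions), a resource that delegates to the AI service could look like this:

[source,java]
----
package com.redhat.developers;

import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/earth")
public class ExistentialQuestionResource {

    private final Assistant assistant;

    // Quarkus injects the AI service implementation generated from the interface above.
    public ExistentialQuestionResource(Assistant assistant) {
        this.assistant = assistant;
    }

    @GET
    @Path("/flat")
    @Produces(MediaType.TEXT_PLAIN)
    public String isEarthFlat() {
        // Illustrative prompt; the tutorial's wording may differ.
        return assistant.chat("Can you explain to me if the Earth is flat or not?");
    }
}
----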

documentation/modules/ROOT/pages/18_chains_memory.adoc

Lines changed: 2 additions & 2 deletions
@@ -8,7 +8,7 @@ In this section, we'll cover how we can achieve this with the LangChain4j extens

== Create an AI service with memory

-Let's create an interface for our AI service, but with memory feature this time.
+Let's create an interface for our AI service, but with memory this time.

Create a new `AssistantWithMemory` Java interface in `src/main/java` in the `com.redhat.developers` package with the following contents:

@@ -255,4 +255,4 @@ The result will be at your Quarkus terminal. An example of output (it can vary o
------------------------------------------
----

-NOTE: Take a close look at the IDs of our calls to the assistant. Do you notice that the last question was in fact directed to Klaus with ID=1? We were indeed able to maintain 2 separate and concurrent conversations with the LLM!
+NOTE: Take a close look at the IDs of our calls to the assistant. Do you notice that the last question was in fact directed to Klaus with ID=1? We were indeed able to maintain 2 separate and concurrent conversations with the LLM.
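
The `AssistantWithMemory` interface itself is not part of this diff. As a hedged sketch of how the memory-id pattern that the NOTE refers to is typically declared with quarkus-langchain4j (names and signature assumed, not the tutorial's actual file):

[source,java]
----
package com.redhat.developers;

import dev.langchain4j.service.MemoryId;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService
public interface AssistantWithMemory {

    // Calls that pass the same memory id share conversation history, so id 1 and
    // id 2 behave as two independent, concurrent chats with the same model.
    String chat(@MemoryId int memoryId, @UserMessage String message);
}
----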

documentation/modules/ROOT/pages/19_agents_tools.adoc

Lines changed: 34 additions & 11 deletions
@@ -12,7 +12,7 @@ You can read more about this in the https://docs.quarkiverse.io/quarkus-langchai

== Add the Mailer and Mailpit extensions

-Open a new terminal window, and make sure you’re at the root of your `{project-ai-name}` project, then run:
+Open a new terminal window, and make sure you’re at the root of your `{project-ai-name}` project, then run the following command to add emailing capabilities to our application:

[tabs]
====
@@ -79,7 +79,8 @@ public class EmailService {
Let's create an interface for our AI service, but with `SystemMessage` and `UserMessage` this time.
`SystemMessage` gives context to the AI Model.
In this case, we tell it that it should craft a message as if it is written by a professional poet.
-The `UserMessage` is the actual instruction/question we're sending to the AI model. As you can see in the example below,
+The `UserMessage` is the actual instruction/question we're sending to the AI model.
+As you can see in the example below,
you can format and parameterize the `UserMessage`, translating structured content to text and vice-versa.

Create a new `AssistantWithContext` Java interface in `src/main/java` in the `com.redhat.developers` package with the following contents:
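
The actual `EmailService` and `AssistantWithContext` sources are not included in this diff excerpt. As a rough, assumed sketch of the pattern being described (a tool method the model may decide to call, plus a system/user message pair; in the tutorial these live in separate files and the message text will differ):

[source,java]
----
package com.redhat.developers;

import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import io.quarkus.mailer.Mail;
import io.quarkus.mailer.Mailer;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
class EmailService {

    private final Mailer mailer;

    EmailService(Mailer mailer) {
        this.mailer = mailer;
    }

    // The model can choose to invoke this method when the prompt asks it to send an email.
    @Tool("Send the given content by email")
    void sendAnEmail(String content) {
        mailer.send(Mail.withText("you@example.com", "A poem for you", content));
    }
}

// Registering the tool class lets the AI service hand tool calls to EmailService.
@RegisterAiService(tools = EmailService.class)
interface AssistantWithContext {

    @SystemMessage("You are a professional poet.")
    @UserMessage("Write a poem about {topic}. Then send this poem by email.")
    String writeAPoem(String topic);
}
----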
@@ -143,21 +144,41 @@ public class EmailMeAPoemResource {
}
----

-== Adding email service properties to your configuration
+== Modify application.properties to use the email Tools
+
+Tool calling is not supported with the OpenAI `demo` key, so we will need to
+either use a real API key or use a local model that supports tools.
+If you want to use OpenAI's ChatGPT, you can create and fund an account at https://platform.openai.com/[OpenAI] and then set the openai-api-key to your key.
+
+We will use a local (free) open source model served with Ollama instead.
+To do this, you will need to https://ollama.com/download[download and install Ollama].
+Once that's done, you will need to https://ollama.com/search?c=tools[download a model that supports tool calling], such as `granite3.1-dense:2b`. To do so, execute the command:
+
+[#quarkuspdb-dl-ollama]
+[.console-input]
+[source,config,subs="+macros,+attributes"]
+----
+ollama pull granite3.1-dense:2b
+----
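
Not part of this commit, but if you want to check that Ollama is up and serving its OpenAI-compatible API before wiring Quarkus to it, you can list the models it exposes:

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
curl http://localhost:11434/v1/models
----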

Update the following properties in your `application.properties`:

-IMPORTANT: The LangChain4j `demo` key currently does not support tools, so you will need to use a real OpenAI key for the email service to be called by the OpenAI model.
-You can create an account over at https://platform.openai.com/[OpenAI] if you'd like to see this in action.
-Note that OpenAI requires you to fund your account with credits to be able to use the API. The minimum is $5 but this amount will go a long way to test the scenarios in this tutorial.
+NOTE: If you do not want to go through the trouble of creating an OpenAI account or installing Ollama, you can still test the scenario below; it just won't send an email, since the "Tool" functionality unfortunately won't work.

-NOTE: If you do not want to create an OpenAI key, you can still test the below scenario, it just won't send an email since the "Tool" functionality unfortunately won't work.
+Modify the application.properties as below:

[#quarkuspdb-update-props]
[.console-input]
[source,config,subs="+macros,+attributes"]
----
-quarkus.langchain4j.openai.api-key=<YOUR OPENAI KEY>
+# Set the OpenAI API key if you want to use OpenAI
+# quarkus.langchain4j.openai.api-key=demo
+
+# With Ollama
+quarkus.langchain4j.openai.base-url=http://localhost:11434/v1
+# Configure the server to use a specific model
+quarkus.langchain4j.openai.chat-model.model-name=granite3.1-dense:2b
+quarkus.langchain4j.openai.embedding-model.model-name=granite3.1-dense:2b

quarkus.langchain4j.openai.log-requests=true
quarkus.langchain4j.openai.log-responses=true
@@ -166,7 +187,9 @@ quarkus.langchain4j.openai.timeout=60s
%dev.quarkus.mailer.mock=false
----

-Because we haven't configured the local email service, Quarkus will use Dev Services to instantiate and configure a local email service for you (in dev mode only!).
+Make sure your Quarkus Dev mode is still running. It should have reloaded with the new configuration.
+
+Because we haven't configured the local email service, Quarkus will also have started a Dev Service to instantiate and configure a local email service for you (in dev mode only!).

You can check it running:

@@ -200,15 +223,15 @@ You can also run the following command:
curl localhost:8080/email-me-a-poem
----

-An example of output (it can vary on each prompt execution):
+An example of output (it will vary on each prompt execution):

[.console-output]
[source,text]
----
I have composed a poem about Quarkus. I have sent it to you via email. Let me know if you need anything else
----

-If you have a valid OpenAI key configured, you can check the "real" email:
+If you have a tool-calling model configured, you can check your inbox for the actual email:

First, open the http://localhost:8080/q/dev-ui[DevUI, window=_blank] and click on the Mailpit arrow.

documentation/modules/ROOT/pages/20_embed_documents.adoc

Lines changed: 11 additions & 3 deletions
@@ -1,4 +1,5 @@
-= Embedding Documents
+= Embedding Documents and Creating a Chatbot
+:description: Learn how to embed documents and create a chatbot using LangChain4J in Quarkus.

:project-ai-name: quarkus-langchain4j-app

@@ -44,7 +45,14 @@ Add the following properties to your `application.properties` so that it looks l
[.console-input]
[source,config,subs="+macros,+attributes"]
----
-quarkus.langchain4j.openai.api-key=<YOUR OPENAI KEY>
+# Set the OpenAI API key if you want to use OpenAI
+# quarkus.langchain4j.openai.api-key=demo
+
+# With Ollama
+quarkus.langchain4j.openai.base-url=http://localhost:11434/v1
+# Configure the server to use a specific model
+quarkus.langchain4j.openai.chat-model.model-name=granite3.1-dense:2b
+quarkus.langchain4j.openai.embedding-model.model-name=granite3.1-dense:2b

quarkus.langchain4j.openai.log-requests=true
quarkus.langchain4j.openai.log-responses=true
@@ -71,7 +79,7 @@ quarkus.langchain4j.openai.chat-model.model-name=gpt-4o #<4>

== Embedding the business document

-NOTE: If you don't provide an actual OpenAI key you will still be able to go through this exercise but the "Tools" functions won't be called, resulting in unexpected answers.
+NOTE: If you don't provide a model that supports embeddings and tools, you will still be able to go through this exercise, but the "Tools" functions won't be called, resulting in unexpected answers. See the previous "Agents and Tools" chapter for more information.

Let's provide a document containing the service's terms of use:
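
The terms-of-use document and the ingestion code live in the tutorial repository and are not part of this diff. Purely as an orientation sketch (class and method names are assumptions, and it presumes an embedding-store extension and embedding model are configured), document ingestion with LangChain4j typically looks like this:

[source,java]
----
package com.redhat.developers;

import java.nio.file.Path;

import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.loader.FileSystemDocumentLoader;
import dev.langchain4j.data.document.splitter.DocumentSplitters;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingStore;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class DocumentIngestor {

    private final EmbeddingStore<TextSegment> store;
    private final EmbeddingModel embeddingModel;

    // The store and the embedding model beans come from the configured extensions.
    public DocumentIngestor(EmbeddingStore<TextSegment> store, EmbeddingModel embeddingModel) {
        this.store = store;
        this.embeddingModel = embeddingModel;
    }

    public void ingest(Path termsOfUseFile) {
        // Load the document, split it into segments, embed them, and store the vectors.
        Document document = FileSystemDocumentLoader.loadDocument(termsOfUseFile);
        EmbeddingStoreIngestor.builder()
                .embeddingStore(store)
                .embeddingModel(embeddingModel)
                .documentSplitter(DocumentSplitters.recursive(500, 50))
                .build()
                .ingest(document);
    }
}
----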

documentation/modules/ROOT/pages/21_podman_ai.adoc

Lines changed: 3 additions & 3 deletions
@@ -2,15 +2,15 @@

:project-podman-ai-name: quarkus-podman-ai-app

-Throughout this tutorial, we've been working with OpenAI's remote models, however wouldn't it be nice if we could work
-with models on our local machine (without incurring costs)?
+Throughout this tutorial, we've been working with OpenAI's remote models, or Ollama's models on our local machine. However, wouldn't it be nice if we could work
+with models on our local machine (without incurring costs) AND have a nice visualization of what's going on?

Podman Desktop is a GUI tool that helps with running and managing containers on our local machine, but it can also help with running AI models locally thanks to its AI Lab extension. Thanks to Quarkus and LangChain4j, it then becomes trivial to start developing with these models. Let's find out how!


== Installing Podman Desktop AI

-First, if you haven't yet, you must download and install Podman Desktop on your operating system. https://podman-desktop.io/downloads[The instructions can be found here, window="_blank"].
+First, if you haven't yet, download and install Podman Desktop on your operating system. https://podman-desktop.io/downloads[The instructions can be found here, window="_blank"].

NOTE: For Windows/macOS users, if you can, give the Podman machine at least 8GB of memory and 4 CPUs (Generative AI models are resource hungry!). The model can run with fewer resources, but it will be significantly slower.
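
Not shown in this excerpt, but once AI Lab is serving a model, the chapter presumably points the same OpenAI-compatible client configuration at it. As a placeholder sketch only (the host and port below are assumptions; use the actual inference endpoint that Podman AI Lab displays for the running model service):

[.console-input]
[source,properties]
----
# Placeholder URL: copy the endpoint shown by Podman AI Lab for the running model service
quarkus.langchain4j.openai.base-url=http://localhost:35000/v1
# Local models can be slow to respond, so a generous timeout helps
quarkus.langchain4j.openai.timeout=120s
----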
