Commit f9d7131

Create new AI Java

1 parent 073d40d commit f9d7131

2 files changed: +277 -0 lines

////
This document is maintained in the main Quarkus repository
and pull requests should be submitted there:
https://github.com/quarkusio/quarkus/tree/main/docs/src/main/asciidoc
////
[id="getting-started-ai-java-apps"]
= Creating your first AI Java application using LangChain4j
include::_attributes.adoc[]
:quickstart-includes: https://github.com/lauracowen/quarkus-quickstarts/getting-started-ai-java-apps/main
:diataxis-type: tutorial
:categories: getting-started, ai
:extension-status: experimental
:topics: getting-started, ai
////
The document header ends at the first blank line. Do not remove the blank line between the header and the abstract summary.
////

Create a simple RESTful Java application that asks an AI model to write a poem based on a topic provided by the user at an endpoint. The service responds to GET requests made to the `http://localhost:8080/poems/{topic}/{lines}` URL. The user enters the topic and length of the desired poem in the URL; for example, `http://localhost:8080/poems/purple/5` generates a poem such as:

----
In twilight's embrace, a royal hue,
Lavender whispers in the morning dew.
Amethyst dreams weave through the night,
A canvas of wonder, both bold and bright.
Nature's soft secret, a majestic view.
----

You will create a Java class and a Java interface. The class represents a resource which defines the application's endpoint and calls the AI model by using the interface to implement an AI service. The AI service uses the parameter values passed in through the endpoint to compose a prompt, a natural language text request, and sends it to the AI model.

The AI model (a large language model, or LLM) parses the prompt and returns a poem according to the topic and number of lines requested in the prompt. LLMs are AI models that are trained to generate output based on the natural language requests they receive. The input and output of LLMs are usually text, as in this application, but some LLMs specialise in other formats such as images or video.

image::getting-started-ai-java-apps.png[alt=Architecture, align=center]

Much of the work needed to compose the prompt and to connect to the LLM in order to send the request and get a response is handled for you by LangChain4j, an open source library that is available as an extension to Quarkus.

include::{includes}/extension-status.adoc[]

.Prerequisites

:prerequisites-time: 15 minutes
:prerequisites-no-graalvm:
include::{includes}/prerequisites.adoc[]

For the simple application in this guide, we can use a demo key for OpenAI.

== Solution

Follow the instructions in this guide to create the application from scratch.

However, you can first run the completed example to see how it works:

. Download an {quickstarts-archive-url}[archive] or clone the git repository:
+
[source,bash,subs=attributes+]
----
git clone {quickstarts-clone-url}
----
+
. Run the application in dev mode:
+
Quarkus CLI:
+
[source,text]
----
cd getting-started-ai-java-apps
quarkus dev
----
+
Maven:
+
[source,text]
----
cd getting-started-ai-java-apps
./mvnw quarkus:dev
----
+
. Try the application by visiting the endpoint; for example: `http://localhost:8080/poems/purple/5`.

//The solution is located in the `getting-started-ai-java-apps` link:{quickstarts-tree-url}/getting-started-ai-java-apps[directory].

In the terminal, press `q` to stop Quarkus before you continue with this guide.

:sectnums:
:sectnumlevels: 1
== Creating the project

The easiest way to create a new Quarkus project is to open a terminal and run the following command:

:create-app-artifact-id: getting-started-ai-java-apps
:create-app-extensions: rest
:create-app-code:
include::{includes}/devtools/create-app.adoc[]

It generates the following items in `./getting-started-ai-java-apps`:

* the Maven structure
* an `org.acme.GreetingResource` resource exposed on `/hello` (sketched after this list)
* an associated unit test
* a landing page that is accessible on `http://localhost:8080` after starting the application
* example `Dockerfile` files for both `native` and `jvm` modes in `src/main/docker`
* the `application.properties` configuration file
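
For orientation, the generated `GreetingResource` is a plain REST resource similar to the following sketch (the exact greeting text and class body depend on the Quarkus version that generated the project):

[source,java]
----
package org.acme;

import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

// Generated sample resource; the greeting text varies between Quarkus versions.
@Path("/hello")
public class GreetingResource {

    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String hello() {
        return "Hello from Quarkus REST";
    }
}
----
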

Start the application in dev mode by running the following commands:

Quarkus CLI:
[source,text]
----
cd getting-started-ai-java-apps
quarkus dev
----

Maven:
[source,text]
----
cd getting-started-ai-java-apps
./mvnw quarkus:dev
----

To check that everything is working, visit `http://localhost:8080`. The Quarkus welcome page lists the endpoints available in the generated sample application (`/hello`).
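
You can also call the generated endpoint directly from a terminal; for example, the following request returns the plain-text greeting from `GreetingResource` (the exact text depends on the generated code):

[source,bash]
----
curl -w "\n" http://localhost:8080/hello
----
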

Stay in dev mode so that as you add files to the project, your changes are automatically compiled and deployed by Quarkus for easy testing.

[[ai-service]]
== Creating the AI service

The AI service provides an abstraction layer to make it easier to write a Java application that interacts with an LLM.
The AI service composes the prompts to send to the LLM and receives the responses from the LLM.
The application needs only minimal configuration to connect to the LLM because the AI service handles the connection details.
AI services can manage other information too, including chat memory and toolboxes, which will be explored in other guides.

The AI service composes the prompt from two pieces of information in the class that implements the AI service:

- the system message, which provides context to the request that the application sends to the LLM. For example, you can set the role or persona of the LLM and guide the LLM's behavior when responding.
- the user message, which represents the user's latest input to send to the LLM. The user message is usually received, and processed, by the LLM after the system message.

Create the `AiPoemService` interface at `src/main/java/org/acme/AiPoemService.java`:

[source,java,linenums]
----
package org.acme;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService // <1>
public interface AiPoemService {

    @SystemMessage("You are a professional poet. Display the poem in well-formed HTML with line breaks (no markdown).") // <2>
    @UserMessage("Write a poem about {poemTopic}. The poem should be {poemLines} lines long.") // <3>
    String writeAPoem(String poemTopic, int poemLines); // <4>
}
----
<1> Implements the interface as an AI service which can connect to the LLM that is configured in the `resources/application.properties` file.
<2> Instructs the LLM to take the role of a professional poet and to display the generated poem in well-formed HTML with line breaks so that it renders neatly when viewed in a web browser.
<3> Asks the LLM to generate a poem on the topic and of the length that the user has chosen.
The user's choices are passed as parameters from the endpoint, `{topic}` and `{lines}`, to complete the templated user message placeholders, `{poemTopic}` and `{poemLines}`.
<4> Starts an exchange between the application and the AI service.
The AI service composes a prompt including the system message and the user message and sends it to the LLM.
The `writeAPoem()` method is called by the `showMeAPoem()` method in the `Poems` class, passing in the user's chosen topic and length from the endpoint parameters.
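
To make the flow concrete, for a request to `http://localhost:8080/poems/purple/5` the AI service fills the placeholders and sends the LLM a prompt made up of the two messages sketched below (shown informally; LangChain4j handles the actual request format):

----
System message: You are a professional poet. Display the poem in well-formed HTML with line breaks (no markdown).
User message:   Write a poem about purple. The poem should be 5 lines long.
----
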

[[rest-resource]]
== Creating the RESTful resource

The RESTful resource defines the endpoint of your RESTful service. When a GET request is made to the endpoint, the `showMeAPoem()` method runs and calls the `writeAPoem()` method in the AI service to send a request to the LLM.

In this application, the resource class defines the endpoint that receives the user's input (choice of topic and number of lines in the poem) and then passes it to the AI service to include in its request to the LLM.

Create the `Poems` class at `src/main/java/org/acme/Poems.java`:

[source,java,linenums]
----
package org.acme;

import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/poems")
public class Poems {

    @Inject
    AiPoemService aiPoemService; // <1>

    @GET
    @Produces(MediaType.TEXT_HTML)
    @Path("/{topic}/{lines}") // <2>
    public String showMeAPoem(@PathParam("topic") String userTopic, @PathParam("lines") int userLines) { // <3>
        return aiPoemService.writeAPoem(userTopic, userLines); // <4>
    }
}
----
<1> Injects the `AiPoemService` AI service as `aiPoemService`.
<2> Defines the RESTful endpoint that takes the user's input as endpoint parameters, `{topic}` and `{lines}`.
<3> Declares the `showMeAPoem()` method which takes two arguments, `userTopic` and `userLines` (because there is more than one argument, you must explicitly annotate each parameter).
When a GET request is made to the `/poems/{topic}/{lines}` endpoint, the values of the `{topic}` and `{lines}` parameters are passed as the `showMeAPoem()` method's `userTopic` and `userLines` arguments.
<4> Calls the AI service's `writeAPoem()` method with the values received from the endpoint.
Calling the `writeAPoem()` method causes these values to be added to the user message as part of the prompt that the AI service sends to the LLM.
The response from the LLM is then returned to the browser and displayed as HTML.

[[configure]]
== Configuring the application

Connecting to an LLM is greatly simplified by using LangChain4j.
For this application, Quarkus uses the link:https://quarkus.io/extensions/io.quarkiverse.langchain4j/quarkus-langchain4j-openai/[Quarkus LangChain4j OpenAI extension], which is configured in the `pom.xml`.
You then need only set the API key and the base URL properties for the LLM in `src/main/resources/application.properties`; in this case, you can use the value `demo` to get limited demo access to the LLM, which is sufficient for this application:

[source,linenums]
----
quarkus.langchain4j.openai.api-key=demo
quarkus.langchain4j.openai.base-url=http://langchain4j.dev/demo/openai/v1
----
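
For reference, the extension is brought in through a `pom.xml` dependency along the following lines (a sketch; the `quarkus-langchain4j.version` property is a placeholder for the quarkus-langchain4j release you use):

[source,xml]
----
<!-- Quarkus LangChain4j OpenAI extension; version shown as a placeholder property -->
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <version>${quarkus-langchain4j.version}</version>
</dependency>
----
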

[[run]]
== Running the application

If dev mode is not already running, start the application in dev mode now:

Quarkus CLI:
[source,text]
----
quarkus dev
----

Maven:
[source,text]
----
./mvnw quarkus:dev
----

Any changes you make to the application are automatically rebuilt and re-deployed while running in dev mode.

To test the application, request the endpoint, replacing the template placeholders with values of your choice.
For example, request a poem of five lines about purple with the URI `http://localhost:8080/poems/purple/5`.
Because the system message asks for HTML output, the poem should display neatly in a web browser.

Alternatively, run the following curl command:

----
curl -w "\n" http://localhost:8080/poems/purple/5
----

Notice that there is a slight pause while the LLM responds, but then the application returns a short poem on the chosen topic and of the requested length.
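
Because the resource produces `text/html`, the raw curl output contains markup rather than plain text; it looks something like the following sketch (the exact tags and poem differ on every run):

----
<p>In twilight's embrace, a royal hue,<br>
Lavender whispers in the morning dew.<br>
...</p>
----
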

[[prompts]]
== Try alternative prompts without modifying your code

The Quarkus Dev UI (when running in dev mode) provides a chat interface where you can test alternative user messages without modifying your application code.
To use the Dev UI chat interface:

. From the running terminal, press `d` to open the Quarkus Dev UI Extensions page (`http://localhost:8080/q/dev-ui/extensions`) in a browser.
The Extensions page lists all the extensions installed in your running instance of Quarkus.
. In the **LangChain4j Core** tile, click **Chat** to open the Chat interface.
. The **System message** field contains the system message from your application. You can modify the system message if you want to.
. In the **Message** field, type a user message then press **Send**.

The application runs and returns a response based on the system and user messages entered in the chat window.

:sectnums!:
== Where next?

Congratulations! You have created your first AI Java application and run it on Quarkus with LangChain4j.

Next, if you want to learn more about writing AI Java applications with Quarkus and LangChain4j, take a look at the link:https://redhat-developer-demos.github.io/quarkus-tutorial/quarkus-tutorial/17_ai_intro.html[Quarkus and AI tutorial].