wjglerum/quarkus-ai-agent-workshop


Building secure AI agents with Quarkus LangChain4j

In this workshop we will build a simple secure AI agent with Quarkus and LangChain4j. Next, we will explore how to securely integrate it into an application and enable monitoring and logging for production. The workshop is divided into multiple parts; start with the first part and build your way through:

  • Chatbot & Tools (WebSocket UI, REST clients, function-calling)
  • MCP (Model Context Protocol) integration
  • RAG (Retrieval-Augmented Generation) over your own docs
  • Guardrails (input/output validation)
  • Testing (fast, deterministic guardrail tests; optional scoring)
  • Observability for production

Each step is self-contained. If you get stuck, you can always check out the solution in the next part.

Prerequisites

Make sure you have the following installed locally:

Setup

Note

This workshop needs to download a lot of dependencies, so it might take a while on the first run. If you have an unlimited data plan, you can speed up the process by using your mobile connection instead of the shared wireless connection.

Initial clone

First clone the repository, then run the build:

git clone https://github.com/wjglerum/quarkus-ai-agent-workshop.git
cd quarkus-ai-agent-workshop
./mvnw install -DskipTests

Enable LLM

To run this workshop you need access to an LLM. You can either run a model locally or use a hosted provider you have access to.

Ollama

Ollama is a free tool for running models locally. You can explore some popular models from Ollama; there are a lot of great free models available:

We will use llama3.2 for now. Feel free to experiment with other models too, but do watch the download size! Larger models take longer to download, so you might want to pick a smaller one for the workshop. Running large models also requires plenty of free disk space and memory.

You can start a model directly from the app or from the command line:

ollama run llama3.2

Note

Running models locally is great for development, but it is restricted by the specifications of your machine. You can also run a model on CPU only, but performance will be much lower, so expect slow responses.
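Once Ollama is running, Quarkus LangChain4j can be pointed at it through configuration. A minimal sketch, assuming the `quarkus-langchain4j-ollama` extension is on the classpath (property names may differ between extension versions, so verify against the extension docs):

```properties
# Model served by the local Ollama instance
quarkus.langchain4j.ollama.chat-model.model-id=llama3.2
# Default Ollama endpoint; adjust if you run it elsewhere
quarkus.langchain4j.ollama.base-url=http://localhost:11434
# Local models can be slow, so a generous timeout helps
quarkus.langchain4j.ollama.timeout=60s
```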

OpenAI

Note

If you have the option, you can also use OpenAI so you don't need to run a model locally. Unfortunately, the free tier has been discontinued, so you will need a paid plan. Some costs apply, but this workshop should only cost you a few cents to a few dollars. This is not required for the workshop, but feel free to explore.

You can generate an API key on your profile. This is only possible if you have a payment method set up. Next you can run the following command to export the API key:

export OPENAI_API_KEY=<YOUR_API_KEY_HERE>

Warning

Make sure to keep the API key secret; you are responsible for any costs yourself. You can disable auto-recharge to avoid unexpected charges.
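If you go the OpenAI route, the exported key can be wired in through configuration. A sketch, assuming the `quarkus-langchain4j-openai` extension (the model name below is just an example; check the extension docs for the exact property names in your version):

```properties
# Read the API key from the environment variable exported above
quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}
# Optional: pick a specific chat model
quarkus.langchain4j.openai.chat-model.model-name=gpt-4o-mini
```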

Gemini

If you have a Google account, you can also use the free tier from Gemini. You can generate an API key on your profile; no payment method is required. Check out the documentation for more information. Then run the following command to export the API key:

export GEMINI_API_KEY=<YOUR_API_KEY_HERE>
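As with the other providers, the exported key is picked up through configuration. A hedged sketch assuming the Gemini extension of Quarkus LangChain4j (the property namespace here is an assumption; verify it against the extension documentation):

```properties
# Read the API key from the environment variable exported above
# (assumed property name; check the Gemini extension docs)
quarkus.langchain4j.ai.gemini.api-key=${GEMINI_API_KEY}
```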

Getting started

Before diving in, open the Step 0 – Setup folder.
This is your personal workspace — all your code for the workshop will live there.

Further reading

Quarkus is a great framework for writing AI agents and tools. If you want to learn more, you can check out the following resources:

Acknowledgements

This workshop was inspired by the existing Quarkus LangChain4j Workshop and uses examples from the Quarkus website, documentation and blog posts. If you find any issues or have suggestions, please open an issue or a PR.
