This repository was archived by the owner on Mar 27, 2025. It is now read-only.

Commit e6c5c46

Update instructions for Exercises 14, 15, and 16 (#14)
* exercise_014/README.md: Don't give the code for the SSE method
* exercise_014/README.md: minor update to explain best way to get product IDs
* exercise_015/README.md: Remove mention of docker-compose setup for Kafka
* exercise_015/README.md: Remove serialisation/deserialisation config
* exercise_016/README.md: Remove mention of Conduktor and replace with Kafka Dev UI
1 parent 9bd1fd5 commit e6c5c46

File tree

3 files changed

+7
-48
lines changed
  • code
    • exercise_014_Internal_Channels
    • exercise_015_Connecting_to_Kafka
    • exercise_016_Dead_Letter_Queue_and_Stream_filtering


code/exercise_014_Internal_Channels/README.md

Lines changed: 3 additions & 10 deletions

````diff
@@ -10,7 +10,7 @@ We will create a _Generator_ that generates new prices for all our products ever
 
 * Pull in the class `PriceUpdate` by executing this command from the command line: `cmtc pull-template src/main/java/com/lunatech/training/quarkus/PriceUpdate.java <root folder of exercise repo>`. The `PriceUpdate` class represents an updated price for the product with the product id in the class.
 
-* Create the file `PriceUpdateStream.java` with the following template:
+* Create the file `PriceUpdateStreams.java` with the following template:
 
 ```java
 package com.lunatech.training.quarkus;
@@ -37,7 +37,7 @@ public class PriceUpdateStreams {
 }
 ```
 
-* Implement the method `public Multi<PriceUpdate> generate()` on the `PriceUpdateStream` class, and make it return a `Multi` that emits a `PriceUpdate` item for each of the products in our database (You can hardcode it to use product ids 1 to 7) *every five seconds*, using a random price between 0 and 100.
+* Implement the method `public Multi<PriceUpdate> generate()` on the `PriceUpdateStreams` class, and make it return a `Multi` that emits a `PriceUpdate` item *every five seconds*, using a random price between 0 and 100, for each of the products in our database (You can hardcode the product IDs - use `curl localhost:8080/products` to discover the product IDs saved in the database).
 
 Tip, look at the `Multi.createFrom().ticks()` method!
 Note that the `print` method has an `@Incoming` annotation that matches the `@Outgoing` from the `generate` method. Running the application should print seven lines to the console every five seconds, each line being a price update for a product. Run the app to try this :)
@@ -68,14 +68,7 @@ Finally, we will create a `PriceUpdatesResource` class, so we can expose the pri
 Multi<PriceUpdate> priceUpdates;
 
 * `@Channel` is also a Reactive Messaging annotation, and Quarkus will connect this `Multi` to the 'price-updates' channel. This is an alternative method to receive the items in that channel (different from how we did it with an `@Incoming` annotation on the `print` method!)
-* Next, add this method
-
-    @GET
-    @Produces(MediaType.SERVER_SENT_EVENTS)
-    @RestSseElementType(MediaType.APPLICATION_JSON)
-    public Multi<PriceUpdate> prices() {
-        return priceUpdates;
-    }
+* Next, add a method in `PriceUpdatesResource` that will expose the price updates as Server Sent Events. You can use the example in `ListenNotifyResource` from the previous exercise as inspiration
 
 * Now, connect to this endpoint using Curl:
 
````
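As a reference point for the updated instruction, the per-tick generation logic (one random price between 0 and 100 for each product) can be sketched in plain Java. `PriceUpdateSketch`, `nextBatch`, and the `PriceUpdate` record below are illustrative stand-ins, not code from the exercise repo; in the exercise itself the batches would be emitted from `Multi.createFrom().ticks().every(Duration.ofSeconds(5))`.

```java
import java.util.List;
import java.util.concurrent.ThreadLocalRandom;
import java.util.stream.Collectors;

public class PriceUpdateSketch {
    // Stand-in for the PriceUpdate class pulled in via cmtc
    // (assumption: it carries a product id and a price).
    record PriceUpdate(long productId, double price) { }

    // One batch of updates: a random price in [0, 100) for every known product id.
    // In the exercise, each tick of Multi.createFrom().ticks().every(Duration.ofSeconds(5))
    // would be mapped to such a batch.
    static List<PriceUpdate> nextBatch(List<Long> productIds) {
        return productIds.stream()
                .map(id -> new PriceUpdate(id, ThreadLocalRandom.current().nextDouble(0, 100)))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        nextBatch(List.of(1L, 2L, 3L)).forEach(System.out::println);
    }
}
```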

code/exercise_015_Connecting_to_Kafka/README.md

Lines changed: 3 additions & 37 deletions

```diff
@@ -1,11 +1,7 @@
 ## Exercise 15: Kafka
 
-In this exercise, we will connect our price processing components to Kafka. We will add Kafka to our `docker-compose` setup, and connect the reactive messaging components to Kafka using the `smallrye-reactive-messaging-kafka` extension.
+In this exercise, we will connect our price processing components to Kafka. For this we will use the `smallrye-reactive-messaging-kafka` extension to connect our reactive messaging components to Kafka. We will then rely on Dev Services Kafka to automatically start a Kafka broker and to automatically configure the application to find the broker.
 
-Tip: If something fails, you can use [Conductor](https://conduktor.io) to check what’s going on in Kafka.
-
-* Uncomment the 'zookeeper' and 'kafka' services in the `docker-compose.yml`
-* Run `docker-compose up -d`. This will now start Zookeeper and Kafka (next to the still-running Postgres)
 * Add the `quarkus-smallrye-reactive-messaging-kafka` extension to your `pom.xml`
 * Pull in the class `PriceUpdateDeserializer` by executing this command from the command line: `cmtc pull-template src/main/java/com/lunatech/training/quarkus/PriceUpdateDeserializer.java <root folder of exercise repo>`.
 * On the class `PriceUpdateStreams`:
@@ -16,45 +12,15 @@ Tip: If something fails, you can use [Conductor](https://conduktor.io) to check
   - Change the channel name in the `@Channel` annotation to `price-updates-in`
 * Add the following config:
 
-kafka.bootstrap.servers=127.0.0.1:9092
-mp.messaging.outgoing.raw-price-updates-out.connector=smallrye-kafka
 mp.messaging.outgoing.raw-price-updates-out.topic=raw-prices
-mp.messaging.outgoing.raw-price-updates-out.value.serializer=io.quarkus.kafka.client.serialization.ObjectMapperSerializer
-mp.messaging.incoming.raw-price-updates-in.connector=smallrye-kafka
 mp.messaging.incoming.raw-price-updates-in.topic=raw-prices
-mp.messaging.incoming.raw-price-updates-in.value.deserializer=com.lunatech.training.quarkus.PriceUpdateDeserializer
-mp.messaging.outgoing.price-updates-out.connector=smallrye-kafka
 mp.messaging.outgoing.price-updates-out.topic=prices
-mp.messaging.outgoing.price-updates-out.value.serializer=io.quarkus.kafka.client.serialization.ObjectMapperSerializer
-mp.messaging.incoming.price-updates-in.connector=smallrye-kafka
 mp.messaging.incoming.price-updates-in.topic=prices
-mp.messaging.incoming.price-updates-in.value.deserializer=com.lunatech.training.quarkus.PriceUpdateDeserializer
 
 * Execute the cURL command again from the previous exercise:
 
 curl http://localhost:8080/prices
 
 * You should see price updates streaming by again.
-* Check what’s going on in Kafka with Conduktor if you haven’t yet.
-* You can check it without Conduktor. There are two cases:
-  * You have already installed Kafka on your machine and in this case you can type the following commands each in a terminal
-    * `$ kafka-console-consumer --bootstrap-server localhost:9092 --topic price-updates --from-beginning`
-    * `$ kafka-console-consumer --bootstrap-server localhost:9092 --topic raw-price-updates --from-beginning`
-    * `$ kafka-console-consumer --bootstrap-server localhost:9092 --topic price-updates`
-    * `--from-beginning` allows to display from the beginning
-  * They do almost the same thing, we listen to different topics: `price-updates` and `raw-price-updates` and we receive something like this:
-
-{"productId":1,"price":77}
-{"productId":2,"price":83}
-{"productId":3,"price":71}
-{"productId":4,"price":84}
-{"productId":6,"price":36}
-{"productId":7,"price":43}
-
-* You haven't installed Conduktor, Kafka, or Zookeeper on your machine and you certainly don't want to install them: but you have docker since you are using `docker-compose`
-  * Launch the containers: `docker-compose up -d`
-  * You retrieve the name of the kafka container to connect to it (or ID): `docker-compose ps`
-  * You can now connect to the shell of this container and execute commands. This container contains all Kafka configurations: `docker exec -it quarkus-course-kafka sh`
-  * Now you are connected to the container and you can use the same command lines as above but with a bit difference, you have to mention the executable path : bin files are located here `/opt/kafka/bin/`
-  * Don’t forget .sh for extension file
-  * `/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic price-updates --from-beginning`
+* You can check what’s going on in Kafka with the Kafka Dev UI:
+  * http://localhost:8080/q/dev-ui/io.quarkus.quarkus-kafka-client/topics
```

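Taken together, the config lines this commit keeps amount to nothing but channel-to-topic mappings: with Dev Services for Kafka in dev mode, the broker address is supplied automatically, and the connector and serde entries deleted here are likewise auto-configured by Quarkus. A sketch of the resulting `application.properties` fragment (assembled from the surviving diff lines above, not shown as a block in the commit itself):

```properties
# Remaining Kafka config after this commit: only the channel-to-topic mappings.
# Broker address, connector, and (de)serializers are auto-configured in dev mode.
mp.messaging.outgoing.raw-price-updates-out.topic=raw-prices
mp.messaging.incoming.raw-price-updates-in.topic=raw-prices
mp.messaging.outgoing.price-updates-out.topic=prices
mp.messaging.incoming.price-updates-in.topic=prices
```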
code/exercise_016_Dead_Letter_Queue_and_Stream_filtering/README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -10,7 +10,7 @@ In this exercise we will see a method to deal with ‘broken’ messages.
 mp.messaging.incoming.raw-price-updates-in.failure-strategy=dead-letter-queue
 mp.messaging.incoming.raw-price-updates-in.dead-letter-queue.value.serializer=io.quarkus.kafka.client.serialization.ObjectMapperSerializer
 
-* Restart the app, and observe that the stream works again (curl http://localhost:8080/prices), although now most of the times you end up with less than 7 updates per 5 seconds. The failures end up in the topic `dead-letter-topic-raw-price-updates-in`. You can easily inspect it with Conduktor.
+* Restart the app, and observe that the stream works again (curl http://localhost:8080/prices), although now most of the times you end up with less than 7 updates per 5 seconds. The failures end up in the topic `dead-letter-topic-raw-price-updates-in`. You can easily inspect it with the Kafka Dev UI - http://localhost:8080/q/dev-ui/io.quarkus.quarkus-kafka-client/topics
 
 Finally, we want to connect our React frontend to the cool new price-streaming feature. But before we do so, we have to make one more endpoint; that only streams prices for an individual product.
 
```
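The per-product endpoint mentioned in that last context line reduces to filtering the update stream by product id; on the injected Mutiny `Multi<PriceUpdate>` that is a single `filter(...)` call. A plain-Java sketch of the filtering step follows (`PriceFilterSketch`, `forProduct`, and the `PriceUpdate` record are illustrative names, not code from the exercise repo):

```java
import java.util.List;

public class PriceFilterSketch {
    // Stand-in for the PriceUpdate class from the exercises (assumption: id + price).
    record PriceUpdate(long productId, double price) { }

    // Keep only the updates for a single product. On the injected
    // Multi<PriceUpdate> this would be priceUpdates.filter(u -> u.productId() == id),
    // returned from a path-parameterised SSE endpoint.
    static List<PriceUpdate> forProduct(List<PriceUpdate> updates, long productId) {
        return updates.stream()
                .filter(u -> u.productId() == productId)
                .toList();
    }

    public static void main(String[] args) {
        List<PriceUpdate> all = List.of(
                new PriceUpdate(1L, 42.0),
                new PriceUpdate(2L, 13.5),
                new PriceUpdate(1L, 57.25));
        System.out.println(forProduct(all, 1L)); // only product 1's updates
    }
}
```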
