This repository was archived by the owner on Mar 27, 2025. It is now read-only.
Update instructions for Exercises 14, 15, and 16 (#14)
* exercise_014/README.md: Don't give the code for the SSE method
* exercise_014/README.md: minor update to explain best way to get product IDs
* exercise_015/README.md: Remove mention of docker-compose setup for Kafka
* exercise_015/README.md: Remove serialisation/deserialisation config
* exercise_016/README.md: Remove mention of Conduktor and replace with Kafka Dev UI
code/exercise_014_Internal_Channels/README.md
3 additions & 10 deletions
@@ -10,7 +10,7 @@ We will create a _Generator_ that generates new prices for all our products ever
* Pull in the class `PriceUpdate` by executing this command from the command line: `cmtc pull-template src/main/java/com/lunatech/training/quarkus/PriceUpdate.java <root folder of exercise repo>`. The `PriceUpdate` class represents an updated price for the product with the product id in the class.
-* Create the file `PriceUpdateStream.java` with the following template:
+* Create the file `PriceUpdateStreams.java` with the following template:
```java
package com.lunatech.training.quarkus;
@@ -37,7 +37,7 @@ public class PriceUpdateStreams {
}
```
-* Implement the method `public Multi<PriceUpdate> generate()` on the `PriceUpdateStream` class, and make it return a `Multi` that emits a `PriceUpdate` item for each of the products in our database (You can hardcode it to use product ids 1 to 7) *every five seconds*, using a random price between 0 and 100.
+* Implement the method `public Multi<PriceUpdate> generate()` on the `PriceUpdateStreams` class, and make it return a `Multi` that emits a `PriceUpdate` item *every five seconds*, using a random price between 0 and 100, for each of the products in our database (You can hardcode the product IDs - use `curl localhost:8080/products` to discover the product IDs saved in the database).
Tip: look at the `Multi.createFrom().ticks()` method!
Note that the `print` method has an `@Incoming` annotation that matches the `@Outgoing` from the `generate` method. Running the application should print seven lines to the console every five seconds, each line being a price update for a product. Run the app to try this :)
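
To make the expected shape more concrete, here is a minimal sketch of such a generator using `Multi.createFrom().ticks()` as the tip suggests. It is not the official solution: the channel name `price-updates` is taken from later in this README, the `PriceUpdate(Long productId, BigDecimal price)` constructor and the seven hardcoded product IDs are assumptions, and the `javax.*` imports may be `jakarta.*` on newer Quarkus versions. Adapt it to the template you pulled in.

```java
package com.lunatech.training.quarkus;

import io.smallrye.mutiny.Multi;
import org.eclipse.microprofile.reactive.messaging.Outgoing;

import javax.enterprise.context.ApplicationScoped;
import java.math.BigDecimal;
import java.time.Duration;
import java.util.Random;
import java.util.stream.LongStream;

@ApplicationScoped
public class PriceUpdateStreams {

    private final Random random = new Random();

    // Channel name and PriceUpdate constructor are assumptions for this sketch.
    @Outgoing("price-updates")
    public Multi<PriceUpdate> generate() {
        // Tick every five seconds; on each tick emit one PriceUpdate per product id.
        return Multi.createFrom().ticks().every(Duration.ofSeconds(5))
                .onItem().transformToMultiAndConcatenate(tick ->
                        Multi.createFrom().items(
                                LongStream.rangeClosed(1, 7)
                                        .boxed()
                                        .map(id -> new PriceUpdate(id, BigDecimal.valueOf(random.nextInt(100))))));
    }
}
```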
@@ -68,14 +68,7 @@ Finally, we will create a `PriceUpdatesResource` class, so we can expose the pri
Multi<PriceUpdate> priceUpdates;
* `@Channel` is also a Reactive Messaging annotation, and Quarkus will connect this `Multi` to the 'price-updates' channel. This is an alternative method to receive the items in that channel (different from how we did it with an `@Incoming` annotation on the `print` method!)
-* Next, add this method
-
-    @GET
-    @Produces(MediaType.SERVER_SENT_EVENTS)
-    @RestSseElementType(MediaType.APPLICATION_JSON)
-    public Multi<PriceUpdate> prices() {
-        return priceUpdates;
-    }
+* Next, add a method in `PriceUpdatesResource` that will expose the price updates as Server Sent Events. You can use the example in `ListenNotifyResource` from the previous exercise as inspiration.
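
If you want to check your work, the lines removed above can be assembled into a class along these lines. This is a sketch, not necessarily the course solution: the `@Path("prices")` value matches the `curl http://localhost:8080/prices` call used later in the exercise, and the `javax.*` imports may be `jakarta.*` on newer Quarkus versions.

```java
package com.lunatech.training.quarkus;

import io.smallrye.mutiny.Multi;
import org.eclipse.microprofile.reactive.messaging.Channel;
import org.jboss.resteasy.reactive.RestSseElementType;

import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("prices")
public class PriceUpdatesResource {

    // Receives the items published on the 'price-updates' channel as a Multi.
    @Inject
    @Channel("price-updates")
    Multi<PriceUpdate> priceUpdates;

    @GET
    @Produces(MediaType.SERVER_SENT_EVENTS)
    @RestSseElementType(MediaType.APPLICATION_JSON)
    public Multi<PriceUpdate> prices() {
        return priceUpdates;
    }
}
```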
-In this exercise, we will connect our price processing components to Kafka. We will add Kafka to our `docker-compose` setup, and connect the reactive messaging components to Kafka using the `smallrye-reactive-messaging-kafka` extension.
+In this exercise, we will connect our price processing components to Kafka. For this we will use the `smallrye-reactive-messaging-kafka` extension to connect our reactive messaging components to Kafka. We will then rely on Dev Services for Kafka to automatically start a Kafka broker and to automatically configure the application to find the broker.
-Tip: If something fails, you can use [Conductor](https://conduktor.io) to check what’s going on in Kafka.
-
-* Uncomment the 'zookeeper' and 'kafka' services in the `docker-compose.yml`
-* Run `docker-compose up -d`. This will now start Zookeeper and Kafka (next to the still-running Postgres)
* Add the `quarkus-smallrye-reactive-messaging-kafka` extension to your `pom.xml`
* Pull in the class `PriceUpdateDeserializer` by executing this command from the command line: `cmtc pull-template src/main/java/com/lunatech/training/quarkus/PriceUpdateDeserializer.java <root folder of exercise repo>`.
* On the class `PriceUpdateStreams`:
@@ -16,45 +12,15 @@ Tip: If something fails, you can use [Conductor](https://conduktor.io) to check
- Change the channel name in the `@Channel` annotation to `price-updates-in`
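
As a hint for the `PriceUpdateDeserializer` pulled in above: a common Quarkus pattern for JSON payloads is to extend `ObjectMapperDeserializer`. The snippet below is an assumption about what the template roughly looks like, not its actual contents.

```java
package com.lunatech.training.quarkus;

import io.quarkus.kafka.client.serialization.ObjectMapperDeserializer;

// Deserializes the JSON records on the Kafka topic back into PriceUpdate objects.
public class PriceUpdateDeserializer extends ObjectMapperDeserializer<PriceUpdate> {
    public PriceUpdateDeserializer() {
        super(PriceUpdate.class);
    }
}
```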
* `--from-beginning` allows to display from the beginning
-* They do almost the same thing, we listen to different topics: `price-updates` and `raw-price-updates` and we receive something like this:
-
-{"productId":1,"price":77}
-{"productId":2,"price":83}
-{"productId":3,"price":71}
-{"productId":4,"price":84}
-{"productId":6,"price":36}
-{"productId":7,"price":43}
-
-* You haven't installed Conduktor, Kafka, or Zookeeper on your machine and you certainly don't want to install them: but you have docker since you are using `docker-compose`
-* Launch the containers: `docker-compose up -d`
-* You retrieve the name of the kafka container to connect to it (or ID): `docker-compose ps`
-* You can now connect to the shell of this container and execute commands. This container contains all Kafka configurations: `docker exec -it quarkus-course-kafka sh`
-* Now you are connected to the container and you can use the same command lines as above but with a bit difference, you have to mention the executable path: bin files are located here `/opt/kafka/bin/`
-* Restart the app, and observe that the stream works again (curl http://localhost:8080/prices), although now most of the times you end up with less than 7 updates per 5 seconds. The failures end up in the topic `dead-letter-topic-raw-price-updates-in`. You can easily inspect it with Conduktor.
+* Restart the app, and observe that the stream works again (curl http://localhost:8080/prices), although now most of the times you end up with less than 7 updates per 5 seconds. The failures end up in the topic `dead-letter-topic-raw-price-updates-in`. You can easily inspect it with the Kafka Dev UI - http://localhost:8080/q/dev-ui/io.quarkus.quarkus-kafka-client/topics
Finally, we want to connect our React frontend to the cool new price-streaming feature. But before we do so, we have to make one more endpoint that only streams prices for an individual product.
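
A hedged sketch of such a per-product endpoint, written as an extra method on the `PriceUpdatesResource` class sketched earlier: the method name, the path parameter, and the public `productId` field are assumptions, and `javax.ws.rs.PathParam` also needs to be imported.

```java
// Hypothetical addition to the PriceUpdatesResource sketch above.
@GET
@Path("{productId}")
@Produces(MediaType.SERVER_SENT_EVENTS)
@RestSseElementType(MediaType.APPLICATION_JSON)
public Multi<PriceUpdate> pricesForProduct(@PathParam("productId") Long productId) {
    // Only forward the updates whose product id matches the requested one
    // (assumes PriceUpdate exposes a public productId field).
    return priceUpdates.filter(update -> productId.equals(update.productId));
}
```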