// docs/registry/quarkus-registry/README.adoc
For this quick start, you'll use the Quarkus {registry} sample code from the App Services {samples-git-repo}[Guides and Samples^] repository in GitHub.

.Procedure
. On the command line, clone the App Services {samples-git-repo}[Guides and Samples^] repository from GitHub.
+
[source,subs="+attributes"]
----
git clone {samples-git-repo} app-services-guides
----

endif::[]

[role="_abstract"]
To enable your Quarkus applications to access a Kafka instance, configure the connection properties using the Kafka bootstrap server endpoint. To access a {registry} instance, configure the registry endpoint connection property with the Core Registry API value.

Access to the {registry} and Kafka instances is managed using the same service account and SASL/OAUTHBEARER token endpoint. For Quarkus, you can configure all connection properties in the `application.properties` file. The example in this task sets environment variables and then references them in the `application.properties` file.

Quarkus applications use https://github.com/eclipse/microprofile-reactive-messaging[MicroProfile Reactive Messaging^] to produce messages to and consume messages from your Kafka instances in {product-kafka}. For details on configuration options, see the https://quarkus.io/guides/kafka[Apache Kafka Reference Guide^] in the Quarkus documentation.

This Quarkus example application includes producer and consumer processes that serialize and deserialize Kafka messages using a schema stored in {registry}.

.Prerequisites
* You have a service account with write access to Kafka and {registry} instances and have stored your credentials securely (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^] and {base-url}{getting-started-url-registry}[Getting started with {product-long-registry}^]).
* You have the bootstrap server endpoint for the Kafka instance. To get this information, select your Kafka instance in the {service-url-kafka}[{product-kafka} web console^], select the options icon (three vertical dots), and click *Connection*.
* You have the Core Registry API endpoint for the {registry} instance. To get this information, select your {registry} instance in the {service-url-registry}[{product-registry} web console^], select the options icon (three vertical dots), and click *Connection*. Copy the *Core Registry API* endpoint, which is supported by the Apicurio serializer/deserializer (SerDes) used in this example.
* You have the SASL/OAUTHBEARER token endpoint used by the {registry} and Kafka instances. To get this information, select your {registry} instance in the {service-url-registry}[{product-registry} web console^], select the options icon (three vertical dots), and click *Connection*. Copy the *Token endpoint URL* value.

.Procedure
. On the command line, set the following environment variables to use your Kafka and {registry} instances with Quarkus or other applications. Replace values in angle brackets (`< >`) with your own server and credential information.
+
* The `<bootstrap_server>` value is the bootstrap server endpoint for your Kafka instance.
* The `<service_registry_url>` value is the URL for your {registry} instance.
* The `SERVICE_REGISTRY_CORE_PATH` variable is a constant value used to set the API path for {product-registry}.
* The `<oauth_token_endpoint_uri>` value is the SASL/OAUTHBEARER token endpoint.
* The `<client_id>` and `<client_secret>` values are the generated credentials for your service account.
+
.Setting environment variables for server and credentials
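A minimal sketch of this listing follows. Apart from `SERVICE_REGISTRY_CORE_PATH`, which the example treats as a constant, the variable names shown here are illustrative, so use whichever names the example's `application.properties` files actually reference.

[source,shell]
----
# Kafka connection (replace the placeholder with your bootstrap server endpoint)
export KAFKA_HOST=<bootstrap_server>

# Service Registry connection
export SERVICE_REGISTRY_URL=<service_registry_url>
# Constant API path for the Core Registry API (for example, /apis/registry/v2)
export SERVICE_REGISTRY_CORE_PATH=<service_registry_core_path>

# Service account credentials and SASL/OAUTHBEARER token endpoint
export CLIENT_ID=<client_id>
export CLIENT_SECRET=<client_secret>
export OAUTH_TOKEN_ENDPOINT_URI=<oauth_token_endpoint_uri>
----
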
. In the Quarkus example application, review the `/src/main/resources/application.properties` files in the `consumer` and `producer` subfolders to understand how the environment variables you set in the previous step are used. This example uses the `dev` configuration profile in the `application.properties` files.
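+
For example, assuming your current directory is the folder that contains the `consumer` and `producer` subfolders, you can quickly list the properties that reference environment variables by searching for `${` placeholders:
+
[source,shell]
----
grep -n '\${' \
  consumer/src/main/resources/application.properties \
  producer/src/main/resources/application.properties
----
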
ifdef::qs[]
.Verification
endif::[]

== Creating the quotes Kafka topic in {product-kafka}

[role="_abstract"]
The Quarkus application in this quick start uses a Kafka topic called `quotes` to produce and consume messages. In this task, you'll create the `quotes` topic in your Kafka instance.

.Prerequisites
* You have a running Kafka instance in {product-long-kafka}.

.Procedure
. In the {product-kafka} {service-url-kafka}[web console^], select *Kafka Instances* and then click the name of the Kafka instance that you want to add a topic to.
. Select the *Topics* tab.
. Click *Create topic* and follow the guided steps to define the topic details.
+
--
You must specify the following topic properties:

* *Topic name*: For this quick start, enter `quotes` as the topic name.
* *Partitions*: Set the number of partitions for the topic. For this quick start, set the value to `1`.
* *Message retention*: Set the message retention time and size. For this quick start, set the retention time to `A week` and the retention size to `Unlimited`.
* *Replicas*: For this release of {product-kafka}, the replica values are preconfigured. The number of partition replicas for the topic is set to `3`, and the minimum number of follower replicas that must be in sync with a partition leader is set to `2`. For a trial Kafka instance, the number of replicas and the minimum in-sync replica factor are both set to `1`.

After you complete the setup, the new topic appears on the *Topics* page. You can now run the Quarkus application to start producing and consuming messages to and from this topic.
--

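If you prefer to create the topic from the command line and you have the `rhoas` CLI installed and logged in, a command along the following lines should create an equivalent topic. This is only a sketch; flag names can differ between CLI versions, so check `rhoas kafka topic create --help` first.

[source,shell]
----
# Assumes the rhoas CLI context is already set to your Kafka instance
rhoas kafka topic create --name quotes --partitions 1
----
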
.Verification
ifdef::qs[]
* Is the `quotes` topic listed on the *Topics* page?
endif::[]
ifndef::qs[]
* Verify that the `quotes` topic is listed on the *Topics* page.
endif::[]

[role="_abstract"]
After you configure your Quarkus application to connect to Kafka and {registry} instances, and you create the Kafka topic, you can run the Quarkus application to start producing and consuming messages to and from this topic.

The Quarkus application in this quick start consists of the following processes:

* A consumer process that is implemented by the `QuotesResource` class. This class exposes the `/quotes` REST endpoint that streams quotes from the `quotes` topic. This process also has a minimal frontend that uses Server-Sent Events to stream the quotes to a web page.
* A producer process that is implemented by the `QuotesProducer` class. This class produces a new quote with a random value every 5 seconds and publishes it to the `quotes` topic.
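
Once both processes are running (see the procedure that follows), you can also watch the raw quote stream directly instead of opening the web page. This sketch assumes the consumer is listening on local port `8080`, as in the procedure's verification step:

[source,shell]
----
# Stream Server-Sent Events from the consumer's /quotes endpoint
curl -N http://localhost:8080/quotes
----
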
.Prerequisites
* You've configured the Quarkus example application to connect to your Kafka and {registry} instances.
* You've created the Kafka `quotes` topic.
ifndef::qs[]
* You're logged in to the {registry} web console at {service-url-registry}[^].

[source]
----
$ mvn quarkus:dev
----

. After the consumer process is running, in a web browser, go to http://localhost:8080/quotes.html[^] and verify that this process is available.

. Leave the consumer process running, and run the producer process in a different terminal.