
Commit 37a7dc1

Docs: Style updates for Quarkus and Node.js quick starts (#559)
* docs: style updates for quarkus and nodejs quickstarts
* docs: address peer review feedback
* docs: one more peer review update
1 parent 7edcd5f commit 37a7dc1

2 files changed, +46 -48 lines changed


docs/kafka/nodejs-kafka/README.adoc

Lines changed: 21 additions & 27 deletions
@@ -93,7 +93,7 @@ ifndef::community[]
 endif::[]
 * You have a Kafka instance in {product-kafka} and the instance is in the *Ready* state. To learn how to create a Kafka instance, see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}].
 * https://github.com/git-guides/[Git^] is installed.
-* You have an IDE such as https://www.jetbrains.com/idea/download/[IntelliJ IDEA^], https://www.eclipse.org/downloads/[Eclipse^], or https://code.visualstudio.com/Download[VSCode^].
+* You have an IDE such as https://www.jetbrains.com/idea/download/[IntelliJ IDEA^], https://www.eclipse.org/downloads/[Eclipse^], or https://code.visualstudio.com/Download[Visual Studio Code^].
 * https://nodejs.org/en/download/[Node.js 14^] is installed. The https://github.com/blizzard/node-rdkafka[node-rdkafka^] client can't run on later versions.

 NOTE: The example Node.js application in this quick start uses the https://kafka.js.org/[KafkaJS^] client by default. If you want to use the https://github.com/blizzard/node-rdkafka[node-rdkafka^] client, you must install some development tools locally on your computer or use a container runtime such as Podman or Docker to run a specified container image and configure a development environment. To learn more, see the https://github.com/nodeshift-starters/reactive-example/tree/node-rdkafka#node-rdkafka-and-kafkajs[documentation] for the example Node.js application.
@@ -118,12 +118,11 @@ endif::[]
 == Importing the Node.js sample code

 [role="_abstract"]
-For this quick start, you'll use sample code from the _Nodeshift Application Starters_ https://github.com/nodeshift-starters/reactive-example[reactive-example^] repository in GitHub. After you understand the concepts and tasks in this quick start, you can use your own Node.js applications with {product-kafka} in the same way.
+For this quick start, you'll use sample code from the Nodeshift Application Starters https://github.com/nodeshift-starters/reactive-example[reactive-example^] repository in GitHub. After you understand the concepts and tasks in this quick start, you can use your own Node.js applications with {product-long-kafka} in the same way.

 .Procedure
 . On the command line, clone the Nodeshift Application Starters https://github.com/nodeshift-starters/reactive-example[reactive-example^] repository from GitHub.
 +
-.Cloning the reactive-example repository
 [source,subs="+attributes"]
 ----
 $ git clone https://github.com/nodeshift-starters/reactive-example.git
@@ -142,15 +141,15 @@ endif::[]
 To enable your Node.js application to access a Kafka instance, you must configure a connection by specifying the following details:

 * The bootstrap server endpoint for your Kafka instance
-* The generated credentials for your {product-kafka} service account
+* The generated credentials for your {product-long-kafka} service account
 * The Simple Authentication and Security Layer (SASL) mechanism that the client will use to authenticate with the Kafka instance

-In this task, you'll create a new configuration file called `rhoas.env`. In this file, you'll set the required bootstrap server and client credentials as environment variables.
+In this task, you'll create a new configuration file called `rhoas.env`. In the file, you'll set the required bootstrap server and client credentials as environment variables.

 .Prerequisites
 ifndef::qs[]
 * You have the bootstrap server endpoint for your Kafka instance. To get the server endpoint, select your Kafka instance in the {product-kafka} web console, select the options menu (three vertical dots), and click *Connection*.
-* You have the generated credentials for your service account. To regenerate the credentials, use the *Service Accounts* page in the {product-kafka} web console to find your service account and update the credentials.
+* You have the generated credentials for your service account. To reset the credentials, use the https://console.redhat.com/application-services/service-accounts[Service Accounts^] page in the *Application Services* section of the Red Hat Hybrid Cloud Console.
 * You've set the permissions for your service account to access the Kafka instance resources. To verify the current permissions, select your Kafka instance in the {product-kafka} web console and use the *Access* page to find your service account permission settings.
 endif::[]

@@ -168,8 +167,6 @@ RHOAS_CLIENT_ID=__<client_id>__
 RHOAS_CLIENT_SECRET=__<client_secret>__
 KAFKA_SASL_MECHANISM=plain
 ----
-+
-In the preceding example, replace the values in angle brackets (`< >`) with your own bootstrap server and client credential information.
 ifdef::qs[]
 +
 The values are described as follows:
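For reference, a minimal sketch of how a KafkaJS client could consume the `rhoas.env` values once they are exported into the environment. The `RHOAS_CLIENT_ID`, `RHOAS_CLIENT_SECRET`, and `KAFKA_SASL_MECHANISM` names come from the file shown in the hunk above; the bootstrap server variable name (`KAFKA_HOST` here) and the client ID string are assumptions for illustration, not part of the sample repository.

[source,javascript]
----
// Sketch only: builds a KafkaJS client from the rhoas.env environment variables.
// KAFKA_HOST is an assumed name for the bootstrap server variable.
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'nodejs-quickstart',                 // illustrative client ID
  brokers: [process.env.KAFKA_HOST],             // bootstrap server endpoint (assumed variable name)
  ssl: true,                                     // SASL credentials are sent over TLS
  sasl: {
    mechanism: process.env.KAFKA_SASL_MECHANISM, // 'plain' in rhoas.env
    username: process.env.RHOAS_CLIENT_ID,       // service account client ID
    password: process.env.RHOAS_CLIENT_SECRET,   // service account client secret
  },
});

module.exports = { kafka };
----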
@@ -197,29 +194,32 @@ endif::[]
 The Node.js application in this quick start uses a Kafka topic called `countries` to produce and consume messages. In this task, you'll create the topic in your Kafka instance.

 .Prerequisites
-* You've created a Kafka instance in {product-kafka} and the instance is in the *Ready* state.
+* You've created a Kafka instance in {product-long-kafka} and the instance is in the *Ready* state.

 .Procedure
-. In the {product-kafka} web console, go to *Streams for Apache Kafka* > *Kafka Instances* and click the name of the Kafka instance that you want to add a topic to.
-. Click *Create topic* and follow the guided steps to define the topic details. Click *Next* to complete each step and click *Finish* to complete the setup.
+. In the {product-kafka} web console, select *Kafka Instances* and then click the name of the Kafka instance that you want to add a topic to.
+. Select the *Topics* tab.
+. Click *Create topic* and follow the guided steps to define the topic details, as shown in the figure.
 +
 [.screencapture]
 .Guided steps to define topic details
 image::sak-create-countries-topic.png[Image of wizard to create a topic]
++
+You must specify the following topic properties:

 * *Topic name*: Enter `countries` as the topic name.
 * *Partitions*: Set the number of partitions for this topic. This example sets the partition to `1` for a single partition. Partitions are distinct lists of messages within a topic and enable parts of a topic to be distributed over multiple brokers in the cluster. A topic can contain one or more partitions, enabling producer and consumer loads to be scaled.
 * *Message retention*: Set the message retention time and size to the relevant value and increment. This example sets the retention time to `7 days` and the retention size to `Unlimited`. Message retention time is the amount of time that messages are retained in a topic before they are deleted or compacted, depending on the cleanup policy. Retention size is the maximum total size of all log segments in a partition before they are deleted or compacted.
 * *Replicas*: For this release of {product-kafka}, the replicas are preconfigured. The number of partition replicas for the topic is set to `3` and the minimum number of follower replicas that must be in sync with a partition leader is set to `2`. Replicas are copies of partitions in a topic. Partition replicas are distributed over multiple brokers in the cluster to ensure topic availability if a broker fails. When a follower replica is in sync with a partition leader, the follower replica can become the new partition leader if needed.
 +
-After you complete the topic setup, the new Kafka topic is listed in the topics table for your Kafka instance. You can now run the Node.js application to start producing and consuming messages.
+After you complete the setup, the new topic appears on the *Topics* page. You can now run the Node.js application to start producing and consuming messages.

 .Verification
 ifdef::qs[]
-* Is the `countries` topic listed in the topics table?
+* Does the `countries` topic appear on the *Topics* page?
 endif::[]
 ifndef::qs[]
-* Verify that the `countries` topic is listed in the topics table.
+* Verify that the `countries` topic appears on the *Topics* page.
 endif::[]

 [id="proc-running-nodejs-example-application_{context}"]
@@ -230,17 +230,16 @@ After you configure your Node.js application to connect to a Kafka instance, and

 In this task, you'll run the following components of the Node.js application:

-* A `producer-backend` component that generates random country names and sends these names to the Kafka topic.
-* A `consumer-backend` component that consumes the country names from the Kafka topic.
+* A `producer-backend` component that generates random country names and sends these names to the Kafka topic
+* A `consumer-backend` component that consumes the country names from the Kafka topic

 .Prerequisites
 * You've configured the Node.js example application to connect to a Kafka instance.
-* You've created the `countries` Kafka topic.
+* You've created the `countries` topic.

 .Procedure
 . On the command line, navigate to the `reactive-example` directory of the repository that you cloned.
 +
-.Navigating to the reactive-example directory
 [source]
 ----
 $ cd reactive-example
@@ -257,21 +256,19 @@ $ npm install

 . Run the consumer component.
 +
-.Running the consumer component
 [source]
 ----
 $ node consumer.js
 ----
 +
-You should see the Node.js application run and connect to the Kafka instance. However, because you haven't yet run the producer component, the consumer has no country names to display.
+You see the Node.js application run and connect to the Kafka instance. However, because you haven't yet run the producer component, the consumer has no country names to display.
 +
 If the application fails to run, review the error log in the command-line window and address any problems. Also, review the steps in this quick start to ensure that the application and Kafka topic are configured correctly.

 . Open a second command-line window or tab.

 . On the second command line, navigate to the `reactive-example` directory of the repository that you cloned.
 +
-.Navigating to the reactive-example directory
 [source]
 ----
 $ cd reactive-example
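The repository's `consumer-backend` has its own implementation; as a minimal sketch of what a KafkaJS consumer of the `countries` topic looks like (the group ID and the module path are assumptions):

[source,javascript]
----
// Sketch only, not the repository code: reads country names from the
// `countries` topic and prints each one as it arrives.
const { kafka } = require('./kafka-connection'); // assumed module exporting the client above

const consumer = kafka.consumer({ groupId: 'countries-consumer' }); // assumed group ID

async function run() {
  await consumer.connect();
  await consumer.subscribe({ topics: ['countries'], fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ message }) => {
      console.log(message.value.toString());
    },
  });
}

run().catch(console.error);
----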
@@ -288,18 +285,16 @@ $ npm install

 . Run the producer component.
 +
-.Running the producer component
 [source]
 ----
 $ node producer.js
 ----
 +
-You should see output like that shown in the example.
+When the producer component runs, you see output like that shown in the following example:
 +
 .Example output from the producer component
 [source]
 ----
-$ node producer.js
 Ghana
 Réunion
 Guatemala
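Similarly, a minimal sketch of the producing side: it sends a random country name to the `countries` topic on an interval and logs it, as in the example output (the country list and one-second interval are placeholders, not the repository's actual data):

[source,javascript]
----
// Sketch only, not the repository code: produces a random country name
// to the `countries` topic every second and logs it.
const { kafka } = require('./kafka-connection'); // assumed module exporting the client above

const producer = kafka.producer();
const countries = ['Ghana', 'Réunion', 'Guatemala', 'Haiti']; // placeholder list

async function run() {
  await producer.connect();
  setInterval(async () => {
    const name = countries[Math.floor(Math.random() * countries.length)];
    console.log(name);
    await producer.send({ topic: 'countries', messages: [{ value: name }] });
  }, 1000);
}

run().catch(console.error);
----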
@@ -315,12 +310,11 @@ As shown in the example, the producer component runs and generates messages that

 . Switch back to the first command-line window.
 +
-You should now see that the consumer component displays the same country names generated by the producer, and in the same order, as shown in the example.
+You now see that the consumer component displays the same country names generated by the producer, and in the same order, as shown in the following example:
 +
 .Example output from the consumer component
 [source]
 ----
-$ node consumer.js
 Ghana
 Réunion
 Guatemala
@@ -334,7 +328,7 @@ Haiti
 +
 The output from both components confirms that they successfully connected to the Kafka instance. The components are using the Kafka topic that you created to produce and consume messages.
 +
-NOTE: You can also use the {product-long-kafka} web console to browse messages in the Kafka topic. For more information, see {base-url}{message-browsing-url-kafka}[_Browsing messages in the {product-long-kafka} web console_^].
+NOTE: You can also use the {product-long-kafka} web console to browse messages in the Kafka topic. For more information, see {base-url}{message-browsing-url-kafka}[Browsing messages in the {product-long-kafka} web console^].

 . In your IDE, in the `producer-backend` directory of the repository that you cloned, open the `producer.js` file.
 +

0 commit comments