
Commit 57cafc2

docs: JR-486: Updates quick start prerequisites to include links (#598)
* docs: JR-486: Adds link in API Designer quick start prerequisites
* docs: JR-486: Adds references to associated services in Service Registry quick start prerequisites
* docs: JR-486: Adds links in Using Quarkus with Service Registry quick start prerequisites
* docs: JR-486: Adds links in Using Quarkus with Kafka quick start prerequisites
* docs: JR-486: Adds links in Configuring Kafkacat with Kafka quick start prerequisites
* docs: JR-486: Adds links in Producing and Consuming Messages quick start prerequisites
* docs: JR-486: Adds links in Configuring Kafka Scripts quick start prerequisites
* docs: JR-486: Adds links in Using Node.js with Kafka quick start prerequisites
* docs: JR-486: Adds command-line terminal prerequisite to README.adoc
* docs: JR-486: Updates the prerequisites for the 'producing and consuming messages' doc
* docs: JR-486: Updates the prerequisites for the 'producing and consuming messages' README.adoc
* docs: JR-486: Removes redundant text from the quick-start prerequisites
* docs: JR-486: Adds command-line terminal prerequisite to README.adoc
* docs: JR-486: Updates README.adoc prerequisites to (a) standardise references to running Kafka instance, and (b) add link to doc
* docs: JR-486: Updates README.adoc prerequisites to standardise references to Red Hat account using an attribute
* docs: JR-486: Updates README.adoc to remove redundant prerequisite
* docs: JR-486: Removes references to Quarkus version number
* docs: JR-486: Implements reviewer feedback
* docs: JR-486: Updates references to later Maven 3 versions
* docs: JR-486: Removes '.x' from Node.js version
1 parent 9999c0e commit 57cafc2

27 files changed, +86 -70 lines changed

docs/api-designer/getting-started-api-designer/quickstart.yml

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ spec:
data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz48c3ZnIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIgdmlld0JveD0iMCAwIDM4IDM4Ij48ZGVmcz48c3R5bGU+LmR7c3Ryb2tlOiMwMDA7fS5kLC5le2ZpbGw6bm9uZTtzdHJva2UtbGluZWNhcDpyb3VuZDtzdHJva2UtbGluZWpvaW46cm91bmQ7c3Ryb2tlLXdpZHRoOjEuMjVweDt9LmV7c3Ryb2tlOnJlZDt9LmZ7ZmlsbDojZmZmO308L3N0eWxlPjwvZGVmcz48ZyBpZD0iYSI+PGc+PHJlY3QgY2xhc3M9ImYiIHg9IjEuNjIiIHk9IjEuODEiIHdpZHRoPSIzNC43NSIgaGVpZ2h0PSIzNC43NSIgcng9IjguMzgiIHJ5PSI4LjM4Ii8+PHBhdGggZD0iTTI4LDIuNDNjNC4yNywwLDcuNzUsMy40OCw3Ljc1LDcuNzVWMjguMThjMCw0LjI3LTMuNDgsNy43NS03Ljc1LDcuNzVIMTBjLTQuMjcsMC03Ljc1LTMuNDgtNy43NS03Ljc1VjEwLjE4YzAtNC4yNywzLjQ4LTcuNzUsNy43NS03Ljc1SDI4bTAtMS4yNUgxMEM1LjAzLDEuMTgsMSw1LjIxLDEsMTAuMThWMjguMThjMCw0Ljk3LDQuMDMsOSw5LDlIMjhjNC45NywwLDktNC4wMyw5LTlWMTAuMThjMC00Ljk3LTQuMDMtOS05LTloMFoiLz48L2c+PC9nPjxnIGlkPSJiIj48Zz48cG9seWxpbmUgY2xhc3M9ImUiIHBvaW50cz0iMjAgMjcuMTUgMjQuMjggMjIuMjUgMjguNjMgMTcuMTcgMjcuODUgMTYuNDkgMjcuMDcgMTUuODIgMjIuNzEgMjAuOTEgMTguMzUgMjUuOTkiLz48cGF0aCBkPSJNMjIuODksMTMuOTdjLjEyLDAsLjI1LS4wNSwuMzQtLjE0bDEuNDEtMS40MWguOTNjLjU3LDIuMyw0LjAxLDEuOTEsNC4wMi0uNDktLjAxLTIuNC0zLjQ2LTIuNzktNC4wMi0uNDloLTEuMTRjLS4xMywwLS4yNSwuMDUtLjM0LC4xNGwtMS41NiwxLjU2Yy0uMzEsLjI5LS4wNywuODUsLjM0LC44M1ptNC42Ny0zLjExYzEuNDEsLjAyLDEuNDEsMi4xMiwwLDIuMTQtMS40MS0uMDItMS40MS0yLjEyLDAtMi4xNFoiLz48cGF0aCBkPSJNMTAuNDMsMTMuOTdjLjk2LDAsMS43Ni0uNjcsMS45OC0xLjU2aC45M2wxLjQxLDEuNDFjLjQ1LC40NSwxLjE0LS4yNCwuNjktLjY5bC0xLjU2LTEuNTZjLS4wOS0uMDktLjIyLS4xNC0uMzQtLjE0aC0xLjE0Yy0uNTctMi4zLTQuMDEtMS45MS00LjAyLC40OSwwLDEuMTMsLjkyLDIuMDQsMi4wNCwyLjA0Wm0wLTMuMTFjMS40MSwuMDIsMS40MSwyLjEyLDAsMi4xNC0xLjQxLS4wMi0xLjQxLTIuMTIsMC0yLjE0WiIvPjxwYXRoIGQ9Ik04Ljg4LDIwLjJoMy4xMWMuMjcsMCwuNDktLjIyLC40OS0uNDl2LTEuMDdoMS44NWMuNjQsMCwuNjQtLjk3LDAtLjk3aC0xLjg1di0xLjA3YzAtLjI3LS4yMi0uNDktLjQ5LS40OWgtMy4xMWMtLjI3LDAtLjQ5LC4yMi0uNDksLjQ5djMuMTFjMCwuMjcsLjIyLC40OSwuNDksLjQ5Wm0uNDktMy4xMWgyLjE0djIuMTRoLTIuMTR2LTIuMTRaIi8+PHBhdGggZD0iTTE0Ljc2LDIyLjQ4bC0xLjQxLDEuNDFoLS45M2MtLjU3LTIuMy00LjAxLTEuOT
EtNC4wMiwuNDksLjAxLDIuNCwzLjQ2LDIuNzksNC4wMiwuNDloMS4xNGMuMTMsMCwuMjUtLjA1LC4zNC0uMTRsMS41Ni0xLjU2Yy40NS0uNDQtLjI1LTEuMTQtLjY5LS42OVptLTQuMzMsMi45N2MtMS40MS0uMDItMS40MS0yLjEyLDAtMi4xNCwxLjQxLC4wMiwxLjQxLDIuMTIsMCwyLjE0WiIvPjxwb2x5bGluZSBjbGFzcz0iZCIgcG9pbnRzPSIxOSAyMS4xNSAxNy41IDIxLjE1IDE2IDIxLjE1IDE2IDE5LjY1IDE2IDE4LjE1IDE2IDE2LjY1IDE2IDE1LjE1IDE3LjUgMTUuMTUgMTkgMTUuMTUgMjAuNSAxNS4xNSAyMiAxNS4xNSAyMiAxNi42NSAyMiAxOC4xNSIvPjxwb2x5bGluZSBjbGFzcz0iZSIgcG9pbnRzPSIxOC4yMyAyNi4xNSAxOC4xMiAyNy4xMyAxOCAyOC4xIDE4LjkgMjcuNzEgMTkuOCAyNy4zMyIvPjwvZz48L2c+PGcgaWQ9ImMiLz48L3N2Zz4=
 description: !snippet README.adoc#description
 prerequisites:
-- A running Service Registry instance (required only if you want to export to Service Registry, see the Getting started with Service Registry quick start)
+- A running Service Registry instance (required only if you want to export to Service Registry, see <a href="https://console.redhat.com/application-services/learning-resources?quickstart=getting-started-service-registry">Getting started with Service Registry</a>)
 introduction: !snippet README.adoc#introduction
 tasks:
 - !snippet/proc README.adoc#proc-creating-api-design

docs/connectors/rhoas-cli-getting-started-connectors/README.adoc

Lines changed: 19 additions & 19 deletions
@@ -103,7 +103,7 @@ Use this guide to complete the following tasks:
 
 .Prerequisites
 
-* You have a Red Hat account.
+* You have a {org-name} account.
 * You've installed the latest version of the `rhoas` CLI. See {base-url}{installation-guide-url-cli}[Installing and configuring the rhoas CLI^].
 * You've completed the following tasks:
 +
@@ -113,42 +113,42 @@ Use this guide to complete the following tasks:
 [source,subs="+quotes"]
 +
 ----
-$ rhoas kafka create --name=my-kafka-instance
+$ rhoas kafka create --name=my-kafka-instance
 ----
 
 ** Verify that the Kafka instance is in the *Ready* state.
 +
 [source,subs="+quotes"]
 ----
-$ rhoas context status kafka
+$ rhoas context status kafka
 ----
 
 ** Create a service account and copy the service account ID and secret. You must use a service account to connect and authenticate your {product-connectors} instances with your Kafka instance.
 +
 [source,subs="+quotes"]
 ----
-$ rhoas service-account create --file-format=json --short-description=test-service-account
+$ rhoas service-account create --file-format=json --short-description=test-service-account
 ----
 
 ** For your Kafka instance, set the permissions for the service account to enable {connectors} instances (that are configured with the service account credentials) to produce and consume messages in any topic in the Kafka instance.
 +
 [source,subs="+quotes"]
 ----
-$ rhoas kafka acl grant-access --producer --consumer --service-account=<service-acct-id> --topic all --group all
+$ rhoas kafka acl grant-access --producer --consumer --service-account=<service-acct-id> --topic all --group all
 ----
 
 ** Create a Kafka topic named `test-topic`. The Kafka topic stores messages sent by producers (data sources) and makes them available to consumers (data sinks).
 +
 [source,subs="+quotes"]
 ----
-$ rhoas kafka topic create --name=test-topic
+$ rhoas kafka topic create --name=test-topic
 ----
 
 [id="proc-create-connector-namespace_{context}"]
 == Creating a namespace to host your {connectors} instances
 [role="_abstract"]
 
-A {connectors} namespace hosts your {connectors} instances.
+A {connectors} namespace hosts your {connectors} instances.
 
 The namespace that you use depends on your OpenShift Dedicated environment.
 
@@ -187,11 +187,11 @@ $ rhoas connector namespace list
 == Building connector configuration files
 
 [role="_abstract"]
-Before you can create a {connectors} instance, you must build a configuration file that is based on a supported connector type that is listed in the {product-connectors} catalog.
+Before you can create a {connectors} instance, you must build a configuration file that is based on a supported connector type that is listed in the {product-connectors} catalog.
 
 For this example, you want to create two types of connectors: a data generator (a source connector) and an HTTP sink connector.
 
-You must build a configuration file for each connector type that you want to create. When you build a configuration file, the default file name is `connector.json`. Optionally, you can specify a different configuration file name.
+You must build a configuration file for each connector type that you want to create. When you build a configuration file, the default file name is `connector.json`. Optionally, you can specify a different configuration file name.
 
 .Prerequisites
 
@@ -220,9 +220,9 @@ $ rhoas connector type list --limit=100
 // +
 // [source,subs="+quotes"]
 // ----
-// rhoas connector type list --limit=70 --search=%sink%
+// rhoas connector type list --limit=70 --search=%sink%
 // ----
-//
+//
 // .. Filter the list to show only source connectors:
 // +
 // [source,subs="+quotes"]
@@ -300,11 +300,11 @@ $ rhoas connector build --name=test-http --type=http_sink_0.1 --output-file=test
 
 .. For *Format*, press *ENTER* to accept the default (`application/octet-stream`).
 
-.. For *Error handling method*, select `stop`.
+.. For *Error handling method*, select `stop`.
 
 .. For *Method*, accept the default (`POST`).
 
-.. For *URL*, paste your unique URL that you copied from the https://webhook.site[Webhook.site^] page.
+.. For *URL*, paste your unique URL that you copied from the https://webhook.site[Webhook.site^] page.
 
 .. For *Topic Names*, type `test-topic`.
 
@@ -343,7 +343,7 @@ For this example, you create two {connectors} instances: a data generator source
 +
 [source,subs="+quotes"]
 ----
-$ rhoas connector create --file=test-generator.json
+$ rhoas connector create --file=test-generator.json
 ----
 
 . Answer the prompts for details about the {connectors} instance.
@@ -353,10 +353,10 @@ $ rhoas connector create --file=test-generator.json
 .. For *Service Account Client ID*, type or paste your ID.
 
 .. For *Service Account Client Secret*, type or paste your secret.
-+
++
 A message states "Successfully created the {connectors} instance".
 
-. Wait until the status of the {connectors} instance is *Ready*.
+. Wait until the status of the {connectors} instance is *Ready*.
 +
 To check the status:
 +
@@ -376,7 +376,7 @@ $ rhoas kafka topic consume --name=test-topic --partition=0 --wait
 +
 [source,subs="+quotes"]
 ----
-$ rhoas connector create --file=test-http.json
+$ rhoas connector create --file=test-http.json
 ----
 
 . Answer the prompts for details about the {connectors} instance.
@@ -389,7 +389,7 @@ $ rhoas connector create --file=test-http.json
 +
 A message states "Successfully created the {connectors} instance".
 
-. Wait until the status of the {connectors} instance is *Ready*.
+. Wait until the status of the {connectors} instance is *Ready*.
 +
 To check the status:
 +
@@ -417,4 +417,4 @@ For more information about the `rhoas connector` commands that you can use to ma
 * {base-url-cli}{command-ref-url-cli}[_CLI command reference (rhoas)_^]
 
 ifdef::parent-context[:context: {parent-context}]
-ifndef::parent-context[:!context:]
+ifndef::parent-context[:!context:]
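The removed and re-added command lines in this diff differ only in trailing whitespace; taken together, the prerequisite commands above form a fixed sequence (create an instance, check its status, create a service account, grant ACLs, create a topic). A minimal sketch of that sequence, shown as a dry run that only prints each `rhoas` command instead of executing it — the instance name, description, and `<service-acct-id>` placeholder are the examples from the diff, and you would remove the `echo` wrapper to run the steps for real:

```shell
#!/bin/sh
# Dry-run sketch of the prerequisite rhoas steps from the README above.
# run() only prints the command; drop the echo to execute for real.
run() { echo "$@"; }

run rhoas kafka create --name=my-kafka-instance
run rhoas context status kafka
run rhoas service-account create --file-format=json --short-description=test-service-account
run rhoas kafka acl grant-access --producer --consumer --service-account='<service-acct-id>' --topic all --group all
run rhoas kafka topic create --name=test-topic
```

Quoting `<service-acct-id>` matters in a real shell: unquoted angle brackets would be parsed as redirections.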

docs/kafka/access-mgmt-kafka/README.adoc

Lines changed: 1 addition & 1 deletion
@@ -243,7 +243,7 @@ https://kafka.apache.org/documentation/#security_authz_primitives[Authorization
 In {product-long-kafka}, you can create Access Control Lists (ACLs) in your Kafka instances and set permissions for how other user accounts or service accounts can interact with an instance and its resources. You can manage access for only the Kafka instances that you create or for the instances that the owner has enabled you to access and alter.
 
 .Prerequisites
-* You've created a Kafka instance and the instance is in the *Ready* state.
+* You have a running Kafka instance in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * The user account or service account that you're setting permissions for has been created in the organization.
 
 .Procedure

docs/kafka/consumer-configuration-kafka/README.adoc

Lines changed: 1 addition & 1 deletion
@@ -256,7 +256,7 @@ If you're using Kafka scripts, you can use the `kafka-consumer-groups.sh` tool t
 
 
 .Prerequisites
-* You've created a Kafka instance with at least one Kafka topic in {product-kafka}.
+* You have a running Kafka instance with at least one Kafka topic in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * Consumer client applications connected to the Kafka instance have a consumer group ID.
 
 .Procedure

docs/kafka/getting-started-kafka/README.adoc

Lines changed: 1 addition & 1 deletion
@@ -96,7 +96,7 @@ As a developer of applications and services, you can use {product-long-kafka} to
 
 ifndef::community[]
 .Prerequisites
-* You have a Red Hat account.
+* You have a {org-name} account.
 //* You have a subscription to {product-long-kafka}. For more information about signing up, see *<@SME: Where to link?>*.
 endif::[]
 

docs/kafka/kafka-bin-scripts-kafka/README.adoc

Lines changed: 3 additions & 2 deletions
@@ -102,7 +102,8 @@ With the scripts, you can produce and consume messages using your Kafka instance
 NOTE: The command examples in this quick start show how to use the Kafka scripts on Linux and macOS. If you're using Windows, use the Windows versions of the scripts. For example, instead of the `__<Kafka-distribution-dir>__/bin/kafka-console-producer.sh` script, use the `__<Kafka-distribution-dir>__\bin\windows\kafka-console-producer.bat` script.
 
 .Prerequisites
-* You have a running Kafka instance in {product-kafka}.
+* You have a running Kafka instance in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
+* You have a command-line terminal application.
 * https://adoptopenjdk.net/[JDK^] 11 or later is installed. (The latest LTS version of OpenJDK is recommended.)
 * You've downloaded the latest supported binary version of the https://kafka.apache.org/downloads[Apache Kafka distribution^]. You can check your Kafka version using the following command:
 +
@@ -169,7 +170,7 @@ In this task, you use the `kafka-console-producer` script to produce messages to
 
 .Prerequisites
 
-* You have a running Kafka instance in {product-long-kafka}.
+* You have a running Kafka instance in {product-long-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * You have the bootstrap server endpoint for your Kafka instance. To get the server endpoint, select your Kafka instance in the {product-long-kafka} web console, select the options icon (three vertical dots), and click *Connection*.
 * You've created the `{property-file-name}` file to store your service account credentials.
 
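The second hunk's last prerequisite refers to a `{property-file-name}` file that stores service account credentials. As a hedged illustration of what creating such a client properties file can look like — the file name `app-services.properties`, the SASL/PLAIN mechanism, and the placeholder credentials are assumptions for this sketch, not taken from the diff; check the quick start itself for the exact file name and contents:

```shell
#!/bin/sh
# Hypothetical sketch: write a Kafka client properties file that holds
# service account credentials. File name, SASL mechanism, and values are
# placeholders; substitute your own service account ID and secret.
CLIENT_ID='<client_id>'          # assumption: your service account ID
CLIENT_SECRET='<client_secret>'  # assumption: your service account secret

cat > app-services.properties <<EOF
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="${CLIENT_ID}" password="${CLIENT_SECRET}";
EOF
```

The Kafka scripts would then reference this file with their `--producer.config` / `--consumer.config` options.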
docs/kafka/kafka-bin-scripts-kafka/quickstart.yml

Lines changed: 2 additions & 2 deletions
@@ -15,10 +15,10 @@ spec:
data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4KPCEtLSBHZW5lcmF0b3I6IEFkb2JlIElsbHVzdHJhdG9yIDI1LjIuMCwgU1ZHIEV4cG9ydCBQbHVnLUluIC4gU1ZHIFZlcnNpb246IDYuMDAgQnVpbGQgMCkgIC0tPgo8c3ZnIHZlcnNpb249IjEuMSIgaWQ9IkxheWVyXzEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIgeG1sbnM6eGxpbms9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkveGxpbmsiIHg9IjBweCIgeT0iMHB4IgoJIHZpZXdCb3g9IjAgMCAzNyAzNyIgc3R5bGU9ImVuYWJsZS1iYWNrZ3JvdW5kOm5ldyAwIDAgMzcgMzc7IiB4bWw6c3BhY2U9InByZXNlcnZlIj4KPHN0eWxlIHR5cGU9InRleHQvY3NzIj4KCS5zdDB7ZmlsbDojRUUwMDAwO30KCS5zdDF7ZmlsbDojRkZGRkZGO30KPC9zdHlsZT4KPGc+Cgk8cGF0aCBkPSJNMjcuNSwwLjVoLTE4Yy00Ljk3LDAtOSw0LjAzLTksOXYxOGMwLDQuOTcsNC4wMyw5LDksOWgxOGM0Ljk3LDAsOS00LjAzLDktOXYtMThDMzYuNSw0LjUzLDMyLjQ3LDAuNSwyNy41LDAuNUwyNy41LDAuNXoiCgkJLz4KCTxwYXRoIGNsYXNzPSJzdDAiIGQ9Ik0xNi41LDE4LjEyYy0xLjcyLDAtMy4xMi0xLjQtMy4xMi0zLjEyczEuNC0zLjEyLDMuMTItMy4xMnMzLjEyLDEuNCwzLjEyLDMuMTJTMTguMjIsMTguMTIsMTYuNSwxOC4xMnoKCQkgTTE2LjUsMTMuMTJjLTEuMDMsMC0xLjg4LDAuODQtMS44OCwxLjg4czAuODQsMS44OCwxLjg4LDEuODhzMS44OC0wLjg0LDEuODgtMS44OFMxNy41MywxMy4xMiwxNi41LDEzLjEyeiIvPgoJPHBhdGggY2xhc3M9InN0MSIgZD0iTTEyLjk0LDExLjA2bC0yLTJjLTAuMDgtMC4wOC0wLjE4LTAuMTMtMC4yOS0wLjE1Yy0wLjAzLTAuMDEtMC4wNS0wLjAxLTAuMDctMC4wMQoJCWMtMC4xMS0wLjAxLTAuMjItMC4wMS0wLjMyLDAuMDNjMCwwLDAsMCwwLDBjLTAuMDcsMC4wMy0wLjEzLDAuMDctMC4xOCwwLjEyYy0wLjAxLDAuMDEtMC4wMSwwLjAxLTAuMDIsMC4wMWwtMiwyCgkJYy0wLjI0LDAuMjQtMC4yNCwwLjY0LDAsMC44OGMwLjEyLDAuMTIsMC4yOCwwLjE4LDAuNDQsMC4xOHMwLjMyLTAuMDYsMC40NC0wLjE4bDAuOTMtMC45M1YyMi41YzAsMC4zNSwwLjI4LDAuNjIsMC42MiwwLjYyCgkJczAuNjItMC4yOCwwLjYyLTAuNjJWMTEuMDFsMC45MywwLjkzYzAuMjQsMC4yNCwwLjY0LDAuMjQsMC44OCwwQzEzLjE5LDExLjcsMTMuMTksMTEuMywxMi45NCwxMS4wNnoiLz4KCTxwYXRoIGNsYXNzPSJzdDAiIGQ9Ik0yMi41LDE4LjEyYy0wLjM0LDAtMC42Mi0wLjI4LTAuNjItMC42MnYtNWMwLTAuMzUsMC4yOC0wLjYyLDAuNjItMC42MnMwLjYyLDAuMjgsMC42MiwwLjYydjUKCQlDMjMuMTIsMTcuODUsMjIuODQsMTguMTIsMjIuNSwxOC4xMnoiLz4KCTxwYXRoIGNsYXNzPSJzdDAiIGQ9Ik0yMC41LDI1LjEyYy0xLjcyLDAtMy4xMi0xLjQtMy4xMi0zLjEyczEuNC0zLjEyLDMuMTItMy4xMnMzLjEyLDEuNCwzLj
EyLDMuMTJTMjIuMjIsMjUuMTIsMjAuNSwyNS4xMnoKCQkgTTIwLjUsMjAuMTJjLTEuMDMsMC0xLjg4LDAuODQtMS44OCwxLjg4czAuODQsMS44OCwxLjg4LDEuODhzMS44OC0wLjg0LDEuODgtMS44OFMyMS41MywyMC4xMiwyMC41LDIwLjEyeiIvPgoJPHBhdGggY2xhc3M9InN0MSIgZD0iTTI4Ljk0LDI1LjA2Yy0wLjI0LTAuMjQtMC42NC0wLjI0LTAuODgsMGwtMC45MywwLjkzVjEyLjVjMC0wLjM1LTAuMjgtMC42Mi0wLjYyLTAuNjJzLTAuNjIsMC4yOC0wLjYyLDAuNjIKCQl2MTMuNDlsLTAuOTMtMC45M2MtMC4yNC0wLjI0LTAuNjQtMC4yNC0wLjg4LDBjLTAuMjQsMC4yNC0wLjI0LDAuNjQsMCwwLjg4bDIsMmMwLjA2LDAuMDYsMC4xMywwLjExLDAuMjEsMC4xNAoJCWMwLjA4LDAuMDMsMC4xNiwwLjA1LDAuMjQsMC4wNWMwLjA4LDAsMC4xNi0wLjAyLDAuMjQtMC4wNWMwLDAsMCwwLDAsMGMwLjA3LTAuMDMsMC4xMy0wLjA3LDAuMTgtMC4xMgoJCWMwLjAxLTAuMDEsMC4wMS0wLjAxLDAuMDItMC4wMWwyLTJDMjkuMTksMjUuNywyOS4xOSwyNS4zLDI4Ljk0LDI1LjA2eiIvPgoJPHBhdGggY2xhc3M9InN0MCIgZD0iTTE0LjUsMjUuMTJjLTAuMzQsMC0wLjYyLTAuMjgtMC42Mi0wLjYydi01YzAtMC4zNSwwLjI4LTAuNjIsMC42Mi0wLjYyczAuNjIsMC4yOCwwLjYyLDAuNjJ2NQoJCUMxNS4xMiwyNC44NSwxNC44NCwyNS4xMiwxNC41LDI1LjEyeiIvPgoJPHBhdGggY2xhc3M9InN0MCIgZD0iTTI2LjUsMTguMTJjLTAuMzQsMC0wLjYyLTAuMjgtMC42Mi0wLjYydi01YzAtMC4zNSwwLjI4LTAuNjIsMC42Mi0wLjYyczAuNjIsMC4yOCwwLjYyLDAuNjJ2NQoJCUMyNy4xMiwxNy44NSwyNi44NCwxOC4xMiwyNi41LDE4LjEyeiIvPgoJPHBhdGggY2xhc3M9InN0MCIgZD0iTTEwLjUsMjUuMTJjLTAuMzQsMC0wLjYyLTAuMjgtMC42Mi0wLjYydi01YzAtMC4zNSwwLjI4LTAuNjIsMC42Mi0wLjYyczAuNjIsMC4yOCwwLjYyLDAuNjJ2NQoJCUMxMS4xMiwyNC44NSwxMC44NCwyNS4xMiwxMC41LDI1LjEyeiIvPgo8L2c+Cjwvc3ZnPgo=
 description: !snippet README.adoc#description
 prerequisites:
-- A running Kafka instance (see the Getting Started quick start)
+- A running Kafka instance (see <a href="https://console.redhat.com/application-services/learning-resources?quickstart=getting-started">Getting started with OpenShift Streams for Apache Kafka</a>)
 - The latest supported binary version of the Apache Kafka distribution
 - A command-line terminal application
-- JDK 11 or later (the latest LTS version of OpenJDK is recommended)
+- JDK 11 or later (the latest LTS version of OpenJDK is recommended)
 introduction: !snippet README.adoc#introduction
 tasks:
 - !snippet/proc README.adoc#proc-configuring-kafka-bin-scripts

docs/kafka/kafka-instance-settings/README.adoc

Lines changed: 3 additions & 3 deletions
@@ -111,7 +111,7 @@ rhoas kafka update --reauthentication false
 For a list of Kafka instance settings that you can update using the CLI, see the `rhoas kafka update` entry in the {base-url-cli}{command-ref-url-cli}[CLI command reference (rhoas)^].
 
 .Prerequisites
-* You have created a Kafka instance. To learn how to do this, see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^].
+* You have a running Kafka instance with at least one Kafka topic in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 
 .Procedure
 . In the {product-kafka} {service-url-kafka}[web console^], click *Kafka Instances* and select a Kafka instance.
@@ -134,7 +134,7 @@ You can edit the following Kafka instance settings in {product-long-kafka}.
 Connection re-authentication::
 +
 --
-When a client connects to a Kafka instance, the session lasts for five minutes.
+When a client connects to a Kafka instance, the session lasts for five minutes.
 At that point, the client must reauthenticate to stay connected.
 Many Kafka clients automatically reauthenticate to remain connected,
 but some Kafka clients do not.
@@ -157,4 +157,4 @@ You could also contact Red Hat Support for assistance.
 endif::[]
 
 NOTE: Disabling connection re-authentication will restart your Kafka instance.
---
+--

docs/kafka/kcat-kafka/README.adoc

Lines changed: 5 additions & 4 deletions
@@ -98,7 +98,8 @@ NOTE: Kcat is an open source community tool. Kcat is not a part of {product-kafk
 endif::[]
 
 .Prerequisites
-* You have a running Kafka instance in {product-kafka}.
+* You have a running Kafka instance in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
+* You have a command-line terminal application.
 * https://adoptopenjdk.net/[JDK^] 11 or later is installed. (The latest LTS version of OpenJDK is recommended.)
 * You've installed the latest supported version of https://github.com/edenhill/kcat[Kcat^] for your operating system. To verify your Kcat version, enter the following command:
 +
@@ -107,7 +108,7 @@ endif::[]
 $ kcat -V
 ----
 +
-You see output like the following example:
+You see output similar to the following example:
 +
 [source]
 ----
@@ -167,7 +168,7 @@ You can use Kcat to produce messages to Kafka topics in several ways, such as re
 
 .Prerequisites
 * Kcat is installed.
-* You have a running Kafka instance in {product-long-kafka}.
+* You have a running Kafka instance in {product-long-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * You have a topic in your Kafka instance that you can use to produce and consume messages.
 * You've set the Kafka bootstrap server endpoint and your service account credentials as environment variables.
 
@@ -218,7 +219,7 @@ You can also use Kcat to consume messages from Kafka topics. In this task, you u
 
 .Prerequisites
 * Kcat is installed.
-* You have a running Kafka instance in {product-long-kafka}.
+* You have a running Kafka instance in {product-long-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * You used Kcat to produce example messages to a topic in your Kafka instance.
 
 .Procedure

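The Kcat prerequisites above assume the bootstrap server endpoint and service account credentials are exported as environment variables. A sketch of that setup, with placeholder values, shown as a dry run that only prints the produce and consume commands rather than contacting a broker (the `-b`, `-t`, `-P`, `-C` flags and the librdkafka `-X` properties are standard kcat usage; remove the `echo` wrapper to run for real):

```shell
#!/bin/sh
# Sketch of the Kcat prerequisite setup: placeholder endpoint and service
# account credentials exported as environment variables.
export BOOTSTRAP_SERVER='<bootstrap_server>:443'
export CLIENT_ID='<client_id>'
export CLIENT_SECRET='<client_secret>'

run() { echo "$@"; }   # dry run: print the command; drop the echo to execute

# Produce messages to test-topic (reads lines from stdin when run for real)
run kcat -t test-topic -b "$BOOTSTRAP_SERVER" \
  -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
  -X sasl.username="$CLIENT_ID" -X sasl.password="$CLIENT_SECRET" -P

# Consume messages from test-topic
run kcat -t test-topic -b "$BOOTSTRAP_SERVER" \
  -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
  -X sasl.username="$CLIENT_ID" -X sasl.password="$CLIENT_SECRET" -C
```

The SASL mechanism shown (`PLAIN`) is an assumption; match it to whatever your Kafka instance actually requires.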