docs: JR-486: Updates quick start prerequisites to include links (#598)
* docs: JR-486: Adds link in API Designer quick start prerequisites
* docs: JR-486: Adds references to associated services in Service Registry quick start prerequisites
* docs: JR-486: Adds links in Using Quarkus with Service Registry quick start prerequisites
* docs: JR-486: Adds links in Using Quarkus with Kafka quick start prerequisites
* docs: JR-486: Adds links in Configuring Kafkacat with Kafka quick start prerequisites
* docs: JR-486: Adds links in Producing and Consuming Messages quick start prerequisites
* docs: JR-486: Adds links in Configuring Kafka Scripts quick start prerequisites
* docs: JR-486: Adds links in Using Node.js with Kafka quick start prerequisites
* docs: JR-486: Adds command-line terminal prerequisite to README.adoc
* docs: JR-486: Updates the prerequisites for the 'producing and consuming messages' doc
* docs: JR-486: Updates the prerequisites for the 'producing and consuming messages' README.adoc
* docs: JR-486: Removes redundant text from the quick-start prerequisites
* docs: JR-486: Adds command-line terminal prerequisite to README.adoc
* docs: JR-486: Updates README.adoc prerequisites to (a) standardise references to running Kafka instance, and (b) add link to doc
* docs: JR-486: Updates README.adoc prerequisites to standardise references to Red Hat account using an attribute
* docs: JR-486: Updates README.adoc to remove redundant prerequisite
* docs: JR-486: Removes references to Quarkus version number
* docs: JR-486: Implements reviewer feedback
* docs: JR-486: Updates references to later Maven 3 versions
* docs: JR-486: Removes '.x' from Node.js version
-- A running Service Registry instance (required only if you want to export to Service Registry, see the Getting started with Service Registry quick start)
+- A running Service Registry instance (required only if you want to export to Service Registry, see <a href="https://console.redhat.com/application-services/learning-resources?quickstart=getting-started-service-registry">Getting started with Service Registry</a>)
docs/connectors/rhoas-cli-getting-started-connectors/README.adoc (19 additions, 19 deletions)
@@ -103,7 +103,7 @@ Use this guide to complete the following tasks:
 .Prerequisites
-* You have a Red Hat account.
+* You have a {org-name} account.
 * You've installed the latest version of the `rhoas` CLI. See {base-url}{installation-guide-url-cli}[Installing and configuring the rhoas CLI^].
 * You've completed the following tasks:
 +
@@ -113,42 +113,42 @@ Use this guide to complete the following tasks:
 +
 [source,subs="+quotes"]
 ----
 $ rhoas kafka create --name=my-kafka-instance
 ----
 ** Verify that the Kafka instance is in the *Ready* state.
 +
 [source,subs="+quotes"]
 ----
 $ rhoas context status kafka
 ----
 ** Create a service account and copy the service account ID and secret. You must use a service account to connect and authenticate your {product-connectors} instances with your Kafka instance.
 ** For your Kafka instance, set the permissions for the service account to enable {connectors} instances (that are configured with the service account credentials) to produce and consume messages in any topic in the Kafka instance.
 +
 [source,subs="+quotes"]
 ----
 $ rhoas kafka acl grant-access --producer --consumer --service-account=<service-acct-id> --topic all --group all
 ----
 ** Create a Kafka topic named `test-topic`. The Kafka topic stores messages sent by producers (data sources) and makes them available to consumers (data sinks).
 +
 [source,subs="+quotes"]
 ----
 $ rhoas kafka topic create --name=test-topic
 ----
 [id="proc-create-connector-namespace_{context}"]
 == Creating a namespace to host your {connectors} instances
 [role="_abstract"]
 A {connectors} namespace hosts your {connectors} instances.
 The namespace that you use depends on your OpenShift Dedicated environment.
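The hunk above walks through creating a Kafka instance, granting a service account access, and creating a topic. A minimal end-to-end sketch of those same steps, assuming you have already run `rhoas login`; the service-account ID stays a placeholder as in the doc, and the guard skips the calls when the CLI is not installed:

```shell
# Sketch of the prerequisite workflow from the rhoas getting-started hunk above.
# KAFKA_NAME and TOPIC_NAME are the placeholder names used in the doc;
# <service-acct-id> is a placeholder you replace with your own service account ID.
KAFKA_NAME="my-kafka-instance"
TOPIC_NAME="test-topic"

# Run the CLI steps only if rhoas is available (assumes a prior `rhoas login`).
if command -v rhoas >/dev/null 2>&1; then
  rhoas kafka create --name="$KAFKA_NAME"        # create the instance
  rhoas context status kafka                     # check that the state is Ready
  rhoas kafka acl grant-access --producer --consumer \
    --service-account="<service-acct-id>" --topic all --group all
  rhoas kafka topic create --name="$TOPIC_NAME"  # topic for connector messages
fi
```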
@@ -187,11 +187,11 @@ $ rhoas connector namespace list
 == Building connector configuration files
 [role="_abstract"]
 Before you can create a {connectors} instance, you must build a configuration file that is based on a supported connector type that is listed in the {product-connectors} catalog.
 For this example, you want to create two types of connectors: a data generator (a source connector) and an HTTP sink connector.
 You must build a configuration file for each connector type that you want to create. When you build a configuration file, the default file name is `connector.json`. Optionally, you can specify a different configuration file name.
 .Prerequisites
@@ -220,9 +220,9 @@ $ rhoas connector type list --limit=100
 // +
 // [source,subs="+quotes"]
 // ----
 // rhoas connector type list --limit=70 --search=%sink%
 // ----
 //
 // .. Filter the list to show only source connectors:
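The commented-out lines above show a `--search` filter for the catalog. A small sketch combining the two listing commands that appear in this diff; the guard and the variable are illustrative additions, not part of the doc:

```shell
# Browse the Connectors catalog; flags (--limit, --search) are the ones
# shown in the diff above. SEARCH is an illustrative variable.
SEARCH="%sink%"

if command -v rhoas >/dev/null 2>&1; then
  rhoas connector type list --limit=100              # full catalog page
  rhoas connector type list --limit=70 --search="$SEARCH"  # sink connectors only
fi
```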
In {product-long-kafka}, you can create Access Control Lists (ACLs) in your Kafka instances and set permissions for how other user accounts or service accounts can interact with an instance and its resources. You can manage access for only the Kafka instances that you create or for the instances that the owner has enabled you to access and alter.
 .Prerequisites
-* You've created a Kafka instance and the instance is in the *Ready* state.
+* You have a running Kafka instance in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * The user account or service account that you're setting permissions for has been created in the organization.
docs/kafka/consumer-configuration-kafka/README.adoc (1 addition, 1 deletion)
@@ -256,7 +256,7 @@ If you're using Kafka scripts, you can use the `kafka-consumer-groups.sh` tool t
 .Prerequisites
-* You've created a Kafka instance with at least one Kafka topic in {product-kafka}.
+* You have a running Kafka instance with at least one Kafka topic in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * Consumer client applications connected to the Kafka instance have a consumer group ID.
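The file above documents checking consumer groups with the `kafka-consumer-groups.sh` tool. A sketch of listing and describing a group, assuming a downloaded Apache Kafka distribution in the current directory; the endpoint, config file name, and group name are hypothetical:

```shell
# List and describe consumer groups with the Kafka distribution scripts.
# BOOTSTRAP_SERVER, CONFIG_FILE, and the group name are illustrative values.
BOOTSTRAP_SERVER="my-kafka-instance.example.com:443"
CONFIG_FILE="app-services.properties"   # holds your service account credentials

if [ -x "./bin/kafka-consumer-groups.sh" ]; then
  ./bin/kafka-consumer-groups.sh --bootstrap-server "$BOOTSTRAP_SERVER" \
    --command-config "$CONFIG_FILE" --list
  ./bin/kafka-consumer-groups.sh --bootstrap-server "$BOOTSTRAP_SERVER" \
    --command-config "$CONFIG_FILE" --describe --group my-consumer-group
fi
```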
docs/kafka/kafka-bin-scripts-kafka/README.adoc (3 additions, 2 deletions)
@@ -102,7 +102,8 @@ With the scripts, you can produce and consume messages using your Kafka instance
 NOTE: The command examples in this quick start show how to use the Kafka scripts on Linux and macOS. If you're using Windows, use the Windows versions of the scripts. For example, instead of the `__<Kafka-distribution-dir>__/bin/kafka-console-producer.sh` script, use the `__<Kafka-distribution-dir>__\bin\windows\kafka-console-producer.bat` script.
 .Prerequisites
-* You have a running Kafka instance in {product-kafka}.
+* You have a running Kafka instance in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
+* You have a command-line terminal application.
 * https://adoptopenjdk.net/[JDK^] 11 or later is installed. (The latest LTS version of OpenJDK is recommended.)
 * You've downloaded the latest supported binary version of the https://kafka.apache.org/downloads[Apache Kafka distribution^]. You can check your Kafka version using the following command:
 +
@@ -169,7 +170,7 @@ In this task, you use the `kafka-console-producer` script to produce messages to
 .Prerequisites
-* You have a running Kafka instance in {product-long-kafka}.
+* You have a running Kafka instance in {product-long-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * You have the bootstrap server endpoint for your Kafka instance. To get the server endpoint, select your Kafka instance in the {product-long-kafka} web console, select the options icon (three vertical dots), and click *Connection*.
 * You've created the `{property-file-name}` file to store your service account credentials.
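These prerequisites feed into the `kafka-console-producer` task. A sketch of piping a couple of messages to the script, assuming a downloaded Kafka distribution in the current directory; the endpoint, topic name, and properties file name are placeholders, not values from this repo:

```shell
# Produce two messages with the Kafka distribution's console producer.
# BOOTSTRAP_SERVER, TOPIC, and CONFIG_FILE are illustrative placeholders.
BOOTSTRAP_SERVER="my-kafka-instance.example.com:443"
CONFIG_FILE="app-services.properties"   # service account credentials
TOPIC="my-first-kafka-topic"

if [ -x "./bin/kafka-console-producer.sh" ]; then
  printf 'first message\nsecond message\n' | \
    ./bin/kafka-console-producer.sh --topic "$TOPIC" \
      --bootstrap-server "$BOOTSTRAP_SERVER" \
      --producer.config "$CONFIG_FILE"
fi
```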
-- A running Kafka instance (see the Getting Started quick start)
+- A running Kafka instance (see <a href="https://console.redhat.com/application-services/learning-resources?quickstart=getting-started">Getting started with OpenShift Streams for Apache Kafka</a>)
 - The latest supported binary version of the Apache Kafka distribution
 - A command-line terminal application
 - JDK 11 or later (the latest LTS version of OpenJDK is recommended)
For a list of Kafka instance settings that you can update using the CLI, see the `rhoas kafka update` entry in the {base-url-cli}{command-ref-url-cli}[CLI command reference (rhoas)^].
 .Prerequisites
-* You have created a Kafka instance. To learn how to do this, see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^].
+* You have a running Kafka instance with at least one Kafka topic in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 .Procedure
 . In the {product-kafka} {service-url-kafka}[web console^], click *Kafka Instances* and select a Kafka instance.
@@ -134,7 +134,7 @@ You can edit the following Kafka instance settings in {product-long-kafka}.
 Connection re-authentication::
 +
 --
 When a client connects to a Kafka instance, the session lasts for five minutes.
 At that point, the client must reauthenticate to stay connected.
 Many Kafka clients automatically reauthenticate to remain connected,
 but some Kafka clients do not.
@@ -157,4 +157,4 @@ You could also contact Red Hat Support for assistance.
 endif::[]
 NOTE: Disabling connection re-authentication will restart your Kafka instance.
docs/kafka/kcat-kafka/README.adoc (5 additions, 4 deletions)
@@ -98,7 +98,8 @@ NOTE: Kcat is an open source community tool. Kcat is not a part of {product-kafk
 endif::[]
 .Prerequisites
-* You have a running Kafka instance in {product-kafka}.
+* You have a running Kafka instance in {product-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
+* You have a command-line terminal application.
 * https://adoptopenjdk.net/[JDK^] 11 or later is installed. (The latest LTS version of OpenJDK is recommended.)
 * You've installed the latest supported version of https://github.com/edenhill/kcat[Kcat^] for your operating system. To verify your Kcat version, enter the following command:
 +
@@ -107,7 +108,7 @@ endif::[]
 $ kcat -V
 ----
 +
-You see output like the following example:
+You see output similar to the following example:
 +
 [source]
 ----
@@ -167,7 +168,7 @@ You can use Kcat to produce messages to Kafka topics in several ways, such as re
 .Prerequisites
 * Kcat is installed.
-* You have a running Kafka instance in {product-long-kafka}.
+* You have a running Kafka instance in {product-long-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * You have a topic in your Kafka instance that you can use to produce and consume messages.
 * You've set the Kafka bootstrap server endpoint and your service account credentials as environment variables.
@@ -218,7 +219,7 @@ You can also use Kcat to consume messages from Kafka topics. In this task, you u
 .Prerequisites
 * Kcat is installed.
-* You have a running Kafka instance in {product-long-kafka}.
+* You have a running Kafka instance in {product-long-kafka} (see {base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]).
 * You used Kcat to produce example messages to a topic in your Kafka instance.
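The produce and consume tasks in this file mirror each other. A sketch of both using upstream Kcat's `-P` and `-C` modes, assuming the bootstrap server and service-account credentials are exported as environment variables as the quick start describes; the topic name and the SASL settings shown here are assumptions for illustration:

```shell
# Produce and then consume with Kcat. BOOTSTRAP_SERVER, CLIENT_ID, and
# CLIENT_SECRET are assumed to come from exported environment variables;
# the topic name and SASL settings are illustrative.
TOPIC="my-first-kafka-topic"

if command -v kcat >/dev/null 2>&1; then
  # Produce: read messages from stdin, one per line (-P).
  printf 'hello\nworld\n' | kcat -t "$TOPIC" -b "$BOOTSTRAP_SERVER" \
    -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
    -X sasl.username="$CLIENT_ID" -X sasl.password="$CLIENT_SECRET" -P
  # Consume from the topic and exit when the end is reached (-C -e).
  kcat -t "$TOPIC" -b "$BOOTSTRAP_SERVER" \
    -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
    -X sasl.username="$CLIENT_ID" -X sasl.password="$CLIENT_SECRET" -C -e
fi
```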