
Commit 5f2605d

Fix all QS tag blocks. (#495)
1 parent afebcfa commit 5f2605d

8 files changed: +76 -31 lines

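The same fix is applied in every file of this commit: the free-floating text after each `[#description]`, `[#introduction]`, and `[#conclusion]` anchor inside an `ifdef::qs[]` conditional is wrapped in `====` delimiters, which in AsciiDoc mark an explicit example block, so each quick-start anchor owns a clearly bounded block of content rather than relying on blank lines to end it. A minimal before/after sketch of the pattern, using a hypothetical `[#some-anchor]` and placeholder text rather than any line from the commit:

 ifdef::qs[]
 [#some-anchor]
+====
 Text that the quick start viewer renders for this anchor.
+====
 endif::[]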

docs/connectors/getting-started-connectors/README.adoc

Lines changed: 35 additions & 30 deletions
@@ -69,27 +69,30 @@ In this example, you connect a data source (a data generator) that creates Kafka
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to create and set up connectors in {product-long-connectors}.
+====

 [#introduction]
-Welcome to the quick start for {product-long-connectors}.
+====
+Welcome to the quick start for {product-long-connectors}.

-In this quick start, you learn how to create a source connector and sink connector and send data to and from {product-kafka}.
-
-A *source* connector allows you to send data from an external system to {product-kafka}. A *sink* connector allows you to send data from {product-kafka} to an external system.
+In this quick start, you learn how to create a source connector and sink connector and send data to and from {product-kafka}.

+A *source* connector allows you to send data from an external system to {product-kafka}. A *sink* connector allows you to send data from {product-kafka} to an external system.
+====
 endif::[]

 ifndef::qs[]
 == Overview

-{product-long-kafka} is a cloud service that simplifies the process of running Apache Kafka. Apache Kafka is an open-source, distributed, publish-subscribe messaging system for creating fault-tolerant, real-time data feeds.
+{product-long-kafka} is a cloud service that simplifies the process of running Apache Kafka. Apache Kafka is an open-source, distributed, publish-subscribe messaging system for creating fault-tolerant, real-time data feeds.

-You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.
+You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.

 The following diagram illustrates how data flows from a data source through a data source connector to a Kafka topic. And how data flows from a Kafka topic to a data sink through a data sink connector.

-[.screencapture]
+[.screencapture]
 .{product-long-connectors} data flow
 image::connectors-diagram.png[Illustration of data flow from data source through Kafka to data sink]

@@ -101,29 +104,29 @@ endif::[]
 [role="_abstract"]
 Configure your {product-kafka} instance for use with {product-long-connectors} by:

-* Creating *Kafka topics* to store messages sent by producers (data sources) and make them available to consumers (data sinks).
-* Creating *service accounts* that allow you to connect and authenticate your Connectors with Kafka instances.
+* Creating *Kafka topics* to store messages sent by producers (data sources) and make them available to consumers (data sinks).
+* Creating *service accounts* that allow you to connect and authenticate your Connectors with Kafka instances.
 * Setting up *access rules* for the service accounts that define how your Connectors can access and use the associated Kafka instance topics.

-The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.
+The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.

 For this example, you create one Kafka topic, named *test*, one service account, and you define access for the service account.

 ifndef::qs[]
 .Prerequisites
 * You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
 * You've created a {product-kafka} instance and the instance is in the *Ready* state.
-For instructions on how to create a Kafka instance, see _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_.
+For instructions on how to create a Kafka instance, see _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_.
 endif::[]

 .Procedure
 . Create a Kafka topic for your connectors:
 .. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
 .. Click the name of the Kafka instance that you want to add a topic to.
 .. Select the *Topics* tab, and then click *Create topic*.
-.. Type a unique name for your topic. For example, type *test-topic* for the *Topic Name*.
+.. Type a unique name for your topic. For example, type *test-topic* for the *Topic Name*.
 .. Accept the default settings for partitions, message retention, and replicas.
-. Create a service account for connectors:
+. Create a service account for connectors:
 .. In the web console, select *Service Accounts*, and then click *Create service account*.
 .. Type a unique service account name (for example, *test-service-acct* ) and then click *Create*.
 .. Copy the generated *Client ID* and *Client Secret* to a secure location. You'll use these credentials to configure connections to this service account.

@@ -152,11 +155,11 @@ endif::[]
 == Creating a Connectors instance for a data source

 [role="_abstract"]
-A *source* connector consumes events from an external data source and produces Kafka messages.
+A *source* connector consumes events from an external data source and produces Kafka messages.

-For this example, you create an instance of the *Data Generator* source connector.
+For this example, you create an instance of the *Data Generator* source connector.

-You configure your connector to listen for events from the data source and produce a Kafka message for each event.
+You configure your connector to listen for events from the data source and produce a Kafka message for each event.

 The connector sends the messages at regular intervals to the Kafka topic that you created for Connectors.

@@ -173,7 +176,7 @@ endif::[]
 +
 You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
 +
-For example, to find the *Data Generator* source connector, type *data* in the search box. The list filters to show only the *Data Generator Connector* card.
+For example, to find the *Data Generator* source connector, type *data* in the search box. The list filters to show only the *Data Generator Connector* card.
 +
 Click the card to select the connector, and then click *Next*.

@@ -193,7 +196,7 @@ NOTE: If you have not already configured a {product-kafka} instance for Connecto
 . Click *Next*.

 . Configure the core configuration for your connector:
-.. Provide a name for the connector.
+.. Provide a name for the connector.
 .. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.

 . Provide connector-specific configuration. For the *Data Generator*, provide the following information:

@@ -203,11 +206,11 @@ NOTE: If you have not already configured a {product-kafka} instance for Connecto
 .. *Message*: Type the content of the message that you want the Connector instance to send to the Kafka topic. For example, type `Hello World!`.
 .. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For example, specify `10000`, to send a message every 10 seconds.

-. Optionally, configure the error handling policy for your Connectors instance.
+. Optionally, configure the error handling policy for your Connectors instance.
 +
 The options are:
 +
-* *stop* - (the default) The Connectors instance shuts down when it encounters an error.
+* *stop* - (the default) The Connectors instance shuts down when it encounters an error.
 * *log* - The Connectors instance sends errors to its log.
 * *dead letter queue* - The Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance.
 +

@@ -232,7 +235,7 @@ In the next procedure, you can verify that the source Connectors instance is sen
 == Creating a Connectors instance for a data sink

 [role="_abstract"]
-A *sink* connector consumes messages from a Kafka topic and sends them to an external system.
+A *sink* connector consumes messages from a Kafka topic and sends them to an external system.

 For this example, you use the *HTTP Sink* connector which consumes the Kafka messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.

@@ -244,18 +247,18 @@ ifndef::qs[]
 endif::[]

 .Procedure
-
-. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.
+
+. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.

 . Select the sink connector that you want to use:
-.. For example, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector.
-.. Click the *HTTP Sink connector* card and then click *Next*.
+.. For example, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector.
+.. Click the *HTTP Sink connector* card and then click *Next*.

-. Select the {product-kafka} instance for the connector to work with.
+. Select the {product-kafka} instance for the connector to work with.
 +
 For example, select *test* and then click *Next*.

-. On the *Namespace* page, click the *eval namespace* that you created when you created the source connector.
+. On the *Namespace* page, click the *eval namespace* that you created when you created the source connector.

 //. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
 //+

@@ -266,7 +269,7 @@ For example, select *test* and then click *Next*.
 . Click *Next*.

 . Provide the core configuration for your connector:
-.. Type a unique name for the connector.
+.. Type a unique name for the connector.
 .. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.

 . Provide the connector-specific configuration for your connector. For the *HTTP sink connector*, provide the following information:

@@ -280,7 +283,7 @@ For example, select *test* and then click *Next*.

 . Review the summary of the configuration properties and then click *Create Connectors instance*.
 +
-Your Connectors instance is listed in the table of Connectors.
+Your Connectors instance is listed in the table of Connectors.
 +
 After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this example, the data sink is the HTTP URL that you provided).

@@ -291,8 +294,10 @@ Open the browser tab to your custom URL for the link:https://webhook.site[webhoo

 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-long-connectors} Getting Started quick start.
+====
 endif::[]

 ifdef::parent-context[:context: {parent-context}]
-ifndef::parent-context[:!context:]
+ifndef::parent-context[:!context:]
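The comment that opens the first hunk of each file explains the anchor convention this fix preserves: quick-start anchors must use the shorthand `[#anchor-id]` form because the ascii splitter scans the `[id="anchor-id"]` form to generate module files. Both AsciiDoc forms resolve to the same HTML ID; only the tooling treats them differently. A short sketch restating that comment, with hypothetical anchor names:

 // Shorthand form: reserved for QS anchors such as [#description] and [#conclusion].
 [#my-qs-anchor]

 // Attribute form: the ascii splitter keys on this style when splitting out module files.
 [id="my-module_{context}"]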

docs/kafka/getting-started-kafka/README.adoc

Lines changed: 4 additions & 0 deletions
@@ -76,10 +76,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to create and set up your first Apache Kafka instance in {product-long-kafka}.
+====

 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka}. In this quick start, you'll learn how to create and inspect a Kafka instance, create a service account to connect an application or service to the instance, and create a topic in the instance.
+====
 endif::[]

 [id="proc-creating-kafka-instance_{context}"]

docs/kafka/kafka-bin-scripts-kafka/README.adoc

Lines changed: 6 additions & 0 deletions
@@ -91,10 +91,14 @@ $ ./kafka-console-producer.sh --version

 ifdef::qs[]
 [#description]
+====
 Learn how to use Kafka scripts to interact with a Kafka instance in {product-long-kafka}.
+====

 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Kafka scripts. In this quick start, you'll learn how to use the Kafka scripts to produce and consume messages for your Kafka instances in {product-kafka}.
+====
 endif::[]

 [id="proc-configuring-kafka-bin-scripts_{context}"]

@@ -250,7 +254,9 @@ endif::[]

 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-kafka} Kafka scripts quick start, and are now ready to produce and consume messages in the service.
+====
 endif::[]

 ifdef::parent-context[:context: {parent-context}]

docs/kafka/kcat-kafka/README.adoc

Lines changed: 7 additions & 1 deletion
@@ -96,10 +96,14 @@ Version 1.6.0 (JSON, Avro, Transactions, librdkafka 1.6.1 builtin.features=gzip,
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use Kafkacat to interact with a Kafka instance in {product-long-kafka}.
+====

 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Kafkacat. In this quick start, you'll learn how to use https://github.com/edenhill/kafkacat[Kafkacat^] to produce and consume messages for your Kafka instances in {product-kafka}.
+====
 endif::[]

 [id="proc-configuring-kafkacat_{context}"]

@@ -112,7 +116,7 @@ For more information about Kafkacat configuration options, see https://github.co

 NOTE: Kafkacat does not yet fully support SASL/OAUTHBEARER authentication, so connecting to a Kafka instance requires only the bootstrap server and the service account credentials for SASL/PLAIN authentication.

-NOTE: Kafkacat have been recently renamed to kcat. If you use latest version of kafkacat please replace all occurences referencing kafkacat binary to kcat
+NOTE: Kafkacat have been recently renamed to kcat. If you use latest version of kafkacat please replace all occurences referencing kafkacat binary to kcat

 .Prerequisites
 ifndef::qs[]

@@ -234,7 +238,9 @@ endif::[]

 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-kafka} Kafkacat quick start, and are now ready to produce and consume messages in the service.
+====
 endif::[]

 ifdef::parent-context[:context: {parent-context}]
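The second NOTE in the middle hunk above records the kafkacat-to-kcat rename and the SASL/PLAIN connection requirements. As a hedged illustration that is not part of this commit, producing to a topic with service-account credentials might look like the sketch below; the topic name and the three environment variables are assumptions for the example, not values taken from the docs:

[source,shell]
----
# Produce messages to a topic over SASL/PLAIN.
# On older installs the binary is still named `kafkacat`, per the NOTE above.
kcat -P -t test-topic \
  -b "$BOOTSTRAP_SERVER" \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=PLAIN \
  -X sasl.username="$CLIENT_ID" \
  -X sasl.password="$CLIENT_SECRET"
----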

docs/kafka/nodejs-kafka/README.adoc

Lines changed: 6 additions & 0 deletions
@@ -81,10 +81,14 @@ The example Node.js application in this quick start uses the https://kafka.js.or
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use Node.js applications to produce and consume messages using a Kafka instance in {product-long-kafka}.
+====

 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Node.js. In this quick start, you'll learn how to use the https://nodejs.org/en/about/[Node.js^] runtime to produce messages to and consume messages from your Kafka instances in {product-kafka}.
+====
 endif::[]


@@ -324,7 +328,9 @@ endif::[]

 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-kafka} Node.js quick start. You're now ready to use your own Node.js applications with {product-kafka}.
+====
 endif::[]

 ifdef::parent-context[:context: {parent-context}]

docs/kafka/quarkus-kafka/README.adoc

Lines changed: 6 additions & 0 deletions
@@ -77,10 +77,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use Quarkus applications to produce messages to and consume messages from a Kafka instance in {product-long-kafka}.
+====

 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Quarkus. In this quick start, you'll learn how to use https://quarkus.io/[Quarkus^] to produce messages to and consume messages from your Kafka instances in {product-kafka}.
+====
 endif::[]

 [id="proc-importing-quarkus-sample-code_{context}"]

@@ -212,7 +216,9 @@ endif::[]

 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-kafka} Quarkus quick start, and are now ready to use your own Quarkus applications with {product-kafka}.
+====
 endif::[]

 ifdef::parent-context[:context: {parent-context}]

docs/registry/getting-started-registry/README.adoc

Lines changed: 6 additions & 0 deletions
@@ -76,10 +76,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to create and set up your first {registry} instance in {product-long-registry}.
+====

 [#introduction]
+====
 Welcome to the quick start for {product-long-registry}. In this quick start, you'll learn how to create and view a {registry} instance, create a schema in this instance, and create a service account to connect an application or service to this instance.
+====
 endif::[]

 [id="proc-creating-service-registry-instance_{context}"]

@@ -272,7 +276,9 @@ endif::[]

 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {registry} Getting Started quick start, and are now ready to use the service.
+====
 endif::[]

 ifdef::parent-context[:context: {parent-context}]

docs/registry/quarkus-registry/README.adoc

Lines changed: 6 additions & 0 deletions
@@ -82,10 +82,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use a Quarkus application that produces messages to and consume messages from a Kafka instance in {product-long-kafka} and manage the message schemas in {product-long-registry}.
+====

 [#introduction]
+====
 Welcome to the quick start for {product-long-registry} with Quarkus. In this quick start, you'll learn how to use https://quarkus.io/[Quarkus^] to produce messages to and consume messages from your Kafka instances in {product-kafka} and manage the message schemas in {product-long-registry}.
+====
 endif::[]

 [id="proc-importing-quarkus-registry-sample-code_{context}"]

@@ -253,7 +257,9 @@ endif::[]

 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-kafka} and {registry} Quarkus quick start, and are now ready to use your own Quarkus application with {product-kafka} and {registry}.
+====
 endif::[]

 ifdef::parent-context[:context: {parent-context}]
