docs/connectors/getting-started-connectors/README.adoc (+35 -30)
@@ -69,27 +69,30 @@ In this example, you connect a data source (a data generator) that creates Kafka
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to create and set up connectors in {product-long-connectors}.
+====
 
 [#introduction]
-Welcome to the quick start for {product-long-connectors}.
+====
+Welcome to the quick start for {product-long-connectors}.
 
-In this quick start, you learn how to create a source connector and sink connector and send data to and from {product-kafka}.
-
-A *source* connector allows you to send data from an external system to {product-kafka}. A *sink* connector allows you to send data from {product-kafka} to an external system.
+In this quick start, you learn how to create a source connector and sink connector and send data to and from {product-kafka}.
 
+A *source* connector allows you to send data from an external system to {product-kafka}. A *sink* connector allows you to send data from {product-kafka} to an external system.
+====
 endif::[]
 
 ifndef::qs[]
 == Overview
 
-{product-long-kafka} is a cloud service that simplifies the process of running Apache Kafka. Apache Kafka is an open-source, distributed, publish-subscribe messaging system for creating fault-tolerant, real-time data feeds.
+{product-long-kafka} is a cloud service that simplifies the process of running Apache Kafka. Apache Kafka is an open-source, distributed, publish-subscribe messaging system for creating fault-tolerant, real-time data feeds.
 
-You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.
+You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.
 
 The following diagram illustrates how data flows from a data source through a data source connector to a Kafka topic, and how data flows from a Kafka topic to a data sink through a data sink connector.
 
-[.screencapture]
+[.screencapture]
 .{product-long-connectors} data flow
 image::connectors-diagram.png[Illustration of data flow from data source through Kafka to data sink]
 
@@ -101,29 +104,29 @@ endif::[]
 [role="_abstract"]
 Configure your {product-kafka} instance for use with {product-long-connectors} by:
 
-* Creating *Kafka topics* to store messages sent by producers (data sources) and make them available to consumers (data sinks).
-* Creating *service accounts* that allow you to connect and authenticate your Connectors with Kafka instances.
+* Creating *Kafka topics* to store messages sent by producers (data sources) and make them available to consumers (data sinks).
+* Creating *service accounts* that allow you to connect and authenticate your Connectors with Kafka instances.
 * Setting up *access rules* for the service accounts that define how your Connectors can access and use the associated Kafka instance topics.
 
-The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.
+The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.
 
 For this example, you create one Kafka topic, named *test-topic*, one service account, and you define access for the service account.
 
 ifndef::qs[]
 .Prerequisites
 * You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
 * You've created a {product-kafka} instance and the instance is in the *Ready* state.
-For instructions on how to create a Kafka instance, see _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_.
+For instructions on how to create a Kafka instance, see _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_.
 endif::[]
 
 .Procedure
 . Create a Kafka topic for your connectors:
 .. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
 .. Click the name of the Kafka instance that you want to add a topic to.
 .. Select the *Topics* tab, and then click *Create topic*.
-.. Type a unique name for your topic. For example, type *test-topic* for the *Topic Name*.
+.. Type a unique name for your topic. For example, type *test-topic* for the *Topic Name*.
 .. Accept the default settings for partitions, message retention, and replicas.
-. Create a service account for connectors:
+. Create a service account for connectors:
 .. In the web console, select *Service Accounts*, and then click *Create service account*.
 .. Type a unique service account name (for example, *test-service-acct*) and then click *Create*.
 .. Copy the generated *Client ID* and *Client Secret* to a secure location. You'll use these credentials to configure connections to this service account.
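
These console steps have a programmatic equivalent that can be handy for verifying the result. The following is a minimal sketch, not part of the quick start itself: it assumes the KafkaJS library (the client used in the Node.js guide below) and placeholder values for the bootstrap server and the service account's *Client ID* and *Client Secret*.

[source,typescript]
----
// Sketch only: replace <bootstrap-server>, <client-id>, and <client-secret>
// with the values from your Kafka instance and service account.
import { Kafka } from 'kafkajs'

const kafka = new Kafka({
  clientId: 'getting-started-admin', // hypothetical client name
  brokers: ['<bootstrap-server>:443'],
  ssl: true,
  sasl: {
    mechanism: 'plain',       // SASL/PLAIN, matching the service account credentials
    username: '<client-id>',
    password: '<client-secret>',
  },
})

async function createTopic(): Promise<void> {
  const admin = kafka.admin()
  await admin.connect()
  // Mirrors the console step: one topic with the default settings.
  await admin.createTopics({
    topics: [{ topic: 'test-topic', numPartitions: 1 }],
  })
  await admin.disconnect()
}

createTopic().catch(console.error)
----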
@@ -152,11 +155,11 @@ endif::[]
 == Creating a Connectors instance for a data source
 
 [role="_abstract"]
-A *source* connector consumes events from an external data source and produces Kafka messages.
+A *source* connector consumes events from an external data source and produces Kafka messages.
 
-For this example, you create an instance of the *Data Generator* source connector.
+For this example, you create an instance of the *Data Generator* source connector.
 
-You configure your connector to listen for events from the data source and produce a Kafka message for each event.
+You configure your connector to listen for events from the data source and produce a Kafka message for each event.
 
 The connector sends the messages at regular intervals to the Kafka topic that you created for Connectors.
 
@@ -173,7 +176,7 @@ endif::[]
 +
 You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
 +
-For example, to find the *Data Generator* source connector, type *data* in the search box. The list filters to show only the *Data Generator Connector* card.
+For example, to find the *Data Generator* source connector, type *data* in the search box. The list filters to show only the *Data Generator Connector* card.
 +
 Click the card to select the connector, and then click *Next*.
 
@@ -193,7 +196,7 @@ NOTE: If you have not already configured a {product-kafka} instance for Connecto
 . Click *Next*.
 
 . Configure the core configuration for your connector:
-.. Provide a name for the connector.
+.. Provide a name for the connector.
 .. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.
 
 . Provide connector-specific configuration. For the *Data Generator*, provide the following information:
@@ -203,11 +206,11 @@ NOTE: If you have not already configured a {product-kafka} instance for Connecto
 .. *Message*: Type the content of the message that you want the Connector instance to send to the Kafka topic. For example, type `Hello World!`.
 .. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For example, specify `10000`, to send a message every 10 seconds.
 
-. Optionally, configure the error handling policy for your Connectors instance.
+. Optionally, configure the error handling policy for your Connectors instance.
 +
 The options are:
 +
-* *stop* - (the default) The Connectors instance shuts down when it encounters an error.
+* *stop* - (the default) The Connectors instance shuts down when it encounters an error.
 * *log* - The Connectors instance sends errors to its log.
 * *dead letter queue* - The Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance.
 +
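
To make the *Message* and *Period* settings concrete, the Data Generator behaves much like a producer loop that publishes the configured message once per period. The sketch below imitates that behavior with KafkaJS; it is an illustration under the same placeholder assumptions as the earlier topic-creation sketch, not the connector's actual implementation.

[source,typescript]
----
import { Kafka } from 'kafkajs'

// Same assumed placeholders as the topic-creation sketch.
const kafka = new Kafka({
  clientId: 'data-generator-sketch',
  brokers: ['<bootstrap-server>:443'],
  ssl: true,
  sasl: { mechanism: 'plain', username: '<client-id>', password: '<client-secret>' },
})

async function run(): Promise<void> {
  const producer = kafka.producer()
  await producer.connect()
  // Period = 10000 ms: publish the configured message every 10 seconds.
  setInterval(() => {
    producer
      .send({ topic: 'test-topic', messages: [{ value: 'Hello World!' }] })
      .catch(console.error)
  }, 10000)
}

run().catch(console.error)
----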
@@ -232,7 +235,7 @@ In the next procedure, you can verify that the source Connectors instance is sen
 == Creating a Connectors instance for a data sink
 
 [role="_abstract"]
-A *sink* connector consumes messages from a Kafka topic and sends them to an external system.
+A *sink* connector consumes messages from a Kafka topic and sends them to an external system.
 
 For this example, you use the *HTTP Sink* connector, which consumes the Kafka messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.
 
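
Functionally, the *HTTP Sink* connector acts like a consumer that forwards each Kafka message as an HTTP request. Here is a rough KafkaJS sketch of that behavior; the webhook URL is a placeholder, the connection values are the same assumed placeholders as before, and the global `fetch` call assumes Node.js 18 or later.

[source,typescript]
----
import { Kafka } from 'kafkajs'

const kafka = new Kafka({
  clientId: 'http-sink-sketch',
  brokers: ['<bootstrap-server>:443'],
  ssl: true,
  sasl: { mechanism: 'plain', username: '<client-id>', password: '<client-secret>' },
})

async function run(): Promise<void> {
  const consumer = kafka.consumer({ groupId: 'http-sink-sketch' })
  await consumer.connect()
  await consumer.subscribe({ topic: 'test-topic', fromBeginning: true })
  await consumer.run({
    eachMessage: async ({ message }) => {
      // Forward each Kafka message to the HTTP endpoint, as the sink does.
      await fetch('<your-webhook-url>', {
        method: 'POST',
        headers: { 'Content-Type': 'text/plain' },
        body: message.value?.toString() ?? '',
      })
    },
  })
}

run().catch(console.error)
----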
@@ -244,18 +247,18 @@ ifndef::qs[]
 endif::[]
 
 .Procedure
-
-. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.
+
+. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.
 
 . Select the sink connector that you want to use:
-.. For example, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector.
-.. Click the *HTTP Sink connector* card and then click *Next*.
+.. For example, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector.
+.. Click the *HTTP Sink connector* card and then click *Next*.
 
-. Select the {product-kafka} instance for the connector to work with.
+. Select the {product-kafka} instance for the connector to work with.
 +
 For example, select *test* and then click *Next*.
 
-. On the *Namespace* page, click the *eval namespace* that you created when you created the source connector.
+. On the *Namespace* page, click the *eval namespace* that you created when you created the source connector.
 
 //. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
 //+
@@ -266,7 +269,7 @@ For example, select *test* and then click *Next*.
 . Click *Next*.
 
 . Provide the core configuration for your connector:
-.. Type a unique name for the connector.
+.. Type a unique name for the connector.
 .. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.
 
 . Provide the connector-specific configuration for your connector. For the *HTTP sink connector*, provide the following information:
@@ -280,7 +283,7 @@ For example, select *test* and then click *Next*.
 
 . Review the summary of the configuration properties and then click *Create Connectors instance*.
 +
-Your Connectors instance is listed in the table of Connectors.
+Your Connectors instance is listed in the table of Connectors.
 +
 After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this example, the data sink is the HTTP URL that you provided).
 
@@ -291,8 +294,10 @@ Open the browser tab to your custom URL for the link:https://webhook.site[webhoo
 
 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-long-connectors} Getting Started quick start.
docs/kafka/getting-started-kafka/README.adoc (+4 -0)
@@ -76,10 +76,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to create and set up your first Apache Kafka instance in {product-long-kafka}.
+====
 
 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka}. In this quick start, you'll learn how to create and inspect a Kafka instance, create a service account to connect an application or service to the instance, and create a topic in the instance.
 Learn how to use Kafka scripts to interact with a Kafka instance in {product-long-kafka}.
+====
 
 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Kafka scripts. In this quick start, you'll learn how to use the Kafka scripts to produce and consume messages for your Kafka instances in {product-kafka}.
 Congratulations! You successfully completed the {product-kafka} Kafka scripts quick start, and are now ready to produce and consume messages in the service.
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use Kafkacat to interact with a Kafka instance in {product-long-kafka}.
+====
 
 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Kafkacat. In this quick start, you'll learn how to use https://github.com/edenhill/kafkacat[Kafkacat^] to produce and consume messages for your Kafka instances in {product-kafka}.
+====
 endif::[]
 
 [id="proc-configuring-kafkacat_{context}"]
@@ -112,7 +116,7 @@ For more information about Kafkacat configuration options, see https://github.co
 
 NOTE: Kafkacat does not yet fully support SASL/OAUTHBEARER authentication, so connecting to a Kafka instance requires only the bootstrap server and the service account credentials for SASL/PLAIN authentication.
 
-NOTE: Kafkacat have been recently renamed to kcat. If you use latest version of kafkacat please replace all occurences referencing kafkacat binary to kcat
+NOTE: Kafkacat was recently renamed to kcat. If you use the latest version, replace all references to the kafkacat binary with kcat.
 
 .Prerequisites
 ifndef::qs[]
@@ -234,7 +238,9 @@ endif::[]
 
 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-kafka} Kafkacat quick start, and are now ready to produce and consume messages in the service.
docs/kafka/nodejs-kafka/README.adoc (+6 -0)
@@ -81,10 +81,14 @@ The example Node.js application in this quick start uses the https://kafka.js.or
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use Node.js applications to produce and consume messages using a Kafka instance in {product-long-kafka}.
+====
 
 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Node.js. In this quick start, you'll learn how to use the https://nodejs.org/en/about/[Node.js^] runtime to produce messages to and consume messages from your Kafka instances in {product-kafka}.
+====
 endif::[]
 
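
For orientation, the produce-and-consume pattern that this quick start builds up looks roughly like the sketch below. The broker and service-account values are placeholders and the topic name *test-topic* is an assumption; the quick start itself walks through the real configuration step by step.

[source,typescript]
----
import { Kafka } from 'kafkajs'

const kafka = new Kafka({
  clientId: 'nodejs-quickstart-sketch',
  brokers: ['<bootstrap-server>:443'],
  ssl: true,
  sasl: { mechanism: 'plain', username: '<client-id>', password: '<client-secret>' },
})

async function main(): Promise<void> {
  // Produce a single message, then read it back.
  const producer = kafka.producer()
  await producer.connect()
  await producer.send({ topic: 'test-topic', messages: [{ value: 'hello from Node.js' }] })
  await producer.disconnect()

  const consumer = kafka.consumer({ groupId: 'nodejs-quickstart-sketch' })
  await consumer.connect()
  await consumer.subscribe({ topic: 'test-topic', fromBeginning: true })
  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`${topic}[${partition}]: ${message.value?.toString()}`)
    },
  })
}

main().catch(console.error)
----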
@@ -324,7 +328,9 @@ endif::[]
 
 ifdef::qs[]
 [#conclusion]
+====
 Congratulations! You successfully completed the {product-kafka} Node.js quick start. You're now ready to use your own Node.js applications with {product-kafka}.
docs/kafka/quarkus-kafka/README.adoc (+6 -0)
@@ -77,10 +77,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use Quarkus applications to produce messages to and consume messages from a Kafka instance in {product-long-kafka}.
+====
 
 [#introduction]
+====
 Welcome to the quick start for {product-long-kafka} with Quarkus. In this quick start, you'll learn how to use https://quarkus.io/[Quarkus^] to produce messages to and consume messages from your Kafka instances in {product-kafka}.
 Congratulations! You successfully completed the {product-kafka} Quarkus quick start, and are now ready to use your own Quarkus applications with {product-kafka}.
docs/registry/getting-started-registry/README.adoc (+6 -0)
@@ -76,10 +76,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to create and set up your first {registry} instance in {product-long-registry}.
+====
 
 [#introduction]
+====
 Welcome to the quick start for {product-long-registry}. In this quick start, you'll learn how to create and view a {registry} instance, create a schema in this instance, and create a service account to connect an application or service to this instance.
docs/registry/quarkus-registry/README.adoc (+6 -0)
@@ -82,10 +82,14 @@ endif::[]
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
 ifdef::qs[]
 [#description]
+====
 Learn how to use a Quarkus application to produce messages to and consume messages from a Kafka instance in {product-long-kafka}, and to manage the message schemas in {product-long-registry}.
+====
 
 [#introduction]
+====
 Welcome to the quick start for {product-long-registry} with Quarkus. In this quick start, you'll learn how to use https://quarkus.io/[Quarkus^] to produce messages to and consume messages from your Kafka instances in {product-kafka} and manage the message schemas in {product-long-registry}.
 Congratulations! You successfully completed the {product-kafka} and {registry} Quarkus quick start, and are now ready to use your own Quarkus application with {product-kafka} and {registry}.