docs/connectors/getting-started-connectors/README.adoc (31 additions, 22 deletions)
@@ -114,29 +114,34 @@ endif::[]
== Configuring the {product-kafka} instance for use with {product-long-connectors}

[role="_abstract"]
After you create a {product-kafka} instance, configure it for use with {product-long-connectors} by performing the following tasks:

* Create *Kafka topics* to store messages sent by producers (data sources) and make them available to consumers (data sinks).
* Create *service accounts* that allow you to connect and authenticate your Connectors with Kafka instances.
* Set up *access rules* for the service accounts that define how your Connectors can access and use the associated Kafka instance topics.

The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.

For this example, you create one Kafka topic, named *test-topic*, one service account, and you define access for the service account.
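
The procedure below uses the web console. If you prefer to script the topic creation instead, the following is a minimal sketch using the `kafka-python` admin client. It assumes you already have a service account that is allowed to create topics; the bootstrap server URL and credential values are placeholders that you must replace with your own Kafka instance's connection details.

[source,python]
----
# Hedged sketch: create the example topic programmatically instead of in the console.
# All angle-bracket values are placeholders; replace them with your Kafka instance's
# bootstrap server and your service account's client ID and secret.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(
    bootstrap_servers="<bootstrap-server-host>:443",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="<client-id>",
    sasl_plain_password="<client-secret>",
)

# One partition and a replication factor of 3 approximate the console defaults
# (an assumption; adjust to match your instance).
admin.create_topics([NewTopic(name="test-topic", num_partitions=1, replication_factor=3)])
admin.close()
----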

ifdef::qs[]
.Prerequisites
* You've created a {product-kafka} instance and the instance is in the *Ready* state.
endif::[]

ifndef::qs[]
.Prerequisites
* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
* You've created a {product-kafka} instance and the instance is in the *Ready* state.
For instructions on how to create a Kafka instance, see _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_.
endif::[]

.Procedure
. Create a Kafka topic for your connectors:
.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the name of the {product-kafka} instance that you created for connectors.
.. Select the *Topics* tab, and then click *Create topic*.
.. Type a unique name for your topic. For example, type *test-topic* for *Topic Name*.
.. Accept the default settings for partitions, message retention, and replicas.
. Create a service account for connectors:
.. In the web console, select *Service Accounts*, and then click *Create service account*.
@@ -173,12 +178,12 @@ For this example, you create an instance of the *Data Generator* source connecto
You configure your connector to listen for events from the data source and produce a Kafka message for each event.

The connector sends the messages at regular intervals to the Kafka topic that you created for connectors.

ifndef::qs[]
.Prerequisites
* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
* You configured a {product-kafka} instance for connectors as described in _Configuring the {product-kafka} instance for use with {product-long-connectors}_.

endif::[]
@@ -192,9 +197,9 @@ For example, to find the *Data Generator* source connector, type *data* in the s
+
Click the card to select the connector, and then click *Next*.

. For *Kafka instance*, click the card for the {product-kafka} instance that you configured for connectors, and then click *Next*.
+
NOTE: If you have not already configured a {product-kafka} instance for connectors, click *Create Kafka instance*. You must also set up and define access for a service account as described in _Configuring the {product-kafka} instance for use with {product-long-connectors}_.

. On the *Namespace* page, click *Register eval namespace* to provision a namespace for hosting the Connectors instances that you create.
+
@@ -209,13 +214,13 @@ NOTE: If you have not already configured a {product-kafka} instance for Connecto
. Configure the core configuration for your connector:
.. Provide a name for the connector.
.. Type the *Client ID* and *Client Secret* of the service account that you created for connectors, and then click *Next*.

. Provide connector-specific configuration. For the *Data Generator*, provide the following information:
.. *Data shape Format*: Accept the default, `application/octet-stream`.
.. *Topic Names*: Type the name of the topic that you created for connectors. For example, type *test-topic*.
.. *Content Type*: Accept the default, `text/plain`.
.. *Message*: Type the content of the message that you want the Connectors instance to send to the Kafka topic. For example, type `Hello World!`.
.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For example, specify `10000` to send a message every 10 seconds. (For a sketch of what this producer behavior looks like in code, see the example after this procedure.)

. Optionally, configure the error handling policy for your Connectors instance.
@@ -226,22 +231,25 @@ The options are:
* *log* - The Connectors instance sends errors to its log.
* *dead letter queue* - The Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance.
+
For example, accept the default *stop* option.

. Click *Next*.

. Review the summary of the configuration properties and then click *Create Connectors instance*.
+
Your Connectors instance is listed in the table of connectors. After a couple of seconds, the status of your Connectors instance changes to the *Ready* state and it starts producing messages and sending them to its associated Kafka topic.
+
From the connectors table, you can stop, start, and delete your Connectors instance, as well as edit its configuration, by clicking the options icon (three vertical dots).
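
To make the *Message* and *Period* settings concrete, the Data Generator behaves roughly like a client that publishes the configured message once per period. The following is an illustrative sketch only (not the connector's actual implementation), using `kafka-python` with placeholder connection values:

[source,python]
----
# Illustration only: roughly what the Data Generator source connector does with
# Message="Hello World!" and Period=10000 ms. Replace the placeholders with your
# Kafka instance's bootstrap server and your service-account credentials.
import time
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="<bootstrap-server-host>:443",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="<client-id>",
    sasl_plain_password="<client-secret>",
)

while True:
    producer.send("test-topic", value=b"Hello World!")  # Message, sent as text/plain
    producer.flush()
    time.sleep(10)  # Period: 10000 milliseconds
----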

.Verification

* Does your source Connectors instance generate messages?

.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the Kafka instance that you created for connectors.
.. Click the *Topics* tab and then click the topic that you specified for your source Connectors instance.
.. Click the *Messages* tab to see a list of `Hello World!` messages.
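
As an alternative to the *Messages* tab, you can read the topic from a terminal. A minimal consumer sketch with `kafka-python` might look like the following; the connection values are placeholders, and the service account needs permission to read the topic:

[source,python]
----
# Minimal sketch: read the generated messages from the example topic.
# Replace the placeholders with your Kafka bootstrap server and
# service-account credentials.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "test-topic",
    bootstrap_servers="<bootstrap-server-host>:443",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="<client-id>",
    sasl_plain_password="<client-secret>",
    auto_offset_reset="earliest",   # start from the oldest available message
    consumer_timeout_ms=15000,      # stop iterating if no message arrives for 15 s
)

for message in consumer:
    print(message.value.decode("utf-8"))  # expect "Hello World!"
----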

[id="proc-creating-sink-connector_{context}"]
== Creating a Connectors instance for a data sink
@@ -301,7 +309,8 @@ After a couple of seconds, the status of your Connectors instance changes to the

.Verification

Open the browser tab that shows your custom link:https://webhook.site[webhook.site^] URL.
Do you see HTTP POST calls with `Hello World!` messages?
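
If you would rather verify against an endpoint that you control instead of webhook.site, a minimal HTTP receiver using only the Python standard library could look like the following sketch. The port is arbitrary, and for a managed sink connector to reach it, the endpoint would have to be publicly accessible (for example, through a tunnel):

[source,python]
----
# Minimal sketch: a local HTTP endpoint that prints the body of each POST it receives.
# Note: localhost alone is not reachable by a managed sink connector; expose the
# endpoint publicly (for example, via a tunnel) before using it as the sink URL.
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8")
        print(f"Received POST: {body}")  # expect "Hello World!"
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
----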