* JC-589 connectors qs update
* JC-589 more copyedits for quick start
* Update on > in
Co-authored-by: Ben Hardesty <[email protected]>
As a developer of applications and services, you can use {product-long-connectors} to create and configure connections between OpenShift Streams for Apache Kafka and third-party systems.

In this quick start, you connect a data source (a data generator) that creates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.

// Condition out QS-only content so that it doesn't appear in docs.
// All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.

== Configuring the {product-kafka} instance for use with {product-long-connectors}
[role="_abstract"]
Configure your {product-kafka} instance for use with {product-long-connectors} by:

* Creating *Kafka topics* to store messages sent by producers (data sources) and make them available to consumers (data sinks).
* Creating *service accounts* that allow you to connect and authenticate your Connectors with Kafka instances.
* Setting up *access rules* for the service accounts that define how your Connectors can access and use the associated Kafka instance topics.

The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.

For this quick start, you create one Kafka topic, named *test-topic*, one service account, and you define access for the service account.
ifndef::qs[]
.Prerequisites
For instructions on how to create a Kafka instance, see _{base-url}{getting-star
endif::[]
.Procedure
. Create a Kafka topic for your connectors:
.. In the {service-url-connectors}[^] web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the name of the Kafka instance that you want to add a topic to.
.. Select the *Topics* tab, and then click *Create topic*.
.. Type a unique name for your topic. For this quick start, type *test-topic* for the *Topic Name*.
.. Accept the default settings for partitions, message retention, and replicas.
. Create a service account for connectors:
.. In the web console, select *Service Accounts*, and then click *Create service account*.
.. Type a unique service account name (for example, *test-service-acct*), and then click *Create*.
.. Copy the generated *Client ID* and *Client Secret* to a secure location. You'll use these credentials to configure connections to this service account.
.. Select the *I have copied the client ID and secret* option, and then click *Close*.
. Set the level of access for your new service account in the Access Control List (ACL) of the Kafka instance:
.. Select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the name of the Kafka instance that you want the service account to access.
.. Click the *Access* tab to view the current ACL for the Kafka instance and then click *Manage access*.
.. From the *Account* drop-down menu, select the service account that you created in Step 2, and then click *Next*.
.. Under *Assign Permissions*, use the drop-down menu to select the *Consume from a topic* and the *Produce to a topic* permission options, and then set all resource identifiers to `is` and all identifier values to `"*"`.
+
The `is "*"` settings enable connectors that are configured with the service account credentials to produce and consume messages in any topic in the Kafka instance.
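The matching behavior of these ACL settings can be sketched as a small model. This is an illustration only, with hypothetical names (`AclEntry`, `allowed`); the actual service evaluates ACLs server-side.

```python
# Illustrative model of topic ACL matching -- not the real Streams for
# Apache Kafka implementation. An entry whose resource identifier is
# `is` with value "*" matches every topic in the Kafka instance.
from dataclasses import dataclass

@dataclass
class AclEntry:
    permission: str   # e.g. "consume" or "produce"
    condition: str    # "is" or "starts with"
    value: str        # a topic name, a prefix, or "*"

    def matches(self, topic: str) -> bool:
        if self.condition == "is":
            return self.value == "*" or self.value == topic
        if self.condition == "starts with":
            return topic.startswith(self.value)
        return False

# The permissions assigned in the procedure above: is "*" for both options.
acl = [AclEntry("consume", "is", "*"), AclEntry("produce", "is", "*")]

def allowed(acl, permission, topic):
    return any(e.permission == permission and e.matches(topic) for e in acl)

print(allowed(acl, "produce", "test-topic"))  # True
```

With the `is "*"` entries, any topic name is allowed for both producing and consuming, which is why a connector configured with this service account can use the *test-topic* topic without a topic-specific rule.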
== Creating a Connectors instance for a data source
[role="_abstract"]
A *source* connector consumes events from an external data source and produces Kafka messages.

For this quick start, you create an instance of the *Data Generator* source connector.

You configure your connector to listen for events from the data source and produce a Kafka message for each event.

The connector sends the messages at regular intervals to the Kafka topic that you created for Connectors.

ifndef::qs[]
.Prerequisites
endif::[]
.Procedure
. In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.

. Select the connector that you want to use for a data source.
+
You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
+
For example, to find the source connector for this quick start, type *data* in the search box. The list filters to show only the *Data Generator Connector* card, which is the source connector for this quick start.
+
Click the card to select the connector, and then click *Next*.

. For the *Kafka instance*, click the card for the {product-kafka} instance that you configured for Connectors, and then click *Next*.
+
NOTE: If you have not already configured a {product-kafka} instance for Connectors, you can create a new Kafka instance by clicking *Create Kafka instance*. You must also set up and define access for a service account, as described in _Configuring the {product-kafka} instance for use with {product-long-connectors}_.

. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
+
If you are using a trial cluster in your own OpenShift Dedicated environment, select the card for the namespace that was created when you added the Connectors service to your trial cluster, as described in _https://access.redhat.com/documentation/en-us/red_hat_openshift_connectors/TBD[Adding the OpenShift Connectors service to an OpenShift Dedicated trial cluster^]_.
//need to update this link with correct URL
+
If you are using the evaluation OpenShift Dedicated environment, click *Register eval namespace* to provision a namespace for hosting the Connectors instances that you create.

. Click *Next*.
. Provide the core configuration for your connector:
.. Type a unique name for the connector.
.. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.

. Provide the connector-specific configuration. For the *Data Generator* connector, provide the following information:
.. *Data shape Format*: Accept the default, `application/octet-stream`.
.. *Topic Names*: Type the name of the topic that you created for Connectors. For this quick start, type *test-topic*.
.. *Content Type*: Accept the default, `text/plain`.
.. *Message*: Type the content of the message that you want the Connectors instance to send to the Kafka topic. For this quick start, type `Hello World!`.
.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For this quick start, specify `10000` to send a message every 10 seconds.
. Optionally, configure the error handling policy for your Connectors instance.
+
The options are:
+
* *stop* - (the default) The Connectors instance shuts down when it encounters an error.
* *log* - The Connectors instance sends errors to its log.
* *dead letter queue* - The Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance.
+
For this quick start, select *log*.
. Click *Next*.

. Review the summary of the configuration properties and then click *Create Connectors instance*.
+
Your Connectors instance is listed in the table of Connectors. After a couple of seconds, the status of your Connectors instance changes to the *Ready* state and it starts producing messages and sending them to its associated Kafka topic.
+
From the connectors table, you can stop, start, and delete your Connectors instance, as well as edit its configuration, by clicking the options icon (three vertical dots).
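The *Message* and *Period* settings amount to a simple produce loop: send the configured payload once per interval. The following sketch illustrates that logic only; the `send` callback is a hypothetical stand-in for producing to the Kafka topic, not part of the Data Generator connector itself.

```python
import time

def generate_messages(message: str, period_ms: int, count: int, send, sleep=time.sleep):
    """Send `message` every `period_ms` milliseconds, `count` times.

    `send` stands in for producing a record to the Kafka topic; `sleep`
    is injectable so the example can run without real delays.
    """
    for _ in range(count):
        send(message)
        sleep(period_ms / 1000.0)

sent = []
# Quick-start settings: `Hello World!` every 10000 ms (no-op sleep here
# so the example runs instantly).
generate_messages("Hello World!", 10000, 3, sent.append, sleep=lambda s: None)
print(sent)  # ['Hello World!', 'Hello World!', 'Hello World!']
```

With the real connector, each `send` becomes a Kafka record in *test-topic*, which the sink Connectors instance in the next procedure consumes.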
.Verification
ifdef::qs[]
In the next procedure, you can verify that the source Connectors instance is sending messages.
endif::[]

== Creating a Connectors instance for a data sink
[role="_abstract"]
A *sink* connector consumes messages from a Kafka topic and sends them to an external system.

For this quick start, you use the *HTTP Sink* connector, which consumes the Kafka messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.
ifndef::qs[]
.Prerequisites
* You're logged in to the web console at {service-url-connectors}[^].
* You created the source Connectors instance as described in _{base-url}{getting-started-url-conectors}/proc-creating-source-connector_getting-started-connectors[Creating a Connectors instance for a data source^]_.
* For the data sink example in this quick start, open the free https://webhook.site[Webhook.site^] in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
endif::[]
.Procedure
. In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.
. Select the sink connector that you want to use:
.. For this quick start, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector, which is the sink connector for this quick start.
.. Click the *HTTP Sink connector* card and then click *Next*.

. Select the {product-kafka} instance for the connector to work with.
+
For this quick start, select *test* and then click *Next*.

. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
+
If you are using a trial cluster on your own OpenShift Dedicated environment, select the card for the namespace that was created when you added the Connectors service to your trial cluster.
+
If you are using the evaluation OpenShift Dedicated environment, click the *eval namespace* that you provisioned when you created the source Connectors instance.

. Click *Next*.

. Provide the core configuration for your connector:
.. Type a unique name for the connector.
.. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.

. Provide the connector-specific configuration for your connector. For the *HTTP Sink* connector, provide the following information:
.. *Data shape Format*: Accept the default, `application/octet-stream`.
.. *Method*: Accept the default, `POST`.
.. *URL*: Type your unique URL from link:https://webhook.site[Webhook.site^].
.. *Topic Names*: Type the name of the topic that you used for the source Connectors instance. For this quick start, type *test-topic*.

. Optionally, configure the error handling policy for your Connectors instance. For this quick start, select *log* and then click *Next*.

. Review the summary of the configuration properties and then click *Create Connectors instance*.
+
Your Connectors instance is listed in the table of Connectors.
+
After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this quick start, the data sink is the HTTP URL that you provided).
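Conceptually, the sink's job is to POST each consumed message payload to the configured *URL*. The following sketch models only that behavior, under stated assumptions: a throwaway local HTTP server stands in for your Webhook.site URL, and `http_sink` is a hypothetical helper, not the managed connector's code.

```python
# Minimal sketch of an HTTP sink: POST each Kafka message payload to an
# endpoint. A local server stands in for the Webhook.site URL here.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []

class Sink(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received.append(body.decode())
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the example quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Sink)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}"

def http_sink(payload: str, endpoint: str) -> int:
    """POST one message payload (Content-Type text/plain) to the endpoint."""
    req = urllib.request.Request(
        endpoint, data=payload.encode(),
        headers={"Content-Type": "text/plain"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status

status = http_sink("Hello World!", url)
server.shutdown()
print(status, received)  # 200 ['Hello World!']
```

On the real Webhook.site page, each such POST appears as a new request entry, which is how you verify end-to-end message flow in the next procedure.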
docs/connectors/getting-started-connectors/quickstart.yml
spec:
  description: !snippet README.adoc#description
  prerequisites:
    - A Red Hat identity
    - You've created a Kafka instance and the instance is in `Ready` state. See the _Getting started with OpenShift Streams for Apache Kafka_ quick start.
    - For the data sink example in this quick start, open the free Webhook.site (https://webhook.site) in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.