As a developer of applications and services, you can use {product-long-connectors} to create and configure connections between OpenShift Streams for Apache Kafka and third-party systems.
-In this quick start, you connect a data source (a data generator) that creates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.
+In this example, you connect a data source (a data generator) that creates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.
// Condition out QS-only content so that it doesn't appear in docs.
// All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
@@ -87,7 +87,7 @@ ifndef::qs[]
You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.
-The following diagram illustrates how data flows from a source of data through a data source connector to a Kafka topic. And how data flows from a kafka topic to a data sink connector to a data sink.
+The following diagram illustrates how data flows from a data source through a data source connector to a Kafka topic, and how data flows from a Kafka topic through a data sink connector to a data sink.
[.screencapture]
.{product-long-connectors} data flow
@@ -107,21 +107,21 @@ Configure your {product-kafka} instance for use with {product-long-connectors} b
The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.
-For this quick start, you create one Kafka topic, named *test*, one service account, and you define access for the service account.
+For this example, you create one Kafka topic named *test-topic* and one service account, and you define access for the service account.
ifndef::qs[]
.Prerequisites
-* You're logged in to the web console at {service-url-connectors}[^].
+* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
* You've created a {product-kafka} instance and the instance is in the *Ready* state.
116
116
For instructions on how to create a Kafka instance, see _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_.
endif::[]
.Procedure
-. Create a Kakfa topic for your connectors:
-.. In the {service-url-connectors}[^] web console, select *Streams for Apache Kafka* > *Kafka Instances*.
+. Create a Kafka topic for your connectors:
+.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the name of the Kafka instance that you want to add a topic to.
.. Select the *Topics* tab, and then click *Create topic*.
-.. Type a unique name for your topic. For this quick start, type *test-topic* for the *Topic Name*.
+.. Type a unique name for your topic. For example, type *test-topic* for the *Topic Name*.
.. Accept the default settings for partitions, message retention, and replicas.
. Create a service account for connectors:
.. In the web console, select *Service Accounts*, and then click *Create service account*.
@@ -132,9 +132,11 @@ endif::[]
. Set the level of access for your new service account in the Access Control List (ACL) of the Kafka instance:
.. Select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the name of the Kafka instance that you want the service account to access.
-.. Click the *Access* tab to view the current ACL for the Kakfa instance and then click *Manage access*.
+.. Click the *Access* tab to view the current ACL for the Kafka instance and then click *Manage access*.
.. From the *Account* drop-down menu, select the service account that you created in Step 2, and then click *Next*.
-.. Under *Assign Permissions*, use the drop-down menu to select the *Consume from a topic* and the *Produce to a topic* permission options, and then set all resource identifiers to `is` and all identifier values to `"*"`.
+.. Under *Assign Permissions*, click *Add permission*.
+.. From the drop-down menu, select *Consume from a topic*. Set all resource identifiers to `is` and all identifier values to `"*"`.
+.. From the *Add permission* drop-down menu, select *Produce to a topic*. Set all resource identifiers to `is` and all identifier values to `"*"`.
+
The `is "*"` settings enable connectors that are configured with the service account credentials to produce and consume messages in any topic in the Kafka instance.
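If you want to confirm the ACL settings outside the web console, you can connect a Kafka client with the service account credentials and produce a test message. The following is a minimal sketch using the `kafka-python` package; the bootstrap server host, client ID, and client secret are placeholders for your own values, and SASL/PLAIN over TLS is assumed as the authentication mechanism.

[source,python]
----
# Optional sanity check: produce one message using the service account.
# Placeholders: replace with your Kafka instance's bootstrap server and
# the Client ID/Client Secret of the service account that you created.
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="<bootstrap-server-host>:443",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",             # assumed mechanism
    sasl_plain_username="<client-id>",
    sasl_plain_password="<client-secret>",
)
# This send succeeds only if the ACL grants "Produce to a topic".
producer.send("test-topic", b"ACL check")
producer.flush()
producer.close()
----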
@@ -152,26 +154,26 @@ endif::[]
[role="_abstract"]
A *source* connector consumes events from an external data source and produces Kafka messages.
-For this quick start, you create an instance of the *Data Generator* source connector.
+For this example, you create an instance of the *Data Generator* source connector.
You configure your connector to listen for events from the data source and produce a Kafka message for each event.
The connector sends the messages at regular intervals to the Kafka topic that you created for Connectors.
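After you create the Connectors instance in the procedure below, one way to watch the generated messages arrive is with a small consumer. Here is a minimal sketch with the `kafka-python` package, using the same assumed SASL/PLAIN settings and placeholder credentials as before.

[source,python]
----
# Optional: watch the messages that the Data Generator produces.
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "test-topic",
    bootstrap_servers="<bootstrap-server-host>:443",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="<client-id>",
    sasl_plain_password="<client-secret>",
    auto_offset_reset="earliest",
    group_id="quickstart-watcher",  # hypothetical consumer group name
)
# With a period of 10000 ms, expect one message roughly every 10 seconds.
for record in consumer:
    print(record.timestamp, record.value.decode("utf-8"))
----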
ifndef::qs[]
.Prerequisites
-* You're logged in to the {service-url-connectors}[^] web console.
+* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
* You configured a Kafka instance for Connectors as described in _{base-url}{getting-started-url-conectors}/proc-configuring-kafka-for-connectors_getting-started-connectors[Configuring the {product-kafka} instance for use with {product-long-connectors}^]_.
endif::[]
.Procedure
-. In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.
+. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.
. Select the connector that you want to use for a data source.
+
You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
+
-For example, to find the source connector for this quick start, type *data* in the search box. The list filters to show only the *Data Generator Connector* card, which is the source connector for this quick start.
+For example, to find the *Data Generator* source connector, type *data* in the search box. The list filters to show only the *Data Generator Connector* card.
+
Click the card to select the connector, and then click *Next*.
@@ -189,15 +191,15 @@ If you are using the evaluation OpenShift Dedicated environment, click *Register
. Click *Next*.
. Configure the core configuration for your connector:
-.. Provide a unique name for the connector.
+.. Provide a name for the connector.
.. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.
. Provide connector-specific configuration. For the *Data Generator*, provide the following information:
.. *Data shape Format*: Accept the default, `application/octet-stream`.
-.. *Topic Names*: Type the name of the topic that you created for Connectors. For this quick start, type *test-topic*.
+.. *Topic Names*: Type the name of the topic that you created for Connectors. For example, type *test-topic*.
.. *Content Type*: Accept the default, `text/plain`.
-.. *Message*: Type the content of the message that you want the Connector instance to send to the Kafka topic. For this quick start, type `Hello World!`.
-.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to sends messages to the Kafka topic. For this quick start, specify `10000`, to send a message every 10 seconds.
+.. *Message*: Type the content of the message that you want the Connectors instance to send to the Kafka topic. For example, type `Hello World!`.
+.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For example, specify `10000` to send a message every 10 seconds.
. Optionally, configure the error handling policy for your Connectors instance.
+
@@ -207,7 +209,7 @@ The options are:
* *log* - The Connectors instance sends errors to its log.
* *dead letter queue* - The Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance.
+
-For this quick start, select *log*.
+For example, select *log*.
. Click *Next*.
@@ -230,26 +232,26 @@ In the next procedure, you can verify that the source Connectors instance is sen
[role="_abstract"]
A *sink* connector consumes messages from a Kafka topic and sends them to an external system.
-For this quick start, you use the *HTTP Sink* connector which consumes the Kakfa messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.
+For this example, you use the *HTTP Sink* connector, which consumes the Kafka messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.
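Webhook.site is convenient because it requires no setup. If you would rather run your own endpoint, any URL that accepts POST requests and that the Connectors instance can reach works. Here is a minimal sketch using only the Python standard library; the port is arbitrary, and you must expose the host publicly for the cloud-hosted connector to reach it.

[source,python]
----
# Minimal HTTP data sink: prints the body of each POST request.
from http.server import BaseHTTPRequestHandler, HTTPServer

class SinkHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read exactly as many bytes as the request declares.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print("Received:", body.decode("utf-8", errors="replace"))
        # Acknowledge the message so the connector treats it as delivered.
        self.send_response(200)
        self.end_headers()

# Listens on port 8080 on all interfaces.
HTTPServer(("", 8080), SinkHandler).serve_forever()
----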
ifndef::qs[]
.Prerequisites
-* You're logged in to the web console at {service-url-connectors}[^].
+* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
* You created the source Connectors instance as described in _{base-url}{getting-started-url-conectors}/proc-creating-source-connector_getting-started-connectors[Creating a Connectors instance for a data source^]_.
-* For the data sink example in this quick start, open the free https://webhook.site[Webhook.site^] in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
+* For the data sink example, open the free https://webhook.site[Webhook.site^] in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
endif::[]
.Procedure
-. In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.
+. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.
. Select the sink connector that you want to use:
-.. For this quick start, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector, which is the sink connector for this quick start.
+.. For example, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector.
.. Click the *HTTP Sink connector* card and then click *Next*.
. Select the {product-kafka} instance for the connector to work with.
+
-For this quick start, select *test* and then click *Next*.
+For example, select *test* and then click *Next*.
. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
+
@@ -268,15 +270,15 @@ If you are using the evaluation OpenShift Dedicated environment, click the *eval
.. *Data shape Format*: Accept the default, `application/octet-stream`.
.. *Method*: Accept the default, `POST`.
.. *URL*: Type your unique URL from the link:https://webhook.site[webhook.site^].
-.. *Topic Names*: Type the name of the topic that you used for the source Connectors instance. For this quick start, type *test-topic*.
+.. *Topic Names*: Type the name of the topic that you used for the source Connectors instance. For example, type *test-topic*.
-. Optionally, configure the error handling policy for your Connectors instance. For this quick start, select *log* and then click *Next*.
+. Optionally, configure the error handling policy for your Connectors instance. For example, select *log* and then click *Next*.
. Review the summary of the configuration properties and then click *Create Connectors instance*.
+
Your Connectors instance is listed in the table of Connectors.
+
-After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this quick start, the data sink is the HTTP URL that you provided).
+After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this example, the data sink is the HTTP URL that you provided).
docs/connectors/getting-started-connectors/quickstart.yml
@@ -3,6 +3,7 @@ kind: QuickStarts
metadata:
name: connectors-getting-started
annotations:
+draft: true
order: 6
spec:
version: 0.1
@@ -17,7 +18,7 @@ spec:
prerequisites:
- A Red Hat identity
- You've created a Kafka instance and the instance is in `Ready` state. See the _Getting started with OpenShift Streams for Apache Kafka_ quick start.
-- For the data sink example in this quick start, open the free Webhook.site (https://webhook.site) in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
+- For the data sink example, open the free Webhook.site (https://webhook.site) in a browser window. The Webhook.site page provides a unique URL that you copy. You use this URL as an HTTP data sink.