Commit e6273ee

JC-590 edits to Connectors qs (#463)

* JC-590 edits to Connectors qs
* JC-590 set qs to draft

1 parent b846b5a commit e6273ee

File tree

2 files changed

+31
-28
lines changed

2 files changed

+31
-28
lines changed

docs/connectors/getting-started-connectors/README.adoc

Lines changed: 29 additions & 27 deletions
@@ -63,7 +63,7 @@ ifdef::context[:parent-context: {context}]
 [role="_abstract"]
 As a developer of applications and services, you can use {product-long-connectors} to create and configure connections between OpenShift Streams for Apache Kafka and third-party systems.

-In this quick start, you connect a data source (a data generator) that creates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.
+In this example, you connect a data source (a data generator) that creates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.

 // Condition out QS-only content so that it doesn't appear in docs.
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
@@ -87,7 +87,7 @@ ifndef::qs[]

 You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.

-The following diagram illustrates how data flows from a source of data through a data source connector to a Kafka topic. And how data flows from a kafka topic to a data sink connector to a data sink.
+The following diagram illustrates how data flows from a data source through a data source connector to a Kafka topic. And how data flows from a Kafka topic to a data sink through a data sink connector.

 [.screencapture]
 .{product-long-connectors} data flow
@@ -107,21 +107,21 @@ Configure your {product-kafka} instance for use with {product-long-connectors} b

 The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.

-For this quick start, you create one Kafka topic, named *test*, one service account, and you define access for the service account.
+For this example, you create one Kafka topic, named *test*, one service account, and you define access for the service account.

 ifndef::qs[]
 .Prerequisites
-* You're logged in to the web console at {service-url-connectors}[^].
+* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
 * You've created a {product-kafka} instance and the instance is in the *Ready* state.
 For instructions on how to create a Kafka instance, see _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_.
 endif::[]

 .Procedure
-. Create a Kakfa topic for your connectors:
-.. In the {service-url-connectors}[^] web console, select *Streams for Apache Kafka* > *Kafka Instances*.
+. Create a Kafka topic for your connectors:
+.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
 .. Click the name of the Kafka instance that you want to add a topic to.
 .. Select the *Topics* tab, and then click *Create topic*.
-.. Type a unique name for your topic. For this quick start, type *test-topic* for the *Topic Name*.
+.. Type a unique name for your topic. For example, type *test-topic* for the *Topic Name*.
 .. Accept the default settings for partitions, message retention, and replicas.
 . Create a service account for connectors:
 .. In the web console, select *Service Accounts*, and then click *Create service account*.
@@ -132,9 +132,11 @@ endif::[]
 . Set the level of access for your new service account in the Access Control List (ACL) of the Kafka instance:
 .. Select *Streams for Apache Kafka* > *Kafka Instances*.
 .. Click the name of the Kafka instance that you want the service account to access.
-.. Click the *Access* tab to view the current ACL for the Kakfa instance and then click *Manage access*.
+.. Click the *Access* tab to view the current ACL for the Kafka instance and then click *Manage access*.
 .. From the *Account* drop-down menu, select the service account that you created in Step 2, and then click *Next*.
-.. Under *Assign Permissions*, use the drop-down menu to select the *Consume from a topic* and the *Produce to a topic* permission options, and then set all resource identifiers to `is` and all identifier values to `"*"`.
+.. Under *Assign Permissions*, click *Add permission*.
+.. From the drop-down menu, select *Consume from a topic*. Set all resource identifiers to `is` and all identifier values to `"*"`.
+.. From the *Add permission* drop-down menu, select *Produce to a topic*. Set all resource identifiers to `is` and all identifier values to `"*"`.
 +
 The `is "*"` settings enable connectors that are configured with the service account credentials to produce and consume messages in any topic in the Kafka instance.

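The `is` / `"*"` pairing in the ACL step can be read as a resource-pattern match: a pattern whose value is `*` applies to every resource name, while any other value requires an exact match. A minimal sketch of that matching logic, for illustration only (the helper name is hypothetical, not the broker's implementation):

```python
# Illustrative sketch of how an ACL resource identifier ("is" or
# "starts with") and value are matched against a topic name.
# This mimics the UI semantics; it is not Kafka's own ACL code.

def acl_matches(pattern_type: str, pattern_value: str, topic: str) -> bool:
    """Return True if an ACL resource pattern applies to the given topic."""
    if pattern_type == "is":
        # "is" with the wildcard value "*" matches any topic;
        # otherwise it requires an exact name match.
        return pattern_value == "*" or pattern_value == topic
    if pattern_type == "starts with":
        return topic.startswith(pattern_value)
    raise ValueError(f"unknown pattern type: {pattern_type}")

# The ACL configured above: `is` with value "*" covers every topic.
print(acl_matches("is", "*", "test-topic"))               # True
print(acl_matches("is", "test-topic", "other-topic"))     # False
print(acl_matches("starts with", "test-", "test-topic"))  # True
```

This is why the doc notes that the `is "*"` settings let the service account produce to and consume from any topic in the Kafka instance.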
@@ -152,26 +154,26 @@ endif::[]
 [role="_abstract"]
 A *source* connector consumes events from an external data source and produces Kafka messages.

-For this quick start, you create an instance of the *Data Generator* source connector.
+For this example, you create an instance of the *Data Generator* source connector.

 You configure your connector to listen for events from the data source and produce a Kafka message for each event.

 The connector sends the messages at regular intervals to the Kafka topic that you created for Connectors.

 ifndef::qs[]
 .Prerequisites
-* You're logged in to the {service-url-connectors}[^] web console.
+* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
 * You configured a Kafka instance for Connectors as described in _{base-url}{getting-started-url-conectors}/proc-configuring-kafka-for-connectors_getting-started-connectors[Configuring the {product-kafka} instance for use with {product-long-connectors}^]_.

 endif::[]

 .Procedure
-. In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.
+. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.
 . Select the connector that you want to use for a data source.
 +
 You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
 +
-For example, to find the source connector for this quick start, type *data* in the search box. The list filters to show only the *Data Generator Connector* card, which is the source connector for this quick start.
+For example, to find the *Data Generator* source connector, type *data* in the search box. The list filters to show only the *Data Generator Connector* card.
 +
 Click the card to select the connector, and then click *Next*.

@@ -189,15 +191,15 @@ If you are using the evaluation OpenShift Dedicated environment, click *Register
 . Click *Next*.

 . Configure the core configuration for your connector:
-.. Provide a unique name for the connector.
+.. Provide a name for the connector.
 .. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.

 . Provide connector-specific configuration. For the *Data Generator*, provide the following information:
 .. *Data shape Format*: Accept the default, `application/octet-stream`.
-.. *Topic Names*: Type the name of the topic that you created for Connectors. For this quick start, type *test-topic*.
+.. *Topic Names*: Type the name of the topic that you created for Connectors. For example, type *test-topic*.
 .. *Content Type*: Accept the default, `text/plain`.
-.. *Message*: Type the content of the message that you want the Connector instance to send to the Kafka topic. For this quick start, type `Hello World!`.
-.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to sends messages to the Kafka topic. For this quick start, specify `10000`, to send a message every 10 seconds.
+.. *Message*: Type the content of the message that you want the Connector instance to send to the Kafka topic. For example, type `Hello World!`.
+.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For example, specify `10000`, to send a message every 10 seconds.

 . Optionally, configure the error handling policy for your Connectors instance.
 +
@@ -207,7 +209,7 @@ The options are:
 * *log* - The Connectors instance sends errors to its log.
 * *dead letter queue* - The Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance.
 +
-For this quick start, select *log*.
+For example, select *log*.

 . Click *Next*.

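The Data Generator's behavior configured above (send a fixed *Message* to *Topic Names* every *Period* milliseconds) can be mimicked in a few lines of plain Python. This sketch records messages in memory instead of sending them to Kafka, so every name here (`FakeProducer`, `run_data_generator`) is illustrative, not part of any connector API:

```python
import time

class FakeProducer:
    """Stand-in for a Kafka producer; records messages instead of sending them."""
    def __init__(self):
        self.sent = []

    def send(self, topic: str, value: bytes) -> None:
        self.sent.append((topic, value))

def run_data_generator(producer, topic: str, message: str,
                       period_ms: int, count: int) -> None:
    """Send `message` to `topic` every `period_ms` milliseconds, `count` times."""
    for _ in range(count):
        producer.send(topic, message.encode("utf-8"))
        time.sleep(period_ms / 1000)

producer = FakeProducer()
# The quick start uses a 10000 ms period; shortened here so the demo finishes fast.
run_data_generator(producer, "test-topic", "Hello World!", period_ms=1, count=3)
print(producer.sent[0])  # ('test-topic', b'Hello World!')
```

Swapping `FakeProducer` for a real Kafka producer client would reproduce, in spirit, what the managed connector does on your behalf.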
@@ -230,26 +232,26 @@ In the next procedure, you can verify that the source Connectors instance is sen
 [role="_abstract"]
 A *sink* connector consumes messages from a Kafka topic and sends them to an external system.

-For this quick start, you use the *HTTP Sink* connector which consumes the Kakfa messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.
+For this example, you use the *HTTP Sink* connector which consumes the Kafka messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.

 ifndef::qs[]
 .Prerequisites
-* You're logged in to the web console at {service-url-connectors}[^].
+* You're logged in to the OpenShift Application Services web console at {service-url-connectors}[^].
 * You created the source Connectors instance as described in _{base-url}{getting-started-url-conectors}/proc-creating-source-connector_getting-started-connectors[Creating a Connectors instance for a data source^]_.
-* For the data sink example in this quick start, open the free https://webhook.site[Webhook.site^] in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
+* For the data sink example, open the free https://webhook.site[Webhook.site^] in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
 endif::[]

 .Procedure

-. In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.
+. In the OpenShift Application Services web console, select *Connectors* and then click *Create Connectors instance*.

 . Select the sink connector that you want to use:
-.. For this quick start, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector, which is the sink connector for this quick start.
+.. For example, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector.
 .. Click the *HTTP Sink connector* card and then click *Next*.

 . Select the {product-kafka} instance for the connector to work with.
 +
-For this quick start, select *test* and then click *Next*.
+For example, select *test* and then click *Next*.

 . On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
 +
@@ -268,15 +270,15 @@ If you are using the evaluation OpenShift Dedicated environment, click the *eval
 .. *Data shape Format*: Accept the default, `application/octet-stream`.
 .. *Method*: Accept the default, `POST`.
 .. *URL*: Type your unique URL from the link:https://webhook.site[webhook.site^].
-.. *Topic Names*: Type the name of the topic that you used for the source Connectors instance. For this quick start, type *test-topic*.
+.. *Topic Names*: Type the name of the topic that you used for the source Connectors instance. For example, type *test-topic*.

-. Optionally, configure the error handling policy for your Connectors instance. For this quick start, select *log* and then click *Next*.
+. Optionally, configure the error handling policy for your Connectors instance. For example, select *log* and then click *Next*.

 . Review the summary of the configuration properties and then click *Create Connectors instance*.
 +
 Your Connectors instance is listed in the table of Connectors.
 +
-After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this quick start, the data sink is the HTTP URL that you provided).
+After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this example, the data sink is the HTTP URL that you provided).

 .Verification

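The HTTP sink's role (consume each Kafka message and `POST` it to the configured URL) can likewise be sketched without a broker. The `post` callable below stands in for an HTTP client such as `requests.post`; all names are illustrative, not the connector's actual implementation:

```python
def run_http_sink(messages, url: str, post) -> int:
    """POST each consumed Kafka message body to `url`; return the number delivered."""
    delivered = 0
    for msg in messages:
        post(url, data=msg)  # a real sink would use an HTTP client here
        delivered += 1
    return delivered

# Fake HTTP endpoint that records what it receives, like the Webhook.site page does.
received = []
def fake_post(url, data=None):
    received.append((url, data))

# Messages the source Connectors instance would have produced to test-topic:
count = run_http_sink([b"Hello World!", b"Hello World!"],
                      "https://webhook.site/your-unique-id", fake_post)
print(count)  # 2
```

On the real Webhook.site page, each delivered message shows up as an incoming `POST` request, which is how the verification step confirms the end-to-end flow.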
docs/connectors/getting-started-connectors/quickstart.yml

Lines changed: 2 additions & 1 deletion
@@ -3,6 +3,7 @@ kind: QuickStarts
 metadata:
   name: connectors-getting-started
   annotations:
+    draft: true
     order: 6
 spec:
   version: 0.1
@@ -17,7 +18,7 @@ spec:
   prerequisites:
     - A Red Hat identity
     - You've created a Kafka instance and the instance is in `Ready` state. See the _Getting started with OpenShift Streams for Apache Kafka_ quick start.
-    - For the data sink example in this quick start, open the free Webhook.site (https://webhook.site) in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
+    - For the data sink example, open the free Webhook.site (https://webhook.site) in a browser window. The Webhook.site page provides a unique URL that you copy. You use this URL as an HTTP data sink.
   introduction: !snippet README.adoc#introduction
   tasks:
     - !snippet/proc README.adoc#proc-configuring-kafka-for-connectors
