Commit b846b5a

JC 589 (#455)
* JC-589 connectors qs update
* JC-589 more copyedits for quick start
* Update on > in

Co-authored-by: Ben Hardesty <[email protected]>
Co-authored-by: Ben Hardesty <[email protected]>
1 parent e93a66c commit b846b5a

2 files changed (+83 -42 lines)


docs/connectors/getting-started-connectors/README.adoc

Lines changed: 81 additions & 40 deletions
@@ -63,7 +63,7 @@ ifdef::context[:parent-context: {context}]
 [role="_abstract"]
 As a developer of applications and services, you can use {product-long-connectors} to create and configure connections between OpenShift Streams for Apache Kafka and third-party systems.
 
-In this quick start example, you connect a data source (Data Generator) that generates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.
+In this quick start, you connect a data source (a data generator) that creates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.
 
 // Condition out QS-only content so that it doesn't appear in docs.
 // All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
@@ -99,11 +99,15 @@ endif::[]
 == Configuring the {product-kafka} instance for use with {product-long-connectors}
 
 [role="_abstract"]
-Configure your {product-kafka} for use with {product-long-connectors} by creating Kafka topics, creating service accounts, and setting up access rules for the service accounts. The number of Kafka topics, service accounts that you create, and the access rules for the service accounts depend on your application.
+Configure your {product-kafka} instance for use with {product-long-connectors} by:
 
-Kakfa topics store messages sent by producers (data sources) and make them available to consumers (data sinks). Service accounts allow you to connect and authenticate your Connectors with Kafka instances. Access rules for service accounts define how your Connectors can access and use the associated Kafka instance topics.
+* Creating *Kafka topics* to store messages sent by producers (data sources) and make them available to consumers (data sinks).
+* Creating *service accounts* that allow you to connect and authenticate your Connectors with Kafka instances.
+* Setting up *access rules* for the service accounts that define how your Connectors can access and use the associated Kafka instance topics.
 
-For this quick start, you create one Kafka topic, named *test*, one service account, and you provide all permissions on the service account.
+The number of Kafka topics and service accounts that you create, and the access rules for the service accounts, depend on your application.
+
+For this quick start, you create one Kafka topic, named *test*, one service account, and you define access for the service account.
 
 ifndef::qs[]
 .Prerequisites
@@ -113,20 +117,23 @@ For instructions on how to create a Kafka instance, see _{base-url}{getting-star
 endif::[]
 
 .Procedure
-. Create a topic for connectors:
-.. In the {service-url-connectors}[^] web console, select *Streams for Apache Kafka* > *Kafka Instances* and then click the name of the Kafka instance that you want to add a topic to.
+. Create a Kafka topic for your connectors:
+.. In the {service-url-connectors}[^] web console, select *Streams for Apache Kafka* > *Kafka Instances*.
+.. Click the name of the Kafka instance that you want to add a topic to.
 .. Select the *Topics* tab, and then click *Create topic*.
-.. Type a unique name for your topic. For the quick start example, type *test-topic* for the *Topic Name*. Accept the default settings for partitions and message retention, and replicas.
+.. Type a unique name for your topic. For this quick start, type *test-topic* for the *Topic Name*.
+.. Accept the default settings for partitions, message retention, and replicas.
 . Create a service account for connectors:
 .. In the web console, select *Service Accounts*, and then click *Create service account*.
 .. Type a unique service account name (for example, *test-service-acct* ) and then click *Create*.
 .. Copy the generated *Client ID* and *Client Secret* to a secure location. You'll use these credentials to configure connections to this service account.
 .. Select the *I have copied the client ID and secret* option, and then click *Close*.
 
 . Set the level of access for your new service account in the Access Control List (ACL) of the Kafka instance:
-.. Select *Streams for Apache Kafka* > *Kafka Instances*, click the name of the Kafka instance that you want the service account to access.
-.. Click the *Access* tab to view the current ACL for this instance and then click *Manage access*.
-.. From the *Account* drop-down menu, select the service account that you previously created, and then click *Next*.
+.. Select *Streams for Apache Kafka* > *Kafka Instances*.
+.. Click the name of the Kafka instance that you want the service account to access.
+.. Click the *Access* tab to view the current ACL for the Kafka instance and then click *Manage access*.
+.. From the *Account* drop-down menu, select the service account that you created in Step 2, and then click *Next*.
 .. Under *Assign Permissions*, use the drop-down menu to select the *Consume from a topic* and the *Produce to a topic* permission options, and then set all resource identifiers to `is` and all identifier values to `"*"`.
 +
 The `is "*"` settings enable connectors that are configured with the service account credentials to produce and consume messages in any topic in the Kafka instance.
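The service-account credentials created above are what a client presents when it connects to the Kafka instance. As an illustration only, the sketch below assembles a kafka-python client configuration from a Client ID and Client Secret; the bootstrap server, Client ID, and secret values are placeholders, and SASL/PLAIN over TLS is an assumption — check your Kafka instance's connection details for the mechanism it actually supports.

```python
# Sketch: build a kafka-python configuration for a service account.
# All concrete values below are placeholders, not real endpoints or credentials.

def connector_client_config(bootstrap_server, client_id, client_secret):
    """Return kwargs usable with kafka-python's KafkaProducer/KafkaConsumer."""
    return {
        "bootstrap_servers": bootstrap_server,
        "security_protocol": "SASL_SSL",       # TLS plus SASL authentication
        "sasl_mechanism": "PLAIN",             # assumption; verify for your instance
        "sasl_plain_username": client_id,      # service account Client ID
        "sasl_plain_password": client_secret,  # service account Client Secret
    }

config = connector_client_config(
    "my-instance.kafka.example.com:443",  # placeholder bootstrap server
    "srvc-acct-example-id",               # placeholder Client ID
    "example-secret",                     # placeholder Client Secret
)

# With kafka-python installed, a producer could then be created with:
#   from kafka import KafkaProducer
#   producer = KafkaProducer(**config)
#   producer.send("test-topic", b"Hello World!")
```

Because the ACL above grants `is "*"` on topics, any client authenticated with these credentials can produce to and consume from every topic in the instance.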
@@ -143,8 +150,13 @@ endif::[]
 == Creating a Connectors instance for a data source
 
 [role="_abstract"]
-A source connector consumes events from an external data source and produces Kafka messages. For this quick start, you create an instance of the Data Generator Source connector. You configure the connector to listen for events from the data source and produce a Kafka message with a configurable payload for each event. The connector sends the messages at regular intervals to a Kafka topic on your {product-kafka} instance.
+A *source* connector consumes events from an external data source and produces Kafka messages.
+
+For this quick start, you create an instance of the *Data Generator* source connector.
 
+You configure your connector to listen for events from the data source and produce a Kafka message for each event.
+
+The connector sends the messages at regular intervals to the Kafka topic that you created for Connectors.
 
 ifndef::qs[]
 .Prerequisites
@@ -155,38 +167,55 @@ endif::[]
 
 .Procedure
 . In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.
-. Select the connector that you want to use for a data source. You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
+. Select the connector that you want to use for a data source.
++
+You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
 +
-For example, to find the source connector for this quick start, type *data* in the search box. The list filters to show only the *Data Generator Connector* card, which is the source connector for this quick start. Click the card to select the connector, and then click *Next*.
+For example, to find the source connector for this quick start, type *data* in the search box. The list filters to show only the *Data Generator Connector* card, which is the source connector for this quick start.
++
+Click the card to select the connector, and then click *Next*.
 
-. Select the {product-kafka} instance that you configured for Connectors. Click the Kafka instance's card and then click *Next*.
+. For the *Kafka instance*, click the card for the {product-kafka} instance that you configured for Connectors, and then click *Next*.
 +
-NOTE: If you have not already configured a {product-kafka} instance for Connectors, you can create a new Kafka instance by clicking *Create kafka instance*.
+NOTE: If you have not already configured a {product-kafka} instance for Connectors, you can create a new Kafka instance by clicking *Create Kafka instance*. You would also need to set up and define access for a service account as described in _Configuring the {product-kafka} instance for use with {product-long-connectors}_.
 
-. Select the namespace to host your Connectors instance and then click *Next*.
+. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
++
+If you are using a trial cluster in your own OpenShift Dedicated environment, select the card for the namespace that was created when you added the Connectors service to your trial cluster, as described in _https://access.redhat.com/documentation/en-us/red_hat_openshift_connectors/TBD[Adding the OpenShift Connectors service to an OpenShift Dedicated trial cluster^]_.
+//need to update this link with correct URL
++
+If you are using the evaluation OpenShift Dedicated environment, click *Register eval namespace* to provision a namespace for hosting the Connectors instances that you create.
+
+. Click *Next*.
 
 . Configure the core configuration for your connector:
 .. Provide a unique name for the connector.
 .. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.
 
-. Provide connector-specific configuration:
+. Provide connector-specific configuration. For the *Data Generator*, provide the following information:
 .. *Data shape Format*: Accept the default, `application/octet-stream`.
-.. *Topic Names*: Type the name of the topic that you created for Connectors (for the quick start example, type *test-topic*).
+.. *Topic Names*: Type the name of the topic that you created for Connectors. For this quick start, type *test-topic*.
 .. *Content Type*: Accept the default, `text/plain`.
-.. *Message*: Type the content of the message that you want to send to the topic, for the quick start example, type `Hello World!`.
-.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to sends messages to the Kafka topic. For the quick start example, specify `10000`, to send a message every 10 seconds.
+.. *Message*: Type the content of the message that you want the Connector instance to send to the Kafka topic. For this quick start, type `Hello World!`.
+.. *Period*: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For this quick start, specify `10000`, to send a message every 10 seconds.
 
-. Optionally, configure the error handling policy for your Connectors instance. For the quick start, select *log* (the Connectors instance sends errors to its log).
+. Optionally, configure the error handling policy for your Connectors instance.
++
+The options are:
 +
-Other options are *stop* (the Connectors instance shuts down in case of errors), or *dead letter queue* (the Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance).
+* *stop* - (the default) The Connectors instance shuts down when it encounters an error.
+* *log* - The Connectors instance sends errors to its log.
+* *dead letter queue* - The Connectors instance sends messages that it cannot handle to a dead letter topic that you define for the Connectors Kafka instance.
++
+For this quick start, select *log*.
 
 . Click *Next*.
 
-. Review the summary of the configuration properties of your Connectors instance and then click *Create Connectors instance* to deploy it.
-
+. Review the summary of the configuration properties and then click *Create Connectors instance*.
++
 Your Connectors instance is listed in the table of Connectors. After a couple of seconds, the status of your Connectors instance changes to the *Ready* state and it starts producing messages and sending them to its associated Kafka topic.
-
-From the connectors table, you can stop, start and delete your Connectors instance, as well as edit its configuration by clicking the options icon (three vertical dots).
++
+From the connectors table, you can stop, start, and delete your Connectors instance, as well as edit its configuration, by clicking the options icon (three vertical dots).
 
 .Verification
 ifdef::qs[]
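The Data Generator's behavior configured above — emit a fixed payload every *Period* milliseconds — can be sketched as a plain loop. This is an illustration of what the connector does, not its implementation; the `send` callable stands in for a real Kafka producer, and the defaults mirror the quick start values.

```python
import time

def run_data_generator(send, message="Hello World!", period_ms=10000, count=3):
    """Emit `message` every `period_ms` milliseconds, `count` times.

    `send` stands in for a Kafka producer's send method; 10000 ms matches
    the quick start's *Period* (one message every 10 seconds).
    """
    for _ in range(count):
        send(message)
        time.sleep(period_ms / 1000.0)

# Demonstration with a stand-in "producer" that just collects messages
# (a 1 ms period so the demo finishes instantly):
sent = []
run_data_generator(sent.append, period_ms=1, count=3)
```

A real source connector also serializes each payload with the configured *Content Type* (`text/plain` here) before writing it to the topic.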
@@ -199,43 +228,55 @@ In the next procedure, you can verify that the source Connectors instance is sen
 == Creating a Connectors instance for a data sink
 
 [role="_abstract"]
-A sink connector consumes messages from a Kafka topic and sends them to an external system. In this quick start, you use the *HTTP Sink* connector which consumes the Kakfa messages (produced by the source Connectors instance) and sends the message payloads to an HTTP endpoint.
+A *sink* connector consumes messages from a Kafka topic and sends them to an external system.
+
+For this quick start, you use the *HTTP Sink* connector, which consumes the Kafka messages (produced by the source Connectors instance) and sends the messages to an HTTP endpoint.
 
 ifndef::qs[]
 .Prerequisites
 * You're logged in to the web console at {service-url-connectors}[^].
 * You created the source Connectors instance as described in _{base-url}{getting-started-url-conectors}/proc-creating-source-connector_getting-started-connectors[Creating a Connectors instance for a data source^]_.
-* You have a unique URL from the https://webhook.site[webhook.site].
+* For the data sink example in this quick start, open the free https://webhook.site[Webhook.site^] in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
 endif::[]
 
-
 .Procedure
 
 . In the {service-url-connectors}[^] web console, select *Connectors* and then click *Create Connectors instance*.
 
 . Select the sink connector that you want to use:
-.. For this quick start, type *http* in the search field. The list of connectors filters to show one connector, called *HTTP Sink*, which is the sink connector to use for this quick start.
+.. For this quick start, type *http* in the search field. The list of connectors filters to show the *HTTP Sink* connector, which is the sink connector for this quick start.
 .. Click the *HTTP Sink connector* card and then click *Next*.
 
-. Select the {product-kafka} instance for the connector to work with. For the quick start, select *test* and then click *Next*.
+. Select the {product-kafka} instance for the connector to work with.
++
+For this quick start, select *test* and then click *Next*.
 
-. Select the namespace to host your Connectors instance and then click *Next*.
+. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment.
++
+If you are using a trial cluster on your own OpenShift Dedicated environment, select the card for the namespace that was created when you added the Connectors service to your trial cluster.
++
+If you are using the evaluation OpenShift Dedicated environment, click the *eval namespace* that you created when you created the source connector.
 
-. Configure the core configuration for your connector:
-.. Provide a unique name for the connector.
+. Click *Next*.
+
+. Provide the core configuration for your connector:
+.. Type a unique name for the connector.
 .. Type the *Client ID* and *Client Secret* of the service account that you created for Connectors and then click *Next*.
 
-. Provide connector-specific configuration:
+. Provide the connector-specific configuration for your connector. For the *HTTP sink connector*, provide the following information:
+
 .. *Data shape Format*: Accept the default, `application/octet-stream`.
 .. *Method*: Accept the default, `POST`.
-.. *URL*: Enter your unique URL from link:https://webhook.site[webhook.site^].
-.. *Topic Names*: Type the name of the topic that you used for the source Connectors instance (for the quick start example, type *test-topic*).
+.. *URL*: Type your unique URL from link:https://webhook.site[Webhook.site^].
+.. *Topic Names*: Type the name of the topic that you used for the source Connectors instance. For this quick start, type *test-topic*.
 
-. Optionally, configure the error handling policy for your Connectors instance. For the quick start, select *log* and then click *Next*.
+. Optionally, configure the error handling policy for your Connectors instance. For this quick start, select *log* and then click *Next*.
 
-. Review the summary of the configuration properties of your Connectors instance and then click *Create Connectors instance* to deploy it.
+. Review the summary of the configuration properties and then click *Create Connectors instance*.
++
+Your Connectors instance is listed in the table of Connectors.
 +
-Your Connectors instance is listed in the table of Connectors. After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for the quick start, the data sink is the HTTP URL that you provided).
+After a couple of seconds, the status of your Connectors instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this quick start, the data sink is the HTTP URL that you provided).
 
 .Verification
 
docs/connectors/getting-started-connectors/quickstart.yml

Lines changed: 2 additions & 2 deletions
@@ -16,8 +16,8 @@ spec:
 description: !snippet README.adoc#description
 prerequisites:
 - A Red Hat identity
-- You've created a Kafka instance and the instance is in *Ready* state. See the _Getting started with OpenShift Streams for Apache Kafka_ quick start.
-- For the example data sink, open the free https://webhook.site[webhook.site]in a browser window. The `webhook.site` page provides a unique URL that you can copy for use as an HTTP data sink.
+- You've created a Kafka instance and the instance is in `Ready` state. See the _Getting started with OpenShift Streams for Apache Kafka_ quick start.
+- For the data sink example in this quick start, open the free Webhook.site (https://webhook.site) in a browser window. The Webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
 introduction: !snippet README.adoc#introduction
 tasks:
 - !snippet/proc README.adoc#proc-configuring-kafka-for-connectors
