Commit d604112

jc-1663 update connectors prereq (#602)
* jc-1663 update connectors prereq
* jc-1663 edits for peer review
* jc-1663 more edits for peer review
* jc-1663 again more edits for peer review
* jc-1663 update attributes and other fixes
* jc-1663 fixing use of attributes
* jc-1663 fixing extra spaces
1 parent a1121be commit d604112

File tree

26 files changed (+148, -37 lines)


docs/_artifacts/document-attributes.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands
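This change defines AsciiDoc attributes as a single point of truth for the two product names. A minimal sketch of how the new attributes behave at render time (the sample sentence is illustrative, not taken from the changed files):

```asciidoc
//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

// {openshift-dedicated} expands to "OpenShift Dedicated" when rendered,
// so the product name can no longer be misspelled in running text.
You can deploy your instances on an {openshift-dedicated} trial cluster.
```

Because the same four definition lines are repeated in every file touched by this commit, a misspelling of the product name can now occur only in the attribute definition, not in each occurrence in body text.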

docs/api-designer/getting-started-api-designer/README.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands

docs/connectors/getting-started-connectors/README.adoc
Lines changed: 50 additions & 36 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands

@@ -109,52 +113,61 @@ A *sink* connector allows you to send data from {product-kafka} to an external s
 ====
 endif::[]
 
-ifndef::qs[]
-== Overview
 
-{product-long-kafka} is a cloud service that simplifies the process of running Apache Kafka. Apache Kafka is an open-source, distributed, publish-subscribe messaging system for creating fault-tolerant, real-time data feeds.
+[id="proc-verifying-prerequisites-for-connectors_{context}"]
+== Verifying the prerequisites for using {product-long-connectors}
 
-You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.
+[role="_abstract"]
 
-The following diagram illustrates how data flows from a data source through a data source connector to a Kafka topic. And how data flows from a Kafka topic to a data sink through a data sink connector.
+Before you use {product-connectors}, you must complete the following prerequisites:
 
-[.screencapture]
-.{product-long-connectors} data flow
-image::connectors-diagram.png[Illustration of data flow from data source through Kafka to data sink]
+* Determine which {openshift} environment to use for deploying your {product-connectors} instances.
 
-endif::[]
+* Configure {product-long-kafka} for use with {product-connectors}.
 
-[id="proc-configuring-kafka-for-connectors_{context}"]
-== Verifying that you have the prerequisites for using {product-long-connectors}
+*Determining which {openshift} environment to use for deploying your {connectors} instances*
 
-[role="_abstract"]
-ifdef::qs[]
-Before you can use {product-connectors}, you must complete the link:https://console.redhat.com/application-services/learning-resources?quickstart=getting-started[Getting started with {product-long-kafka}] quick start to set up the following components:
+For Service Preview, you have two choices:
+
+* *The hosted evaluation environment*
+
+** The {connectors} instances are hosted on a multitenant {openshift-dedicated} cluster that is owned by Red Hat.
+** You can create four {connectors} instances at a time.
+** The evaluation environment applies 48-hour expiration windows, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/8190dc9e-249c-4207-bd69-096e5dd5bc64[Red Hat {openshift} {connectors} Service Preview evaluation guidelines^].
+
+* *Your own trial environment*
+
+** You have access to your own {openshift-dedicated} trial environment.
+** You can create an unlimited number of {connectors} instances.
+** Your {openshift-dedicated} trial cluster expires after 60 days.
+** A cluster administrator must install the {product-connectors} add-on as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding the Red Hat {openshift} {connectors} add-on to your {openshift-dedicated} trial cluster^].
+
+*Configuring {product-long-kafka} for use with {product-connectors}*
 
-* A *Kafka instance* that you can use for {product-connectors}.
-* A *Kafka topic* to store messages sent by data sources and make the messages available to data sinks.
-* A *service account* that allows you to connect and authenticate your {connectors} instances with your Kafka instance.
-* *Access rules* for the service account that defines how your {connectors} instances can access and use the topics in your Kafka instance.
-endif::[]
 ifndef::qs[]
-Before you can use {product-connectors}, you must complete the steps in _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_ to set up the following components:
+Complete the steps in _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_ to set up the following components:
+endif::[]
+
+ifdef::qs[]
+Complete the steps in the link:https://console.redhat.com/application-services/learning-resources?quickstart=getting-started[Getting started with {product-long-kafka}] quick start to set up the following components:
+endif::[]
 
 * A *Kafka instance* that you can use for {product-connectors}.
 * A *Kafka topic* to store messages sent by data sources and make the messages available to data sinks.
 * A *service account* that allows you to connect and authenticate your {connectors} instances with your Kafka instance.
-* *Access rules* for the service account that defines how your {connectors} instances can access and use the topics in your Kafka instance.
-endif::[]
+* *Access rules* for the service account that define how your {connectors} instances can access and use the topics in your Kafka instance.
 
 ifdef::qs[]
 .Procedure
 Make sure that you have set up the prerequisite components.
 
 .Verification
-* Is the Kafka instance listed in the Kafka instances table and is it in the *Ready* state?
-* Did you verify that your service account was successfully created in the *Service Accounts* page?
+* Is the Kafka instance listed in the Kafka instances table and is the Kafka instance in the *Ready* state?
+* Is your service account created in the *Service Accounts* page?
 * Did you save your service account credentials to a secure location?
 * Are the permissions for your service account listed in the *Access* page of the Kafka instance?
-* Is the Kafka topic that you created for {product-connectors} listed in the topics table of the Kafka instance?
+* Is the Kafka topic that you created for {connectors} listed in the topics table of the Kafka instance?
+* If you plan to use a 60-day {openshift-dedicated} trial cluster to deploy your {product-connectors} instances, has a cluster administrator added the {product-connectors} add-on to your trial cluster?
 
 endif::[]
 

@@ -165,6 +178,7 @@ ifndef::qs[]
 * Verify that you saved your service account credentials to a secure location.
 * Verify that the permissions for your service account are listed in the *Access* page of the Kafka instance.
 * Verify that the Kafka topic that you created for {product-connectors} is listed in the Kafka instance's topics table.
+* If you plan to use a 60-day {openshift-dedicated} trial cluster to deploy your {product-connectors} instances, verify that a cluster administrator added the {product-connectors} add-on to your trial cluster.
 
 endif::[]
 

@@ -187,7 +201,7 @@ ifndef::qs[]
 endif::[]
 
 .Procedure
-. In the {product-long-connectors} web console, select *Connectors* and then click *Create {connectors} instance*.
+. In the {product-long-connectors} web console, select *{connectors}* and then click *Create {connectors} instance*.
 . Select the connector that you want to use for connecting to a data source.
 +
 You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.

@@ -198,11 +212,11 @@ Click the card to select the connector, and then click *Next*.
 
 . For *Kafka instance*, click the card for the {product-kafka} instance that you configured for {connectors}, and then click *Next*.
 
-. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment. The namespace is the deployment space that hosts your {connectors} instances.
+. On the *Namespace* page, the namespace that you select depends on your {openshift-dedicated} environment. The namespace is the deployment space that hosts your {connectors} instances.
 +
-If you're using a trial cluster in your own OpenShift Dedicated environment, select the card for the namespace that was created when a system administrator added the {connectors} service to your trial cluster, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding the Red Hat OpenShift Connectors add-on to your OpenShift Dedicated trial cluster^].
+If you're using a trial cluster in your own {openshift-dedicated} environment, select the card for the namespace that was created when a system administrator added the {connectors} service to your trial cluster, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding the Red Hat {openshift} {connectors} add-on to your {openshift-dedicated} trial cluster^].
 +
-If you're using the evaluation OpenShift Dedicated environment, click *Register eval namespace* to provision a namespace for hosting the {connectors} instances that you create.
+If you're using the evaluation {openshift-dedicated} environment, click *Register eval namespace* to provision a namespace for hosting the {connectors} instances that you create.
 
 . Click *Next*.
 

@@ -239,10 +253,10 @@ ifdef::qs[]
 * Does your source {connectors} instance generate messages?
 endif::[]
 ifndef::qs[]
-* Verify that your source {connectors} instance generate messages.
+* Verify that your source {connectors} instance generates messages.
 endif::[]
 
-.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
+.. In the {product-long-rhoas} web console, select *Streams for Apache Kafka* > *Kafka Instances*.
 .. Click the Kafka instance that you created for connectors.
 .. Click the *Topics* tab and then click the topic that you specified for your source {connectors} instance.
 .. Click the *Messages* tab to see a list of `Hello World!` messages.

@@ -275,11 +289,11 @@ endif::[]
 +
 For example, select *test* and then click *Next*.
 
-. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment. The namespace is the deployment space that hosts your {connectors} instances.
+. On the *Namespace* page, the namespace that you select depends on your {openshift-dedicated} environment. The namespace is the deployment space that hosts your {connectors} instances.
 +
-If you're using a trial cluster on your own OpenShift Dedicated environment, select the card for the namespace that was created when you added the {connectors} service to your trial cluster.
+If you're using a trial cluster on your own {openshift-dedicated} environment, select the card for the namespace that was created when you added the {connectors} service to your trial cluster.
 +
-If you're using the evaluation OpenShift Dedicated environment, click the *eval namespace* that you created when you created the source connector.
+If you're using the evaluation {openshift-dedicated} environment, click the *eval namespace* that you created when you created the source connector.
 
 . Click *Next*.
 

@@ -298,7 +312,7 @@ If you're using the evaluation OpenShift Dedicated environment, click the *eval
 
 . Review the summary of the configuration properties and then click *Create {connectors} instance*.
 +
-Your {connectors} instance is listed in the table of Connectors.
+Your {connectors} instance is listed in the table of {connectors}.
 +
 After a couple of seconds, the status of your {connectors} instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this example, the data sink is the HTTP URL that you provided).
 

@@ -309,7 +323,7 @@ ifdef::qs[]
 endif::[]
 
 ifndef::qs[]
-* Verify that you see HTTP POST calls with `"Hello World!!"` messages by opening a web browser tab to your custom URL for the link:https://webhook.site[webhook.site^].
+* Verify that you see HTTP POST calls with `"Hello World!!"` messages. Open a web browser tab to your custom URL for the link:https://webhook.site[webhook.site^].
 endif::[]
 
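The restructured section leans heavily on AsciiDoc conditional preprocessor blocks keyed on the `qs` attribute, which let one source file produce both the quick-start text and the standalone-guide text. A minimal sketch of the pattern, assuming `qs` is set only by the quick-start build (an assumption based on how the attribute is used above, not stated in the diff):

```asciidoc
ifdef::qs[]
Complete the quick start to set up the following components:
endif::[]

ifndef::qs[]
Complete the steps in the getting-started guide to set up the following components:
endif::[]
```

When the build defines `:qs:`, only the first sentence is rendered; otherwise only the second is. That is why this commit adds the trial-cluster verification step twice: as a question in the `ifdef::qs[]` checklist and as an instruction in the `ifndef::qs[]` checklist.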
docs/connectors/getting-started-connectors/quickstart.yml
Lines changed: 2 additions & 1 deletion

@@ -16,9 +16,10 @@ spec:
 description: !snippet README.adoc#description
 prerequisites:
 - Complete the <a href="https://console.redhat.com/application-services/learning-resources?quickstart=getting-started">Getting started with OpenShift Streams for Apache Kafka</a> quick start.
+- If you plan to use a 60-day OpenShift Dedicated trial cluster to deploy your Connectors instances, a cluster administrator must install the Connectors add-on as described in <a href="https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01">Adding the Red Hat OpenShift Connectors add-on to your OpenShift Dedicated trial cluster</a>.
 introduction: !snippet README.adoc#introduction
 tasks:
-- !snippet/proc README.adoc#proc-configuring-kafka-for-connectors
+- !snippet/proc README.adoc#proc-verifying-prerequisites-for-connectors
 - !snippet/proc README.adoc#proc-creating-source-connector
 - !snippet/proc README.adoc#proc-creating-sink-connector
 conclusion: !snippet README.adoc#conclusion
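The quickstart.yml change has to track the anchor rename in README.adoc: each `!snippet/proc` entry appears to resolve to an `[id=...]` section anchor in the referenced file (an inference from the paired renames in this commit, not documented behavior). A sketch of the pairing after this commit:

```yaml
# quickstart.yml: the task list references README.adoc sections by anchor ID
tasks:
  - !snippet/proc README.adoc#proc-verifying-prerequisites-for-connectors  # renamed in this commit
  - !snippet/proc README.adoc#proc-creating-source-connector
  - !snippet/proc README.adoc#proc-creating-sink-connector
```

This matches the new `[id="proc-verifying-prerequisites-for-connectors_{context}"]` anchor added to README.adoc; leaving the old `proc-configuring-kafka-for-connectors` ID in the task list would have produced a dangling snippet reference.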

docs/connectors/rhoas-cli-getting-started-connectors/README.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands

docs/kafka/access-mgmt-kafka/README.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands

docs/kafka/consumer-configuration-kafka/README.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands

docs/kafka/getting-started-kafka/README.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands

docs/kafka/kafka-bin-scripts-kafka/README.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands

docs/kafka/kafka-instance-settings/README.adoc
Lines changed: 4 additions & 0 deletions

@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
 :cloud-console-url: https://console.redhat.com/
 :service-accounts-url: https://console.redhat.com/application-services/service-accounts
 
+//to avoid typos
+:openshift: OpenShift
+:openshift-dedicated: OpenShift Dedicated
+
 //OpenShift Application Services CLI
 :base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
 :command-ref-url-cli: commands
