////
START GENERATED ATTRIBUTES
WARNING: This content is generated by running npm --prefix .build run generate:attributes
////

//OpenShift Application Services
:org-name: Application Services
:product-long-rhoas: OpenShift Application Services
:community:
:imagesdir: ./images
:property-file-name: app-services.properties
:samples-git-repo: https://github.com/redhat-developer/app-services-guides
:base-url: https://github.com/redhat-developer/app-services-guides/tree/main/docs/

//OpenShift Application Services CLI
:rhoas-cli-base-url: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:rhoas-cli-ref-url: commands
:rhoas-cli-installation-url: rhoas/rhoas-cli-installation/README.adoc

//OpenShift Streams for Apache Kafka
:product-long-kafka: OpenShift Streams for Apache Kafka
:product-kafka: Streams for Apache Kafka
:product-version-kafka: 1
:service-url-kafka: https://console.redhat.com/application-services/streams/
:getting-started-url-kafka: kafka/getting-started-kafka/README.adoc
:kafka-bin-scripts-url-kafka: kafka/kafka-bin-scripts-kafka/README.adoc
:kafkacat-url-kafka: kafka/kcat-kafka/README.adoc
:quarkus-url-kafka: kafka/quarkus-kafka/README.adoc
:nodejs-url-kafka: kafka/nodejs-kafka/README.adoc
:rhoas-cli-getting-started-url-kafka: kafka/rhoas-cli-getting-started-kafka/README.adoc
:topic-config-url-kafka: kafka/topic-configuration-kafka/README.adoc
:consumer-config-url-kafka: kafka/consumer-configuration-kafka/README.adoc
:access-mgmt-url-kafka: kafka/access-mgmt-kafka/README.adoc
:metrics-monitoring-url-kafka: kafka/metrics-monitoring-kafka/README.adoc
:service-binding-url-kafka: kafka/service-binding-kafka/README.adoc

//OpenShift Service Registry
:product-long-registry: OpenShift Service Registry
:product-registry: Service Registry
:registry: Service Registry
:product-version-registry: 1
:service-url-registry: https://console.redhat.com/application-services/service-registry/
:getting-started-url-registry: registry/getting-started-registry/README.adoc
:quarkus-url-registry: registry/quarkus-registry/README.adoc
:rhoas-cli-getting-started-url-registry: registry/rhoas-cli-getting-started-registry/README.adoc
:access-mgmt-url-registry: registry/access-mgmt-registry/README.adoc
:content-rules-registry: https://access.redhat.com/documentation/en-us/red_hat_openshift_service_registry/1/guide/9b0fdf14-f0d6-4d7f-8637-3ac9e2069817[Supported Service Registry content and rules]

//OpenShift Connectors
:product-long-connectors: OpenShift Connectors
:service-url-connectors: https://console.redhat.com/application-services/connectors
////
END GENERATED ATTRIBUTES
////

[id="chap-getting-started-connectors"]
= Getting started with {product-long-connectors}
ifdef::context[:parent-context: {context}]
:context: getting-started-connectors

// Purpose statement for the assembly
[role="_abstract"]
As a developer of applications and services, you can use {product-long-connectors} to create source and sink connectors that send data between {product-kafka} and external systems.

// Condition out QS-only content so that it doesn't appear in docs.
// All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
ifdef::qs[]
[#description]
Learn how to create and set up connectors in {product-long-connectors}.

[#introduction]
Welcome to the quick start for {product-long-connectors}. In this quick start, you learn how to create a source connector and a sink connector to send data to and from {product-kafka}. A source connector allows you to send data from an external system to {product-kafka}. A sink connector allows you to send data from {product-kafka} to an external system.

endif::[]

[id="proc-configuring-kafka-for-connectors_{context}"]
== Configuring the {product-kafka} instance for use with {product-long-connectors}

[role="_abstract"]
In this task, you configure your {product-kafka} instance for use with {product-long-connectors}. This involves creating topics and setting up access rules for service accounts.

.Procedure
. First step
. Second step
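
As an alternative to the web console, a minimal sketch of this setup with the `rhoas` CLI might look like the following. The topic name and service account description are placeholders, and the exact flags can vary between CLI versions, so treat this as an illustration rather than the canonical procedure:

[source,shell]
----
# Log in and select the Kafka instance to work with
rhoas login
rhoas kafka use

# Create the topic that the connectors produce to and consume from
rhoas kafka topic create --name test-topic

# Create a service account and grant it produce and consume access to the topic
rhoas service-account create --short-description connectors-quickstart
rhoas kafka acl grant-access --producer --consumer \
  --service-account <client-id> --topic test-topic --group all
----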

.Verification
ifdef::qs[]
* Have you completed these steps?
endif::[]

[id="proc-creating-source-connector_{context}"]
== Creating a {product-long-connectors} source connector

[role="_abstract"]
In this task, you create and configure a source connector. A source connector consumes events from an external system and produces Kafka messages. In this quick start, you use the Data Generator source connector, which produces a Kafka message with a configurable payload at regular intervals to a Kafka topic on your {product-kafka} instance.

.Procedure
. In the {service-url-connectors}[^] web console, go to *Connectors* and click *Create connector instance*. Follow the guided steps to define the connector details. Click *Next* to complete each step and click *Finish* to complete the setup.
. In the first step, you select the connector that you want to use. You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
+
Enter *data* in the search box. You should see only one connector, called *Data Generator Connector*, which is the source connector that you use in this quick start. Click the connector box to select the connector, and click *Next* to complete the step.

. Select the {product-kafka} instance for the connector to work with. The name of the instance corresponds to the Kafka instance that you configured in the previous task. Click the Kafka instance box to select the instance, and click *Next* to complete the step.
+
NOTE: From this screen, you can also create a new Kafka instance by clicking the *Create Kafka instance* button.

. Select the OpenShift Dedicated cluster that hosts the connector instance by clicking the box that represents the cluster, and click *Next* to complete the step.

. In this step, you provide the common configuration for your connector. Provide a unique name for the connector and select *Automatically create a service account for this connector*. Alternatively, you can provide the Client ID and Client Secret of an existing service account. If you set up the {product-kafka} ACLs as described in the previous task of this quick start, the newly created service account has sufficient rights to produce messages to the topic associated with the connector. Click *Next* to complete the step.

. In this step, you provide the connector-specific configuration.
.. *Data shape Format*: Leave set to `application/octet-stream`.
.. *Topic Names*: Enter the name of the topic that you created in the previous task of this quick start.
.. *Content Type*: Leave set to `text/plain`.
.. *Message*: Enter the content of the message that you want to send to the topic, for example `Hello World!`.
.. *Period*: The interval in milliseconds at which events are sent. Set this to `10000`, which produces an event every 10 seconds.

. Finally, configure the error handler for the connector. You can choose between *stop* (the connector shuts down when an error occurs), *log* (the error is logged), and *dead letter queue* (events that the connector cannot handle are sent to a dead letter topic on your Kafka instance).
Select *log*.

. The next screen shows the configuration properties of your connector. Click *Create connector* to proceed with the deployment of the connector.
+
After you complete the connector setup, the new connector is listed in the connectors table. After a couple of seconds, the connector moves to the *Ready* state. At this point, the connector starts producing messages to the Kafka topic associated with the connector.
+
From the connectors table, you can stop, start, and delete the connector, as well as edit the connector configuration by clicking the options icon (three vertical dots).
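
To confirm that the connector is producing events, you can also consume the topic directly, for example with the `kcat` (kafkacat) tool. The bootstrap server host, client ID, client secret, and topic name shown here are placeholders for your own Kafka instance connection details, service account credentials, and topic:

[source,shell]
----
# Consume from the topic used by the Data Generator connector
kcat -C -t test-topic \
  -b <bootstrap-server-host>:443 \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=PLAIN \
  -X sasl.username=<client-id> \
  -X sasl.password=<client-secret>
----

With the example configuration above, you should see a new `Hello World!` message roughly every 10 seconds.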

.Verification
ifdef::qs[]
* Have you completed these steps?
endif::[]

[id="proc-creating-sink-connector_{context}"]
== Creating a {product-long-connectors} sink connector

[role="_abstract"]
In this task, you create and configure a sink connector. A sink connector consumes events from a Kafka topic and sends them to an external system. In this quick start, you use the HTTP Sink connector, which consumes the messages produced by the source connector that you configured in the previous task and calls an HTTP endpoint with the message payload.

.Procedure
. Before continuing with the setup of the sink connector, you need to set up an HTTP endpoint for the connector to call. One way to do so is to use the free link:https://webhook.site[webhook.site^] service.
In a new browser tab, navigate to link:https://webhook.site[webhook.site^], which gives you a unique URL that you can use as the HTTP sink for the connector.
. In the {service-url-connectors}[^] web console, go to *Connectors* and click *Create connector instance*. Follow the guided steps to define the connector details. Click *Next* to complete each step and click *Finish* to complete the setup.
. In the first step, you select the connector that you want to use. You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
+
Enter *http* in the search box. You should see only one connector, called *HTTP Sink*, which is the sink connector that you use in this quick start. Click the connector box to select the connector, and click *Next* to complete the step.

. Select the {product-kafka} instance for the connector to work with. The name of the instance corresponds to the Kafka instance that you configured in the previous task. Click the Kafka instance box to select the instance, and click *Next* to complete the step.

. Select the OpenShift Dedicated cluster that hosts the connector instance by clicking the box that represents the cluster, and click *Next* to complete the step.

. In this step, you provide the common configuration for your connector. Provide a unique name for the connector and select *Automatically create a service account for this connector*. Alternatively, you can provide the Client ID and Client Secret of an existing service account. If you set up the {product-kafka} ACLs as described in the previous task of this quick start, the newly created service account has sufficient rights to consume messages from the topic associated with the connector. Click *Next* to complete the step.

. In this step, you provide the connector-specific configuration.
.. *Data shape Format*: Leave set to `application/octet-stream`.
.. *Method*: Leave set to `POST`.
.. *URL*: Enter your unique URL from link:https://webhook.site[webhook.site^].
.. *Topic Names*: Enter the name of the topic that you created in the first task of this quick start. Use the same topic that the Data Generator source connector produces to.

. Finally, configure the error handler for the connector. You can choose between *stop* (the connector shuts down when an error occurs), *log* (the error is logged), and *dead letter queue* (events that the connector cannot handle are sent to a dead letter topic on your Kafka instance).
Select *log*.

. The next screen shows the configuration properties of your connector. Click *Create connector* to proceed with the deployment of the connector.
+
After you complete the connector setup, the new connector is listed in the connectors table. After a couple of seconds, the connector moves to the *Ready* state. At this point, the connector starts consuming messages from the Kafka topic associated with the connector and sending them to the HTTP sink.

. In the browser tab for link:https://webhook.site[webhook.site^], you should see the HTTP POST requests from the connector, with the message content that you defined in the source connector.
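
If you want to check the HTTP endpoint independently of the connector, you can send a test request yourself with `curl`. The URL below is a placeholder for your own unique webhook.site address, and the body mirrors the example message used by the source connector:

[source,shell]
----
# Send a test POST to the webhook.site endpoint (replace the placeholder with your unique URL)
curl -X POST \
  -H "Content-Type: text/plain" \
  -d 'Hello World!' \
  https://webhook.site/<your-unique-id>
----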

.Verification
ifdef::qs[]
* Have you completed these steps?
endif::[]

ifdef::qs[]
[#conclusion]
Congratulations! You successfully completed the {product-long-connectors} Getting Started quick start.
endif::[]

ifdef::parent-context[:context: {parent-context}]
ifndef::parent-context[:!context:]