////
START GENERATED ATTRIBUTES
WARNING: This content is generated by running npm --prefix .build run generate:attributes
////

//All OpenShift Application Services
:org-name: Application Services
:product-long-rhoas: OpenShift Application Services
:community:
:imagesdir: ./images
:property-file-name: app-services.properties
:samples-git-repo: https://github.com/redhat-developer/app-services-guides
:base-url: https://github.com/redhat-developer/app-services-guides/tree/main/docs/
:sso-token-url: https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
:installation-guide-url-cli: rhoas/rhoas-cli-installation/README.adoc
:service-contexts-url-cli: rhoas/rhoas-service-contexts/README.adoc

//OpenShift Streams for Apache Kafka
:product-long-kafka: OpenShift Streams for Apache Kafka
:product-kafka: Streams for Apache Kafka
:product-version-kafka: 1
:service-url-kafka: https://console.redhat.com/application-services/streams/
:getting-started-url-kafka: kafka/getting-started-kafka/README.adoc
:kafka-bin-scripts-url-kafka: kafka/kafka-bin-scripts-kafka/README.adoc
:kafkacat-url-kafka: kafka/kcat-kafka/README.adoc
:quarkus-url-kafka: kafka/quarkus-kafka/README.adoc
:nodejs-url-kafka: kafka/nodejs-kafka/README.adoc
:getting-started-rhoas-cli-url-kafka: kafka/rhoas-cli-getting-started-kafka/README.adoc
:topic-config-url-kafka: kafka/topic-configuration-kafka/README.adoc
:consumer-config-url-kafka: kafka/consumer-configuration-kafka/README.adoc
:access-mgmt-url-kafka: kafka/access-mgmt-kafka/README.adoc
:metrics-monitoring-url-kafka: kafka/metrics-monitoring-kafka/README.adoc
:service-binding-url-kafka: kafka/service-binding-kafka/README.adoc
:message-browsing-url-kafka: kafka/message-browsing-kafka/README.adoc

//OpenShift Service Registry
:product-long-registry: OpenShift Service Registry
:product-registry: Service Registry
:registry: Service Registry
:product-version-registry: 1
:service-url-registry: https://console.redhat.com/application-services/service-registry/
:getting-started-url-registry: registry/getting-started-registry/README.adoc
:quarkus-url-registry: registry/quarkus-registry/README.adoc
:getting-started-rhoas-cli-url-registry: registry/rhoas-cli-getting-started-registry/README.adoc
:access-mgmt-url-registry: registry/access-mgmt-registry/README.adoc
:content-rules-registry: https://access.redhat.com/documentation/en-us/red_hat_openshift_service_registry/1/guide/9b0fdf14-f0d6-4d7f-8637-3ac9e2069817[Supported Service Registry content and rules]
:service-binding-url-registry: registry/service-binding-registry/README.adoc

//OpenShift Connectors
:product-long-connectors: OpenShift Connectors
:product-connectors: Connectors
:product-version-connectors: 1
:service-url-connectors: https://console.redhat.com/application-services/connectors
:getting-started-url-connectors: connectors/getting-started-connectors/README.adoc

//OpenShift API Designer
:product-long-api-designer: OpenShift API Designer
:product-api-designer: API Designer
:product-version-api-designer: 1
:service-url-api-designer: https://console.redhat.com/application-services/api-designer/
:getting-started-url-api-designer: api-designer/getting-started-api-designer/README.adoc

//OpenShift API Management
:product-long-api-management: OpenShift API Management
:product-api-management: API Management
:product-version-api-management: 1
:service-url-api-management: https://console.redhat.com/application-services/api-management/

////
END GENERATED ATTRIBUTES
////

[id="chap-connectors-rhoas-cli"]
= Interacting with {product-long-connectors} using the rhoas CLI
ifdef::context[:parent-context: {context}]
:context: connectors-rhoas-cli

// Purpose statement for the assembly
[role="_abstract"]
As a developer of {product-connectors}, you can use the `rhoas` command-line interface (CLI) to control and manage your Connectors instances and namespaces.

.Prerequisites
ifndef::community[]
* You have a Red Hat account.
endif::[]
* You have a running Kafka instance in {product-kafka} with a topic called `my-topic`.
* You've installed the latest version of the `rhoas` CLI. See {base-url}{installation-guide-url-cli}[Installing and configuring the rhoas CLI^].

// Condition out QS-only content so that it doesn't appear in docs.
// All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
ifdef::qs[]
[#description]
====
Learn how to use the `rhoas` command-line interface (CLI) to produce and consume messages for a Kafka instance.
====

[#introduction]
====
Welcome to the quick start for producing and consuming Kafka messages using the `rhoas` command-line interface (CLI).

In this quick start, you'll use a CLI command to produce messages to different topic partitions in a Kafka instance. You'll then use the {product-long-kafka} web console to inspect the messages. When you're ready, you'll use another CLI command to consume the messages.
====
endif::[]

[id="proc-building-connector-configuration{context}"]
== Building a connector configuration

[role="_abstract"]
In this task, you create a configuration file that you can use to create a {product-long-connectors} instance.

.Procedure
. Log in to the `rhoas` CLI.
+
[source]
----
$ rhoas login
----

. Start building a configuration for a {product-long-connectors} instance.
+
[source,subs="+quotes"]
----
$ rhoas connector build --type=data_generator_0.1
----
+
You're prompted to enter details based on the Connectors instance type that you specified.

. Enter `my-topic` as the topic names value.

. Accept the default Content Type by pressing Enter when prompted.

. Enter `Hello World!` as the message value.

. Accept the default Period.

.Verification
ifdef::qs[]
* Is there a file called `connector.json` in the current working directory?
endif::[]
ifndef::qs[]
* Verify that the connector-specific configuration is set in the `connector.json` file.
endif::[]

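The exact contents of `connector.json` depend on the connector type and the CLI version, so do not treat the following as the authoritative schema. It is only a hypothetical sketch of how the values you entered at the prompts might appear in the generated file; every field name shown here is an illustrative assumption:

```json
{
  "topic": "my-topic",
  "contentType": "application/octet-stream",
  "message": "Hello World!",
  "period": 1000
}
```

If your generated file looks different, trust the generated file. The point is only that each answer you gave during the build step is captured as a key in the JSON document.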
[id="proc-create-connector{context}"]
== Creating a {product-long-connectors} instance from the rhoas CLI

[role="_abstract"]
After you have built a configuration, you can create the Connectors instance from the `rhoas` CLI.

.Prerequisites
* You have built a Connectors configuration and saved it locally as `connector.json`.
* You have a running {product-long-kafka} instance with a topic called `my-topic`.
* You have created a service account that has read and write access to the Kafka topic, and you have saved its credentials for use.

.Procedure
. Create an evaluation namespace for the Connectors instance to use.
+
[source,subs="+quotes"]
----
$ rhoas namespace create
----

. Run the create command and pass in the current configuration.
+
[source,subs="+quotes"]
----
$ rhoas connector create --file=connector.json
----

. When prompted, enter the details to create the {product-connectors} instance:
+
--
* Enter `my-connector` as the name of the connector.
* Select the namespace that you created.
* If prompted, select the Kafka instance that contains the topic that you want to send messages to.
* Enter the client ID of your service account.
* Enter the client secret of your service account.
--

. After the {product-connectors} instance is running, run the following command to check whether the connector is producing messages as configured:
+
[source,subs="+quotes"]
----
$ rhoas kafka topic consume --name=my-topic --partition=0 --wait
----

.Verification
* Is there a running Connectors instance called `my-connector`?
* Are messages being received as expected?
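The verification above is manual. If you redirect the consumer output to a file (for example, `rhoas kafka topic consume --name=my-topic --partition=0 > messages.jsonl`), a short script can check it programmatically. This is a sketch under one loud assumption: that the saved output contains one JSON record per line with a `value` field holding the message payload. That shape is an assumption about the CLI output, not documented behavior, so adjust the parsing to match what you actually see:

```python
import json


def count_expected_messages(lines, expected_value="Hello World!"):
    """Count consumed records whose payload matches the configured message.

    Assumes one JSON record per line with a "value" field holding the
    payload -- an assumption about the consumer output format.
    """
    count = 0
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines between records
        record = json.loads(line)
        if record.get("value") == expected_value:
            count += 1
    return count


# Example with sample records shaped like the assumed output:
sample = [
    '{"partition": 0, "offset": 0, "value": "Hello World!"}',
    '{"partition": 0, "offset": 1, "value": "Hello World!"}',
]
print(count_expected_messages(sample))  # 2
```

If the count matches the number of periods that have elapsed since the connector started, the data generator is producing on schedule.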

ifdef::qs[]
[#conclusion]
====
Congratulations! You successfully completed the quick start for creating a Connectors instance with the `rhoas` CLI.
====
endif::[]

ifdef::parent-context[:context: {parent-context}]
ifndef::parent-context[:!context:]