Commit aefdfa0

docs: added quick start guide for produce and consume in rhoas cli (#552)

* docs: added quick start guide for produce and consume in rhoas cli
* fix: produce consume guide is now a draft

1 parent 1af8066 commit aefdfa0

2 files changed: +233 -0 lines changed

Lines changed: 208 additions & 0 deletions
////
START GENERATED ATTRIBUTES
WARNING: This content is generated by running npm --prefix .build run generate:attributes
////

//All OpenShift Application Services
:org-name: Application Services
:product-long-rhoas: OpenShift Application Services
:community:
:imagesdir: ./images
:property-file-name: app-services.properties
:samples-git-repo: https://github.com/redhat-developer/app-services-guides
:base-url: https://github.com/redhat-developer/app-services-guides/tree/main/docs/
:sso-token-url: https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
:installation-guide-url-cli: rhoas/rhoas-cli-installation/README.adoc
:service-contexts-url-cli: rhoas/rhoas-service-contexts/README.adoc

//OpenShift Streams for Apache Kafka
:product-long-kafka: OpenShift Streams for Apache Kafka
:product-kafka: Streams for Apache Kafka
:product-version-kafka: 1
:service-url-kafka: https://console.redhat.com/application-services/streams/
:getting-started-url-kafka: kafka/getting-started-kafka/README.adoc
:kafka-bin-scripts-url-kafka: kafka/kafka-bin-scripts-kafka/README.adoc
:kafkacat-url-kafka: kafka/kcat-kafka/README.adoc
:quarkus-url-kafka: kafka/quarkus-kafka/README.adoc
:nodejs-url-kafka: kafka/nodejs-kafka/README.adoc
:getting-started-rhoas-cli-url-kafka: kafka/rhoas-cli-getting-started-kafka/README.adoc
:topic-config-url-kafka: kafka/topic-configuration-kafka/README.adoc
:consumer-config-url-kafka: kafka/consumer-configuration-kafka/README.adoc
:access-mgmt-url-kafka: kafka/access-mgmt-kafka/README.adoc
:metrics-monitoring-url-kafka: kafka/metrics-monitoring-kafka/README.adoc
:service-binding-url-kafka: kafka/service-binding-kafka/README.adoc
:message-browsing-url-kafka: kafka/message-browsing-kafka/README.adoc

//OpenShift Service Registry
:product-long-registry: OpenShift Service Registry
:product-registry: Service Registry
:registry: Service Registry
:product-version-registry: 1
:service-url-registry: https://console.redhat.com/application-services/service-registry/
:getting-started-url-registry: registry/getting-started-registry/README.adoc
:quarkus-url-registry: registry/quarkus-registry/README.adoc
:getting-started-rhoas-cli-url-registry: registry/rhoas-cli-getting-started-registry/README.adoc
:access-mgmt-url-registry: registry/access-mgmt-registry/README.adoc
:content-rules-registry: https://access.redhat.com/documentation/en-us/red_hat_openshift_service_registry/1/guide/9b0fdf14-f0d6-4d7f-8637-3ac9e2069817[Supported Service Registry content and rules]
:service-binding-url-registry: registry/service-binding-registry/README.adoc

//OpenShift Connectors
:product-long-connectors: OpenShift Connectors
:product-connectors: Connectors
:product-version-connectors: 1
:service-url-connectors: https://console.redhat.com/application-services/connectors
:getting-started-url-connectors: connectors/getting-started-connectors/README.adoc

//OpenShift API Designer
:product-long-api-designer: OpenShift API Designer
:product-api-designer: API Designer
:product-version-api-designer: 1
:service-url-api-designer: https://console.redhat.com/application-services/api-designer/
:getting-started-url-api-designer: api-designer/getting-started-api-designer/README.adoc

//OpenShift API Management
:product-long-api-management: OpenShift API Management
:product-api-management: API Management
:product-version-api-management: 1
:service-url-api-management: https://console.redhat.com/application-services/api-management/

////
END GENERATED ATTRIBUTES
////

[id="chap-produce-consume-rhoas-cli"]
= Getting started with producing and consuming messages in the rhoas CLI for {product-long-kafka}
ifdef::context[:parent-context: {context}]
:context: getting-started-produce-consume

// Purpose statement for the assembly
[role="_abstract"]
As a developer of applications and services, you can use the rhoas CLI to produce and consume messages in {product-long-kafka} and third-party systems.

In this example, you produce messages to a Kafka instance and consume them in the rhoas CLI.

// Condition out QS-only content so that it doesn't appear in docs.
// All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
ifdef::qs[]
[#description]
====
Learn how to produce and consume messages in {product-long-rhoas}.
====

[#introduction]
====
Welcome to the quick start for producing and consuming messages in the rhoas CLI.

In this quick start, you learn how to produce messages to a Kafka instance and consume them in the rhoas CLI.
====
endif::[]

ifndef::qs[]
== Overview

{product-long-kafka} is a cloud service that simplifies the process of running Apache Kafka. Apache Kafka is an open-source, distributed, publish-subscribe messaging system for creating fault-tolerant, real-time data feeds.

You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.

The following diagram illustrates how data flows from a data source through a data source connector to a Kafka topic, and how data flows from a Kafka topic through a data sink connector to a data sink.

[.screencapture]
.{product-long-connectors} data flow
image::connectors-diagram.png[Illustration of data flow from data source through Kafka to data sink]
endif::[]

[id="proc-configure-kafka-instance_{context}"]
== Configuring the {product-kafka} instance

[role="_abstract"]
After you create a {product-kafka} instance, configure it by performing the following task:

* Create *Kafka topics* to store messages sent by producers and make them available to consumers.

For this example, you create one Kafka topic, named *test-topic*, which is used in all of the following commands.

ifdef::qs[]
.Prerequisites
* You've created a {product-kafka} instance and the instance is in the *Ready* state.
endif::[]

.Procedure
. Create a Kafka topic for your Kafka instance:
.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the name of the {product-kafka} instance that you created.
.. Select the *Topics* tab, and then click *Create topic*.
.. Type a unique name for your topic. For example, type *test-topic* for *Topic Name*.
.. Accept the default settings for message retention and replicas, but set the partition count to *2*.

ifdef::qs[]
.Verification
* Did you create a topic for the Kafka instance?
endif::[]

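If you prefer to stay in the terminal, the same topic can also be created with the rhoas CLI instead of the web console. A minimal sketch, assuming you have already run `rhoas login` and selected your Kafka instance:

[source,shell]
----
# Create the example topic with 2 partitions from the CLI
# (equivalent to the web-console steps above).
rhoas kafka topic create --name test-topic --partitions 2

# Confirm the topic exists and inspect its configuration.
rhoas kafka topic describe --name test-topic
----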
[id="proc-produce-message_{context}"]
== Producing a message to a {product-kafka} instance

[role="_abstract"]
You can produce your own messages from the CLI instead of using an application. This is very useful for testing and debugging your {product-kafka} instance.

.Prerequisites
. You're logged in to the OpenShift Application Services web console at {service-url-kafka}[^].
. You configured a {product-kafka} instance as described in _Configuring the {product-kafka} instance_.
. You're logged in to the rhoas CLI with your OpenShift Application Services account by using `rhoas login`.

.Procedure
. To produce a message to your Kafka topic, run `rhoas kafka topic produce --name=test-topic` and enter a value when prompted, for example, `Hello world!`.

. Read the message:
.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the name of the {product-kafka} instance that you created.
.. Select the *Topics* tab, and then click the name of your topic.
.. Select the *Messages* tab, and see the message that you created from the rhoas CLI.

. By default, any message you create is sent to partition *0*. To produce a message to partition *1*, run `rhoas kafka topic produce --name=test-topic --partition=1` and enter another value.

. Go back to the *Messages* tab for the topic and verify that your new message is on a different partition.
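The produce steps above can be summarized as one shell session. This is a sketch, assuming the *test-topic* topic exists and an active `rhoas login` session:

[source,shell]
----
# Produce a message to the default partition (0);
# the command prompts for the message value.
rhoas kafka topic produce --name=test-topic

# Produce another message, this time explicitly to partition 1.
rhoas kafka topic produce --name=test-topic --partition=1
----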

.Verification
* Did running the commands produce messages?
.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the Kafka instance that you created.
.. Click the *Topics* tab, and then click the topic that you specified for your {product-kafka} instance.
.. Click the *Messages* tab to see the `Hello world!` message.

[id="proc-consume-message_{context}"]
== Consuming messages from a {product-kafka} instance

[role="_abstract"]
You can consume your own messages from the CLI instead of using an application. This is very useful for testing and debugging your {product-kafka} instance.

.Prerequisites
. You're logged in to the OpenShift Application Services web console at {service-url-kafka}[^].
. You configured a {product-kafka} instance as described in _Configuring the {product-kafka} instance_.
. You're logged in to the rhoas CLI with your OpenShift Application Services account by using `rhoas login`.

.Procedure
. To consume messages from your Kafka topic, run `rhoas kafka topic consume --name=test-topic`. You now see all the messages that you produced to the topic.
. Just as with produce, set the `--partition` flag to consume from a specific partition. Run `rhoas kafka topic consume --name=test-topic --partition=1`. You now see all the messages that you produced to partition *1*.
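Likewise, the consume steps can be summarized as a short session. A sketch, assuming messages were already produced to *test-topic* as in the previous procedure:

[source,shell]
----
# Consume all messages from the topic, across all partitions.
rhoas kafka topic consume --name=test-topic

# Consume only the messages on partition 1.
rhoas kafka topic consume --name=test-topic --partition=1
----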

.Verification
* Does running the commands output messages on the correct partitions?

ifdef::qs[]
[#conclusion]
====
Congratulations! You successfully completed the producing and consuming messages in the rhoas CLI for {product-long-kafka} quick start.
====
endif::[]

ifdef::parent-context[:context: {parent-context}]
ifndef::parent-context[:!context:]
Lines changed: 25 additions & 0 deletions
apiVersion: console.openshift.io/v1
kind: QuickStarts
metadata:
  name: Produce and consume in Rhoas Cli
  annotations:
    draft: true
    order: 6
spec:
  version: 0.1
  type:
    text: Quick Start
    color: green
  displayName: !snippet/title README.adoc#chap-produce-consume-rhoas-cli
  durationMinutes: 20
  icon: data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHdpZHRoPSIzOCIgaGVpZ2h0PSIzOCIgdmlld0JveD0iMCAwIDM4IDM4Ij48ZGVmcz48c3R5bGU+LmF7ZmlsbDojZmZmO30uYntmaWxsOiNlMDA7fTwvc3R5bGU+PC9kZWZzPjxwYXRoIGQ9Ik0yOCwxSDEwYTksOSwwLDAsMC05LDlWMjhhOSw5LDAsMCwwLDksOUgyOGE5LDksMCwwLDAsOS05VjEwYTksOSwwLDAsMC05LTlaIi8+PHBhdGggY2xhc3M9ImEiIGQ9Ik0yMiwyNS42MjVIMTNhLjYyNS42MjUsMCwwLDEsMC0xLjI1aDlhMi4zNzUsMi4zNzUsMCwwLDAsMC00Ljc1SDE1YTMuNjI1LDMuNjI1LDAsMCwxLDAtNy4yNUgyNWEuNjI1LjYyNSwwLDAsMSwwLDEuMjVIMTVhMi4zNzUsMi4zNzUsMCwwLDAsMCw0Ljc1aDdhMy42MjUsMy42MjUsMCwwLDEsMCw3LjI1WiIvPjxwYXRoIGNsYXNzPSJiIiBkPSJNMjUsMTYuNjI1QTMuNjI1LDMuNjI1LDAsMSwxLDI4LjYyNSwxMywzLjYyODg2LDMuNjI4ODYsMCwwLDEsMjUsMTYuNjI1Wm0wLTZBMi4zNzUsMi4zNzUsMCwxLDAsMjcuMzc1LDEzLDIuMzc3NywyLjM3NzcsMCwwLDAsMjUsMTAuNjI1WiIvPjxwYXRoIGNsYXNzPSJiIiBkPSJNMTMsMjguNjI1QTMuNjI1LDMuNjI1LDAsMSwxLDE2LjYyNSwyNSwzLjYyODg2LDMuNjI4ODYsMCwwLDEsMTMsMjguNjI1Wm0wLTZBMi4zNzUsMi4zNzUsMCwxLDAsMTUuMzc1LDI1LDIuMzc3NywyLjM3NzcsMCwwLDAsMTMsMjIuNjI1WiIvPjwvc3ZnPg==
  description: !snippet README.adoc#description
  prerequisites:
    - A Red Hat identity
    - You've created a Kafka instance and the instance is in `Ready` state. See the _Getting started with OpenShift Streams for Apache Kafka_ quick start.
  introduction: !snippet README.adoc#introduction
  tasks:
    - !snippet/proc README.adoc#proc-configure-kafka-instance
    - !snippet/proc README.adoc#proc-produce-message
    - !snippet/proc README.adoc#proc-consume-message
  conclusion: !snippet README.adoc#conclusion
