Commit 0d1c747 (1 parent: 313e2e0)

Docs: Add documentation for Kafka message browser (#424)

* Initial content draft
* Updates per latest UI design
* Further updates per UI changes
* Further edits to text
* Update attributes
* Update docs to reflect staged version of UI
* Update attributes
* Adjust proc structure
* Updates from peer review

1 file changed: 158 additions, 0 deletions
////
START GENERATED ATTRIBUTES
WARNING: This content is generated by running npm --prefix .build run generate:attributes
////

//OpenShift Application Services
:org-name: Application Services
:product-long-rhoas: OpenShift Application Services
:community:
:imagesdir: ./images
:property-file-name: app-services.properties
:samples-git-repo: https://github.com/redhat-developer/app-services-guides
:base-url: https://github.com/redhat-developer/app-services-guides/tree/main/docs/

//OpenShift Application Services CLI
:rhoas-cli-base-url: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:rhoas-cli-ref-url: commands
:rhoas-cli-installation-url: rhoas/rhoas-cli-installation/README.adoc

//OpenShift Streams for Apache Kafka
:product-long-kafka: OpenShift Streams for Apache Kafka
:product-kafka: Streams for Apache Kafka
:product-version-kafka: 1
:service-url-kafka: https://console.redhat.com/application-services/streams/
:getting-started-url-kafka: kafka/getting-started-kafka/README.adoc
:kafka-bin-scripts-url-kafka: kafka/kafka-bin-scripts-kafka/README.adoc
:kafkacat-url-kafka: kafka/kcat-kafka/README.adoc
:quarkus-url-kafka: kafka/quarkus-kafka/README.adoc
:nodejs-url-kafka: kafka/nodejs-kafka/README.adoc
:rhoas-cli-getting-started-url-kafka: kafka/rhoas-cli-getting-started-kafka/README.adoc
:topic-config-url-kafka: kafka/topic-configuration-kafka/README.adoc
:consumer-config-url-kafka: kafka/consumer-configuration-kafka/README.adoc
:access-mgmt-url-kafka: kafka/access-mgmt-kafka/README.adoc
:metrics-monitoring-url-kafka: kafka/metrics-monitoring-kafka/README.adoc
:service-binding-url-kafka: kafka/service-binding-kafka/README.adoc

//OpenShift Service Registry
:product-long-registry: OpenShift Service Registry
:product-registry: Service Registry
:registry: Service Registry
:product-version-registry: 1
:service-url-registry: https://console.redhat.com/application-services/service-registry/
:getting-started-url-registry: registry/getting-started-registry/README.adoc
:quarkus-url-registry: registry/quarkus-registry/README.adoc
:rhoas-cli-getting-started-url-registry: registry/rhoas-cli-getting-started-registry/README.adoc
:access-mgmt-url-registry: registry/access-mgmt-registry/README.adoc
:content-rules-registry: https://access.redhat.com/documentation/en-us/red_hat_openshift_service_registry/1/guide/9b0fdf14-f0d6-4d7f-8637-3ac9e2069817[Supported Service Registry content and rules]
:service-binding-url-registry: registry/service-binding-registry/README.adoc

//OpenShift Connectors
:product-long-connectors: OpenShift Connectors
:service-url-connectors: https://console.redhat.com/application-services/connectors
////
END GENERATED ATTRIBUTES
////
[id="chap-browsing-messages"]
= Browsing messages in the {product-long-kafka} web console
ifdef::context[:parent-context: {context}]
:context: browsing-messages

// Purpose statement for the assembly
[role="_abstract"]

As a developer or administrator, you can use the {product-long-kafka} web console to view and inspect messages for a Kafka topic. You might use this functionality, for example, to verify that a client is producing messages to the expected topic partition, that your topic is storing messages correctly, or that messages have the expected content.

When you select a topic in the console, you can use the *Messages* tab to view a list of messages for that topic. You can filter the list of messages in the following ways:

* Specify a partition and see messages sent to the partition.
* Specify a partition and an offset and see messages sent to the partition from that offset.
* Specify a partition and a timestamp (that is, a date and time) value and see messages sent to the partition from that date and time.
* Specify a partition and a Unix epoch timestamp value and see messages sent to the partition from that epoch timestamp value.

//Additional line break to resolve mod docs generation error.
[id="proc-browsing-messages-for-a-topic_{context}"]
== Browsing messages for a topic

The following procedure shows how to filter and inspect a list of messages for a topic in the {product-kafka} web console.

.Prerequisites

* You have a Kafka instance with a topic that contains some messages. To learn how to create your _first_ Kafka instance and topic and then send messages to the topic that will appear on the *Messages* page, see the following guides:
+
** {base-url}{getting-started-url-kafka}[_Getting started with {product-long-kafka}_^]
** {base-url}{kafka-bin-scripts-url-kafka}[_Configuring and connecting Kafka scripts with {product-long-kafka}_^]
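The linked guides walk through producing messages with the Kafka bin scripts. As a quick illustration, you can send a few test messages from the command line. This is a sketch, not the guides' exact commands; it assumes the `kafka-console-producer.sh` script from an Apache Kafka distribution, a hypothetical topic named `my-topic`, and a properties file holding your instance's connection and SASL settings:

```shell
# Sketch: produce test messages so that they appear on the Messages page.
# BOOTSTRAP_SERVER and app-services.properties are placeholders; take the
# real values from your Kafka instance's connection information.
./kafka-console-producer.sh \
  --topic my-topic \
  --bootstrap-server "$BOOTSTRAP_SERVER" \
  --producer.config app-services.properties
```

Each line that you type at the producer prompt is sent to the topic as one message.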
.Procedure

. In the {product-kafka} web console, click *Kafka Instances* in the left navigation menu.
. On the *Kafka Instances* page, click a Kafka instance.
. In your Kafka instance, click the *Topics* tab.
. In the topics table, click a Kafka topic that you want to inspect.
. In your topic, click the *Messages* tab.
+
By default, the *Messages* page shows messages in *partition 0* of your topic. You can change this partition value, as described later in the procedure.
+
The messages table includes columns for the following topic and message properties:
+
--
* Partition
* Offset
* Timestamp (date and time)
* Key
* Headers
* Value
--
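+
The same properties can also be inspected from the command line with the Kafka bin scripts. This is a sketch rather than the console's own mechanism; it assumes `kafka-console-consumer.sh` from a recent Apache Kafka distribution and the same hypothetical topic and properties file as in the prerequisites:

```shell
# Sketch: print partition, offset, timestamp, key, headers, and value for
# each record, mirroring the Messages table columns. The print.partition,
# print.offset, and print.headers options require Apache Kafka 2.7 or
# later (KIP-431); verify with kafka-console-consumer.sh --help.
./kafka-console-consumer.sh \
  --topic my-topic \
  --bootstrap-server "$BOOTSTRAP_SERVER" \
  --consumer.config app-services.properties \
  --from-beginning \
  --property print.partition=true \
  --property print.offset=true \
  --property print.timestamp=true \
  --property print.key=true \
  --property print.headers=true
```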
. To see complete data for a message, click *Show more* in the *Value*, *Key*, or *Headers* column.
+
A *Message* pane opens to show the complete message data. This pane also shows the epoch timestamp value.
+
[NOTE]
--
If you're using a schema in {product-long-registry} with your topic, the *Messages* page does not deserialize messages that a producer application serialized to conform to that schema. To view such messages, you must configure a consumer application to use a Kafka deserializer. For more information, see https://access.redhat.com/documentation/en-us/red_hat_integration/2021.q3/html-single/service_registry_user_guide/index#configuring-kafka-client-serdes[Configuring Kafka serializers/deserializers in Java clients^].

Similarly, if a message is encoded (for example, in a format such as UTF-8 or Base64), the *Messages* page does not decode the message.
--
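+
For example, if a message value shown in the *Message* pane is Base64-encoded, you can copy it and decode it locally. A minimal sketch using the standard `base64` utility (the sample string is hypothetical):

```shell
# Decode a Base64-encoded message value copied from the Message pane.
# GNU coreutils syntax; on macOS, use `base64 -D` instead of --decode.
printf 'SGVsbG8sIEthZmthIQ==' | base64 --decode
# → Hello, Kafka!
```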
. To copy the full message value or header data, click the copy icon next to the data in the *Message* pane.

. To see messages for a different topic partition, select a new value in the *Partition* drop-down menu.
+
NOTE: If you have many partitions, you can filter the list shown in the drop-down menu by typing a value in the field.

. To further refine the list of messages in the table, use the filter controls at the top of the *Messages* page.
+
--
* To filter messages by topic partition and offset, perform the following actions:
... In the *Partition* field, select a topic partition.
... In the drop-down menu that shows a default value of `Offset`, keep this default value.
... In the *Offset* field, type an offset value.
... To apply your filter settings, click the search (magnifying glass) icon.

* To filter messages by topic partition and date and time, perform the following actions:
... In the *Partition* field, select a topic partition.
... In the drop-down menu that shows a default value of `Offset`, change the value to `Timestamp`.
+
Additional selection tools appear.
... Use the additional selection tools to set date and time values. Alternatively, type a date and time value in the format shown in the field.
... To apply your filter settings, click the search (magnifying glass) icon.

* To filter messages by topic partition and epoch timestamp, perform the following actions:
... In the *Partition* field, select a topic partition.
... In the drop-down menu that shows a default value of `Offset`, change the value to `Epoch timestamp`.
... In the *Epoch timestamp* field, type or paste an epoch timestamp value.
+
NOTE: You can easily convert a human-readable date and time to an epoch value using a https://www.epochconverter.com/[timestamp conversion tool^].
... To apply your filter settings, click the search (magnifying glass) icon.
--
+
Based on your filter settings, the *Messages* page automatically reloads the list of messages in the table.

. To clear your existing offset, timestamp, or epoch timestamp selections and revert to seeing the latest messages in the selected partition, select `Latest messages` in the drop-down menu that has a default value of `Offset`.
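As an alternative to the online conversion tool mentioned in the procedure, you can compute the epoch value for a given date and time locally. A minimal sketch using GNU `date` (the sample date is arbitrary; check whether the *Epoch timestamp* field expects seconds or milliseconds):

```shell
# Sketch: convert a human-readable UTC date and time to a Unix epoch value.
# GNU date syntax; the example date is arbitrary.
date -u -d '2023-03-01 12:00:00' +%s    # seconds since 1970-01-01 UTC

# Multiply by 1000 if the field expects milliseconds:
echo $(( $(date -u -d '2023-03-01 12:00:00' +%s) * 1000 ))
```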
ifdef::parent-context[:context: {parent-context}]
ifndef::parent-context[:!context:]
