This tutorial walks you through a scenario where you use JSON schemas to serialize and deserialize events using Azure Schema Registry in Event Hubs.
In this use case, a Kafka producer application uses a JSON schema stored in Azure Schema Registry to serialize events and publish them to a Kafka topic/event hub in Azure Event Hubs. The Kafka consumer deserializes the events that it consumes from Event Hubs. To do so, it uses the schema ID of the event and the JSON schema, which is stored in Azure Schema Registry.

:::image type="content" source="./media/schema-registry-overview/kafka-json-schema.png" alt-text="Diagram showing the schema serialization/de-serialization for Kafka applications using JSON schema." border="false":::
## Prerequisites
If you're new to Azure Event Hubs, see [Event Hubs overview](event-hubs-about.md) before you do this quickstart.
client.secret=<>
1. Follow the same instructions and update the *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer* configuration as well.
1. For both the Kafka producer and consumer applications, the following JSON schema is used:
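The samples work with order-invoice events. As a rough, hypothetical sketch only (the authoritative schema ships with the sample repository), a draft-04 JSON schema covering the invoice fields that appear in the consumer logs later in this tutorial might look like:

```json
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "title": "Invoice",
  "type": "object",
  "properties": {
    "invoiceId": { "type": "string" },
    "merchantId": { "type": "string" },
    "transactionValueUsd": { "type": "integer" },
    "userId": { "type": "string" }
  },
  "required": [ "invoiceId", "merchantId", "transactionValueUsd", "userId" ]
}
```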
## Using Kafka producer with JSON schema validation
To run the Kafka producer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-producer*.
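Before running the steps below, it can help to see how such a producer is typically wired together. The following is a minimal, hypothetical sketch of building the producer configuration; the serializer class name and the Schema Registry property names are assumptions modeled on the sample repository's conventions, not verbatim from it:

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Builds Kafka producer settings for the Event Hubs Kafka endpoint with a
    // Schema Registry JSON serializer. Placeholder values in angle brackets
    // must be replaced with your own namespace and schema group.
    public static Properties buildConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<namespace>.servicebus.windows.net:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "OAUTHBEARER");
        // Keys are plain strings; values are serialized against the JSON
        // schema registered in Azure Schema Registry.
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
            "com.microsoft.azure.schemaregistry.kafka.json.KafkaJsonSerializer");
        // Schema Registry settings (property names are assumptions here).
        props.put("schema.registry.url", "https://<namespace>.servicebus.windows.net");
        props.put("schema.group", "<schema-group>");
        props.put("auto.register.schemas", "true");
        return props;
    }
}
```

The OAUTHBEARER mechanism matches the client-credential settings (such as `client.secret`) configured earlier in this tutorial.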
1. You can run the producer application so that it produces either specific records or generic records. For specific records mode, you first need to generate the classes against the producer schema by using the following Maven command:
   ```shell
   mvn generate-sources
   ```
1. Then you can run the producer application using the following commands.
1. When the producer application runs successfully, it prompts you to choose a producer scenario. For this quickstart, choose option *1 - produce SpecificRecords*.
   ```shell
   Enter case number:
   1 - produce SpecificRecords
   ```
152
152
153
1. Upon successful data serialization and publishing, you should see the following console logs in your producer application:
   ```shell
   INFO com.azure.schemaregistry.samples.producer.KafkaJsonSpecificRecord - Sent Order Invoice 0
   INFO com.azure.schemaregistry.samples.producer.KafkaJsonSpecificRecord - Sent Order Invoice 1
   INFO com.azure.schemaregistry.samples.producer.KafkaJsonSpecificRecord - Sent Order Invoice 2
   ```
## Using Kafka consumer with JSON schema validation
To run the Kafka consumer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer*.
1. You can run the consumer application so that it consumes either specific records or generic records. For specific records mode, you first need to generate the classes against the producer schema by using the following Maven command:
   ```shell
   mvn generate-sources
   ```
1. Then you can run the consumer application using the following command.
1. When the consumer application runs successfully, it prompts you to choose a consumer scenario. For this quickstart, choose option *1 - consume SpecificRecords*.
   ```shell
   Enter case number:
   1 - consume SpecificRecords
   ```
1. Upon successful data consumption and deserialization, you should see the following console logs in your consumer application:
   ```shell
   INFO com.azure.schemaregistry.samples.consumer.KafkaJsonSpecificRecord - Invoice received: {invoiceId=Invoice 0, merchantId=Merchant Id 0, transactionValueUsd=0, userId=User Id 0}
   INFO com.azure.schemaregistry.samples.consumer.KafkaJsonSpecificRecord - Invoice received: {invoiceId=Invoice 1, merchantId=Merchant Id 1, transactionValueUsd=1, userId=User Id 1}
   INFO com.azure.schemaregistry.samples.consumer.KafkaJsonSpecificRecord - Invoice received: {invoiceId=Invoice 2, merchantId=Merchant Id 2, transactionValueUsd=2, userId=User Id 2}
   ```
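The log format shows the deserialized value as a class whose `toString` prints its fields. As a hypothetical illustration only (the real class is generated from the JSON schema by `mvn generate-sources`), such an invoice class might look like:

```java
// Hypothetical POJO mirroring the fields in the consumer log output.
// The sample repo generates the actual class from the registered schema;
// this sketch only illustrates its likely shape.
public class Invoice {
    private final String invoiceId;
    private final String merchantId;
    private final int transactionValueUsd;
    private final String userId;

    public Invoice(String invoiceId, String merchantId,
                   int transactionValueUsd, String userId) {
        this.invoiceId = invoiceId;
        this.merchantId = merchantId;
        this.transactionValueUsd = transactionValueUsd;
        this.userId = userId;
    }

    public String getInvoiceId() { return invoiceId; }

    @Override
    public String toString() {
        // Matches the map-style format seen in the consumer logs.
        return "{invoiceId=" + invoiceId + ", merchantId=" + merchantId
            + ", transactionValueUsd=" + transactionValueUsd
            + ", userId=" + userId + "}";
    }
}
```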
## Clean up resources
Delete the Event Hubs namespace or delete the resource group that contains the namespace.