Commit d4ba50a: alt text and code formatting
1 parent 1c4a319

1 file changed: +56 -55 lines changed

articles/event-hubs/schema-registry-json-schema-kafka.md
This tutorial walks you through a scenario where you use JSON schemas to serialize and deserialize events using Azure Schema Registry in Event Hubs.

In this use case, a Kafka producer application uses a JSON schema stored in Azure Schema Registry to serialize events and publish them to a Kafka topic/event hub in Azure Event Hubs. The Kafka consumer deserializes the events that it consumes from Event Hubs, using the schema ID of the event and the matching JSON schema stored in Azure Schema Registry.

:::image type="content" source="./media/schema-registry-overview/kafka-json-schema.png" alt-text="Diagram showing the schema serialization/de-serialization for Kafka applications using JSON schema." border="false":::
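The produce/consume flow described above can be sketched in a few lines. This is a conceptual, standalone Python sketch only: the in-memory registry, the `serialize`/`deserialize` helpers, and the envelope layout are illustrative stand-ins, not the Azure SDK's actual APIs or wire format.

```python
import json

# Hypothetical in-memory stand-in for Azure Schema Registry:
# maps schema IDs to JSON schema documents.
SCHEMA_REGISTRY = {}

def register_schema(schema_id: str, schema: dict) -> None:
    SCHEMA_REGISTRY[schema_id] = schema

def serialize(event: dict, schema_id: str) -> bytes:
    # Producer side: tag the payload with the schema ID so consumers
    # can look up the exact schema that was used to write the event.
    envelope = {"schemaId": schema_id, "payload": event}
    return json.dumps(envelope).encode("utf-8")

def deserialize(data: bytes) -> tuple[dict, dict]:
    # Consumer side: read the schema ID, fetch the matching schema
    # from the registry, and return both the event and its schema.
    envelope = json.loads(data.decode("utf-8"))
    schema = SCHEMA_REGISTRY[envelope["schemaId"]]
    return envelope["payload"], schema

register_schema("abc123", {"title": "CustomerInvoice", "type": "object"})
raw = serialize({"invoiceId": "Invoice 0", "transactionValueUsd": 0}, "abc123")
event, schema = deserialize(raw)
print(event["invoiceId"])   # Invoice 0
print(schema["title"])      # CustomerInvoice
```

The key design point, which the real serializers share, is that the event carries only the schema ID, not the schema itself, so payloads stay small while consumers can always resolve the exact writer schema.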
## Prerequisites
If you're new to Azure Event Hubs, see [Event Hubs overview](event-hubs-about.md) before you do this quickstart.
To update the Kafka producer configuration, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-producer*.

    client.secret=<>

1. Follow the same instructions and update the *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer* configuration as well.
1. For both the Kafka producer and consumer applications, the following JSON schema is used:
    ```json
    {
        "$id": "https://example.com/person.schema.json",
        "$schema": "https://json-schema.org/draft/2020-12/schema",
        "title": "CustomerInvoice",
        "type": "object",
        "properties": {
            "invoiceId": {
                "type": "string"
            },
            "merchantId": {
                "type": "string"
            },
            "transactionValueUsd": {
                "type": "integer"
            },
            "userId": {
                "type": "string"
            }
        }
    }
    ```
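To see what this schema accepts and rejects, here is a minimal, hand-rolled validation sketch in standard-library Python. The `check` helper is illustrative only (it covers just the `type` keyword used by this schema) and is not part of the Azure SDK or the sample code; a real validator such as the `jsonschema` package supports the full keyword set.

```python
# The CustomerInvoice schema from this article.
invoice_schema = {
    "$id": "https://example.com/person.schema.json",
    "$schema": "https://json-schema.org/draft/2020-12/schema",
    "title": "CustomerInvoice",
    "type": "object",
    "properties": {
        "invoiceId": {"type": "string"},
        "merchantId": {"type": "string"},
        "transactionValueUsd": {"type": "integer"},
        "userId": {"type": "string"},
    },
}

# Map the JSON Schema type names used above to Python types.
TYPE_MAP = {"string": str, "integer": int, "object": dict}

def check(instance: dict, schema: dict) -> bool:
    # Minimal check: the instance and each present property must match
    # the declared "type". (No required/additionalProperties handling.)
    if not isinstance(instance, TYPE_MAP[schema["type"]]):
        return False
    for name, sub in schema.get("properties", {}).items():
        if name in instance and not isinstance(instance[name], TYPE_MAP[sub["type"]]):
            return False
    return True

valid = {"invoiceId": "Invoice 0", "merchantId": "Merchant Id 0",
         "transactionValueUsd": 0, "userId": "User Id 0"}
invalid = {"invoiceId": 42}   # invoiceId must be a string

print(check(valid, invoice_schema))    # True
print(check(invalid, invoice_schema))  # False
```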
## Using Kafka producer with JSON schema validation

To run the Kafka producer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-producer*.

1. You can run the producer application so that it produces either specific records or generic records. For specific-records mode, you first need to generate the classes against the producer schema by using the following Maven command:

    ```shell
    mvn generate-sources
    ```
1. Then you can run the producer application by using the following commands:

    ```shell
    mvn clean package
    mvn -e clean compile exec:java -Dexec.mainClass="com.azure.schemaregistry.samples.producer.App"
    ```
1. Upon successful execution, the producer application prompts you to choose the producer scenario. For this quickstart, choose option *1 - produce SpecificRecords*:

    ```shell
    Enter case number:
    1 - produce SpecificRecords
    ```
1. Upon successful data serialization and publishing, you should see the following console logs in your producer application:

    ```shell
    INFO com.azure.schemaregistry.samples.producer.KafkaJsonSpecificRecord - Sent Order Invoice 0
    INFO com.azure.schemaregistry.samples.producer.KafkaJsonSpecificRecord - Sent Order Invoice 1
    INFO com.azure.schemaregistry.samples.producer.KafkaJsonSpecificRecord - Sent Order Invoice 2
    ```
## Using Kafka consumer with JSON schema validation

To run the Kafka consumer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer*.

1. You can run the consumer application so that it consumes either specific records or generic records. For specific-records mode, you first need to generate the classes against the producer schema by using the following Maven command:

    ```shell
    mvn generate-sources
    ```
1. Then you can run the consumer application by using the following commands:

    ```shell
    mvn clean package
    mvn -e clean compile exec:java -Dexec.mainClass="com.azure.schemaregistry.samples.consumer.App"
    ```
1. Upon successful execution, the consumer application prompts you to choose the consumer scenario. For this quickstart, choose option *1 - consume SpecificRecords*:

    ```shell
    Enter case number:
    1 - consume SpecificRecords
    ```
1. Upon successful data consumption and deserialization, you should see the following console logs in your consumer application:

    ```shell
    INFO com.azure.schemaregistry.samples.consumer.KafkaJsonSpecificRecord - Invoice received: {invoiceId=Invoice 0, merchantId=Merchant Id 0, transactionValueUsd=0, userId=User Id 0}
    INFO com.azure.schemaregistry.samples.consumer.KafkaJsonSpecificRecord - Invoice received: {invoiceId=Invoice 1, merchantId=Merchant Id 1, transactionValueUsd=1, userId=User Id 1}
    INFO com.azure.schemaregistry.samples.consumer.KafkaJsonSpecificRecord - Invoice received: {invoiceId=Invoice 2, merchantId=Merchant Id 2, transactionValueUsd=2, userId=User Id 2}
    ```
## Clean up resources

Delete the Event Hubs namespace, or delete the resource group that contains the namespace.
