
Commit e9754c1: alt text and code formatting
1 parent: d4ba50a

File tree: 1 file changed (+56 −56 lines)

articles/event-hubs/schema-registry-kafka-java-send-receive-quickstart.md

Lines changed: 56 additions & 56 deletions
@@ -13,7 +13,7 @@ In this quickstart guide, we explore how to validate event from Apache Kafka app
 
 In this use case, a Kafka producer application uses the Avro schema stored in Azure Schema Registry to serialize events and publish them to a Kafka topic/event hub in Azure Event Hubs. The Kafka consumer deserializes the events that it consumes from Event Hubs. For that, it uses the schema ID of the event and the Avro schema, which is stored in Azure Schema Registry.
 
-:::image type="content" source="./media/schema-registry-overview/kafka-avro.png" alt-text="Schema serialization/de-serialization for Kafka applications using Avro schema." border="false":::
+:::image type="content" source="./media/schema-registry-overview/kafka-avro.png" alt-text="Diagram showing schema serialization/de-serialization for Kafka applications using Avro schema." border="false":::
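The serialize-with-schema-ID flow described above can be sketched as a toy, JDK-only example. The `SchemaIdFraming` class, its `serialize`/`deserialize` helpers, and the in-memory registry map are illustrative assumptions, not the Azure Schema Registry wire format; in the real samples the Azure SDK serializers handle schema registration and lookup for you.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class SchemaIdFraming {
    // Toy in-memory "registry": schema ID -> schema text. The real Azure
    // Schema Registry is a remote service addressed via the Azure SDK.
    static final Map<Integer, String> REGISTRY = new HashMap<>();

    // Producer side: prefix the serialized body with a 4-byte schema ID so
    // consumers can look the schema up before deserializing.
    static byte[] serialize(int schemaId, String payload) {
        byte[] body = payload.getBytes(StandardCharsets.UTF_8);
        return ByteBuffer.allocate(4 + body.length).putInt(schemaId).put(body).array();
    }

    // Consumer side: read the schema ID, resolve the schema, decode the body.
    static String deserialize(byte[] message) {
        ByteBuffer buf = ByteBuffer.wrap(message);
        int schemaId = buf.getInt();
        if (!REGISTRY.containsKey(schemaId)) {
            throw new IllegalStateException("unknown schema ID " + schemaId);
        }
        byte[] body = new byte[buf.remaining()];
        buf.get(body);
        return new String(body, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        REGISTRY.put(1, "{\"type\": \"record\", \"name\": \"Order\"}");
        byte[] message = serialize(1, "{\"id\": \"ID-0\"}");
        System.out.println(deserialize(message));
    }
}
```

The point of the ID prefix is that the consumer never needs the schema shipped with each event; it fetches (and caches) the schema by ID, which is what makes the registry-based approach cheaper than embedding the full Avro schema in every message.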
@@ -108,88 +108,88 @@ To update the Kafka Producer configuration, navigate to *azure-schema-registry-f
     client.secret=<>
 1. Follow the same instructions and update the *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer* configuration as well.
 1. For both the Kafka producer and consumer applications, the following Avro schema is used:
-```avro
-{
-  "namespace": "com.azure.schemaregistry.samples",
-  "type": "record",
-  "name": "Order",
-  "fields": [
-    {
-      "name": "id",
-      "type": "string"
-    },
-    {
-      "name": "amount",
-      "type": "double"
-    },
-    {
-      "name": "description",
-      "type": "string"
-    }
-  ]
-}
-```
+    ```avro
+    {
+      "namespace": "com.azure.schemaregistry.samples",
+      "type": "record",
+      "name": "Order",
+      "fields": [
+        {
+          "name": "id",
+          "type": "string"
+        },
+        {
+          "name": "amount",
+          "type": "double"
+        },
+        {
+          "name": "description",
+          "type": "string"
+        }
+      ]
+    }
+    ```
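For context, the `client.secret=<>` line shown in the hunk above is one entry in the sample's properties file. A hedged sketch of what that file typically contains follows; the exact key names and file location can differ by sample version, and every value here is a placeholder, not a real endpoint or credential.

```properties
# Event Hubs Kafka endpoint (placeholder namespace)
bootstrap.servers=<your-namespace>.servicebus.windows.net:9093

# Azure Schema Registry endpoint and schema group (placeholders)
schema.registry.url=https://<your-namespace>.servicebus.windows.net
schema.group=<your-schema-group>

# Application credentials used by the samples (placeholders)
tenant.id=<your-tenant-id>
client.id=<your-client-id>
client.secret=<your-client-secret>
```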
 
 ## Using Kafka producer with Avro schema validation
 To run the Kafka producer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-producer*.
 
 1. You can run the producer application so that it produces Avro specific records or generic records. For specific records mode, you first need to generate the classes against the producer schema using the following Maven command:
-```shell
-mvn generate-sources
-```
+    ```shell
+    mvn generate-sources
+    ```
 
 1. Then you can run the producer application using the following commands.
 
-```shell
-mvn clean package
-mvn -e clean compile exec:java -Dexec.mainClass="com.azure.schemaregistry.samples.producer.App"
-```
+    ```shell
+    mvn clean package
+    mvn -e clean compile exec:java -Dexec.mainClass="com.azure.schemaregistry.samples.producer.App"
+    ```
 
 1. Upon successful execution, the producer application prompts you to choose a producer scenario. For this quickstart, choose option *1 - produce Avro SpecificRecords*.
 
-```shell
-Enter case number:
-1 - produce Avro SpecificRecords
-2 - produce Avro GenericRecords
-```
+    ```shell
+    Enter case number:
+    1 - produce Avro SpecificRecords
+    2 - produce Avro GenericRecords
+    ```
 
 1. Upon successful data serialization and publishing, you should see the following console logs in your producer application:
 
-```shell
-INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}
-INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-1", "amount": 11.0, "description": "Sample order 1"}
-INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-2", "amount": 12.0, "description": "Sample order 2"}
-```
+    ```shell
+    INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}
+    INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-1", "amount": 11.0, "description": "Sample order 1"}
+    INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-2", "amount": 12.0, "description": "Sample order 2"}
+    ```
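The three log lines follow a simple pattern: id `ID-<i>`, amount `10.0 + i`, description `Sample order <i>`. A toy, JDK-only sketch of those payloads is below; the `OrderPayloads` class and its `samplePayloads` helper are hypothetical, not part of the sample repo, and no Kafka or Avro code is involved.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

public class OrderPayloads {
    // Rebuilds the order payloads shown in the producer logs:
    // id "ID-<i>", amount 10.0 + i, description "Sample order <i>".
    static List<String> samplePayloads(int count) {
        List<String> payloads = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            payloads.add(String.format(Locale.US,
                "{\"id\": \"ID-%d\", \"amount\": %.1f, \"description\": \"Sample order %d\"}",
                i, 10.0 + i, i));
        }
        return payloads;
    }

    public static void main(String[] args) {
        // Mirrors the "Sent Order ..." log lines from the sample producer.
        for (String payload : samplePayloads(3)) {
            System.out.println("Sent Order " + payload);
        }
    }
}
```

Comparing your own producer's output against this pattern is a quick way to confirm that all three records were serialized and published in order.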
 
 ## Using Kafka consumer with Avro schema validation
 To run the Kafka consumer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer*.
 
 1. You can run the consumer application so that it consumes Avro specific records or generic records. For specific records mode, you first need to generate the classes against the producer schema using the following Maven command:
-```shell
-mvn generate-sources
-```
+    ```shell
+    mvn generate-sources
+    ```
 
 1. Then you can run the consumer application using the following command.
 
-```shell
-mvn clean package
-mvn -e clean compile exec:java -Dexec.mainClass="com.azure.schemaregistry.samples.consumer.App"
-```
+    ```shell
+    mvn clean package
+    mvn -e clean compile exec:java -Dexec.mainClass="com.azure.schemaregistry.samples.consumer.App"
+    ```
 
 1. Upon successful execution, the consumer application prompts you to choose a consumer scenario. For this quickstart, choose option *1 - consume Avro SpecificRecords*.
 
-```shell
-Enter case number:
-1 - consume Avro SpecificRecords
-2 - consume Avro GenericRecords
-```
+    ```shell
+    Enter case number:
+    1 - consume Avro SpecificRecords
+    2 - consume Avro GenericRecords
+    ```
 
 1. Upon successful data consumption and deserialization, you should see the following console logs in your consumer application:
 
-```shell
-INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}
-INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-1", "amount": 11.0, "description": "Sample order 1"}
-INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-2", "amount": 12.0, "description": "Sample order 2"}
-```
+    ```shell
+    INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}
+    INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-1", "amount": 11.0, "description": "Sample order 1"}
+    INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-2", "amount": 12.0, "description": "Sample order 2"}
+    ```
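The *SpecificRecords* versus *GenericRecords* menu choice above boils down to typed, schema-generated classes versus name-based field lookup at run time. A toy, JDK-only illustration follows; the nested `Order` class here is a stand-in for the class `mvn generate-sources` would emit, and real code would use Avro's `SpecificRecord`/`GenericRecord` types rather than a plain `Map`.

```java
import java.util.HashMap;
import java.util.Map;

public class SpecificVsGeneric {
    // Stand-in for the Avro-generated Order class ("specific record" style):
    // fields are typed and checked at compile time.
    static class Order {
        final String id;
        final double amount;
        final String description;

        Order(String id, double amount, String description) {
            this.id = id;
            this.amount = amount;
            this.description = description;
        }
    }

    public static void main(String[] args) {
        // Specific-record style: typed fields, generated from the schema.
        Order specific = new Order("ID-0", 10.0, "Sample order 0");
        System.out.println("specific: " + specific.id + ", " + specific.amount);

        // Generic-record style: fields resolved by name at run time, as with
        // Avro's GenericRecord; a mistyped field name fails only when it runs.
        Map<String, Object> generic = new HashMap<>();
        generic.put("id", "ID-0");
        generic.put("amount", 10.0);
        generic.put("description", "Sample order 0");
        System.out.println("generic: " + generic.get("id") + ", " + generic.get("amount"));
    }
}
```

Specific records are the usual choice when the schema is fixed at build time, as in this quickstart; generic records suit tooling that must handle schemas it discovers at run time.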
 
 ## Clean up resources
 Delete the Event Hubs namespace or delete the resource group that contains the namespace.
