*articles/event-hubs/schema-registry-kafka-java-send-receive-quickstart.md*
In this quickstart guide, we explore how to validate events from Apache Kafka applications using Azure Schema Registry.

In this use case, a Kafka producer application uses an Avro schema stored in Azure Schema Registry to serialize events and publish them to a Kafka topic/event hub in Azure Event Hubs. The Kafka consumer deserializes the events that it consumes from Event Hubs, using the event's schema ID to retrieve the Avro schema from Azure Schema Registry.
:::image type="content" source="./media/schema-registry-overview/kafka-avro.png" alt-text="Diagram showing schema serialization/de-serialization for Kafka applications using Avro schema." border="false":::
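Conceptually, the round trip in the diagram works like this: the producer registers the writer schema once and attaches its schema ID to every event; the consumer resolves the schema by ID before deserializing. The following is a minimal stand-alone sketch of that idea only — it is not the Azure SDK API, and JSON stands in for Avro binary encoding:

```python
# Conceptual sketch of a schema-registry-based round trip (not the Azure SDK API).
# The producer sends (schema_id, payload); the consumer looks the schema up by ID.
import json

class InMemorySchemaRegistry:
    """Toy stand-in for Azure Schema Registry: maps schema IDs to schemas."""
    def __init__(self):
        self._schemas = {}
        self._next_id = 1

    def register(self, schema: str) -> int:
        schema_id = self._next_id
        self._schemas[schema_id] = schema
        self._next_id += 1
        return schema_id

    def get(self, schema_id: int) -> str:
        return self._schemas[schema_id]

ORDER_SCHEMA = json.dumps({
    "namespace": "com.azure.schemaregistry.samples",
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "description", "type": "string"},
    ],
})

registry = InMemorySchemaRegistry()

# Producer side: register the schema, then attach its ID to every event.
schema_id = registry.register(ORDER_SCHEMA)
event = (schema_id,
         json.dumps({"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}))

# Consumer side: resolve the schema by ID before deserializing the payload.
received_id, payload = event
writer_schema = json.loads(registry.get(received_id))
order = json.loads(payload)
print(order["id"], order["amount"])
```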
To update the Kafka producer configuration, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-producer* and fill in the configuration values, including `client.secret=<>`.

1. Follow the same instructions and update the *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer* configuration as well.
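For reference, the producer and consumer configuration files hold the Event Hubs and Microsoft Entra connection settings along these lines. This is an illustrative sketch only — key names other than `client.secret` are assumptions; check the sample repository for the exact properties:

```properties
# Illustrative configuration sketch; exact key names may differ in the sample.
bootstrap.servers=<namespace>.servicebus.windows.net:9093
schema.registry.url=https://<namespace>.servicebus.windows.net
tenant.id=<>
client.id=<>
client.secret=<>
```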
1. For both the Kafka producer and consumer applications, the following Avro schema is used:

    ```avro
    {
        "namespace": "com.azure.schemaregistry.samples",
        "type": "record",
        "name": "Order",
        "fields": [
            {
                "name": "id",
                "type": "string"
            },
            {
                "name": "amount",
                "type": "double"
            },
            {
                "name": "description",
                "type": "string"
            }
        ]
    }
    ```
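As a quick sanity check of the record shape this schema describes, here is a small stand-alone sketch (plain Python, not the Avro library) that validates a sample order against the declared field names and types:

```python
import json

# The Order schema from the quickstart, as a plain dict.
ORDER_SCHEMA = json.loads("""
{
  "namespace": "com.azure.schemaregistry.samples",
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "description", "type": "string"}
  ]
}
""")

# Map Avro primitive type names to Python types for a minimal structural check.
AVRO_TO_PY = {"string": str, "double": float, "int": int, "boolean": bool}

def conforms(record: dict, schema: dict) -> bool:
    """Return True if the record has exactly the schema's fields with matching types."""
    fields = {f["name"]: f["type"] for f in schema["fields"]}
    if set(record) != set(fields):
        return False
    return all(isinstance(record[name], AVRO_TO_PY[typ])
               for name, typ in fields.items())

order = {"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}
print(conforms(order, ORDER_SCHEMA))        # True
print(conforms({"id": 42}, ORDER_SCHEMA))   # False: wrong type and missing fields
```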
## Using Kafka producer with Avro schema validation
To run the Kafka producer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-producer*.
1. You can run the producer application to produce either Avro specific records or generic records. For specific records mode, first generate the classes against the producer schema using the following Maven command:

    ```shell
    mvn generate-sources
    ```
1. Then you can run the producer application.
1. Upon successful execution, the producer application prompts you to choose a scenario. For this quickstart, choose option *1 - produce Avro SpecificRecords*.
    ```shell
    Enter case number:
    1 - produce Avro SpecificRecords
    2 - produce Avro GenericRecords
    ```
1. Upon successful data serialization and publishing, you should see the following console logs in your producer application:
    ```shell
    INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}
    INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-1", "amount": 11.0, "description": "Sample order 1"}
    INFO com.azure.schemaregistry.samples.producer.KafkaAvroSpecificRecord - Sent Order {"id": "ID-2", "amount": 12.0, "description": "Sample order 2"}
    ```
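The three log lines above follow a simple pattern. As a sketch of that pattern only (not the sample's actual Java code), the payloads can be reconstructed like this:

```python
# Reconstruct the order payloads visible in the producer logs above.
# Pattern: id "ID-<n>", amount 10.0 + n, description "Sample order <n>".
orders = [
    {"id": f"ID-{n}", "amount": 10.0 + n, "description": f"Sample order {n}"}
    for n in range(3)
]
for order in orders:
    print(f"Sent Order {order}")
```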
## Using Kafka consumer with Avro schema validation
To run the Kafka consumer application, navigate to *azure-schema-registry-for-kafka/tree/master/java/avro/samples/kafka-consumer*.
1. You can run the consumer application to consume either Avro specific records or generic records. For specific records mode, first generate the classes against the producer schema using the following Maven command:

    ```shell
    mvn generate-sources
    ```
1. Then you can run the consumer application.
1. Upon successful execution, the consumer application prompts you to choose a scenario. For this quickstart, choose option *1 - consume Avro SpecificRecords*.
    ```shell
    Enter case number:
    1 - consume Avro SpecificRecords
    2 - consume Avro GenericRecords
    ```
1. Upon successful data consumption and deserialization, you should see the following console logs in your consumer application:
    ```shell
    INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-0", "amount": 10.0, "description": "Sample order 0"}
    INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-1", "amount": 11.0, "description": "Sample order 1"}
    INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord - Order received: {"id": "ID-2", "amount": 12.0, "description": "Sample order 2"}
    ```
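If you need to post-process such logs, the JSON payload can be recovered from each line. A small illustrative parser (an assumption about your tooling, not part of the sample):

```python
import json
import re

# A consumer log line of the shape shown above.
LOG_LINE = ('INFO com.azure.schemaregistry.samples.consumer.KafkaAvroSpecificRecord'
            ' - Order received: {"id": "ID-1", "amount": 11.0, "description": "Sample order 1"}')

def parse_order(line: str) -> dict:
    """Extract the JSON order payload from a consumer log line."""
    match = re.search(r"Order received: (\{.*\})", line)
    if match is None:
        raise ValueError("not an order log line")
    return json.loads(match.group(1))

order = parse_order(LOG_LINE)
print(order["amount"])  # 11.0
```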
## Clean up resources
Delete the Event Hubs namespace or delete the resource group that contains the namespace.