- Each service (`restaurantservice`, `orderservice`, `notificationservice`) has:
  - a Spring Boot application class
  - `spring-kafka` integration and a `KafkaConfig` that supports Confluent Cloud (SASL/SSL) and optional Schema Registry (Avro)
  - Kafka handler classes that use `KafkaTemplate<String, Object>` and accept `Object` payloads (schema optional)
- `orderservice` includes a small REST endpoint, `POST /orders`, that publishes an `OrderDto` to the `order-created` topic so you can test the pipeline easily.
- The `pom.xml` files were updated to include the optional Confluent dependencies (`kafka-avro-serializer`, `kafka-schema-registry-client`) and the Confluent Maven repository (placeholders kept).
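For reference, the optional Confluent pieces typically look like this in a service `pom.xml` (the repository URL and artifact coordinates are the standard Confluent ones; the version is deliberately left as a placeholder, matching the repo's convention):

```xml
<!-- Confluent artifacts are not on Maven Central -->
<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version><!-- <CONFLUENT_VERSION> --></version>
  </dependency>
  <dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-schema-registry-client</artifactId>
    <version><!-- <CONFLUENT_VERSION> --></version>
  </dependency>
</dependencies>
```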
## How Confluent integration is wired (placeholders)

- The services read the following properties from `src/main/resources/application.properties`:
  - `spring.kafka.bootstrap-servers` (e.g. `pkc-xxxxx.confluent.cloud:9092`)
  - `spring.kafka.properties.security.protocol` (set to `SASL_SSL` for Confluent Cloud)
  - `spring.kafka.properties.sasl.mechanism` (set to `PLAIN`)
  - `spring.kafka.properties.sasl.jaas.config` (example pattern below)
- Optional Schema Registry (Avro) settings, if you want Avro + Schema Registry:
  - `schema.registry.url` (e.g. `https://<SR_HOST>`)
  - `schema.registry.basic.auth.credentials.source` (usually `USER_INFO`)
  - `schema.registry.basic.auth.user.info` (e.g. `<SR_API_KEY>:<SR_API_SECRET>`)
## How schema-optional behavior works in the code

- `KafkaConfig` configures producers and consumers with String serializers/deserializers by default.
- If `schema.registry.url` is set, `KafkaConfig` switches the producer/consumer value serializer/deserializer to Confluent's Avro classes (`KafkaAvroSerializer`/`KafkaAvroDeserializer`) and sets the schema registry URL and auth properties.
- Kafka templates and listener container factories are built with `Object`-typed values, so payloads can be either Strings or Avro objects.
- Handlers in each service accept `Object` and use `KafkaTemplate<String, Object>`.
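The serializer switch described above can be sketched as plain property-map construction. This is a hypothetical helper, not the repo's actual `KafkaConfig` (which wraps the same idea in Spring's producer factory); the serializer class names, however, are the real Kafka and Confluent classes:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the schema-optional serializer selection: String values by
// default, Confluent Avro when a Schema Registry URL is configured.
public class ProducerPropsSketch {

    static Map<String, Object> producerProps(String bootstrapServers, String schemaRegistryUrl) {
        Map<String, Object> props = new HashMap<>();
        props.put("bootstrap.servers", bootstrapServers);
        // Keys are always plain strings.
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        if (schemaRegistryUrl == null || schemaRegistryUrl.isEmpty()) {
            // No registry configured: schema-less String payloads.
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        } else {
            // Registry configured: switch values to Confluent Avro.
            props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", schemaRegistryUrl);
        }
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps("localhost:9092", null).get("value.serializer"));
        System.out.println(producerProps("localhost:9092", "https://sr.example").get("value.serializer"));
    }
}
```

The consumer side follows the same pattern with the matching deserializer classes.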
## Where to put your Confluent credentials

- Option A (file): edit `application.properties` in each service and replace the `<API_KEY>`, `<API_SECRET>`, and schema registry placeholders.
- Option B (env vars): run the service with system properties or environment-variable expansion (recommended in production).
## Windows (cmd.exe) build & run commands

- Build a single service (from the service directory):

  ```
  cd \spring-food-delivery-microservices\orderservice
  mvnw.cmd -DskipTests package
  ```

- Run the packaged JAR:

  ```
  cd \spring-food-delivery-microservices\orderservice\target
  java -jar orderservice-0.0.1-SNAPSHOT.jar
  ```

- Or run with Maven in-place (useful during development):

  ```
  cd \spring-food-delivery-microservices\orderservice
  mvnw.cmd spring-boot:run
  ```

## Sample application.properties snippets (placeholders)
- In `orderservice/src/main/resources/application.properties` (already present):

  ```properties
  spring.kafka.bootstrap-servers=localhost:9092
  spring.kafka.properties.security.protocol=SASL_SSL
  spring.kafka.properties.sasl.mechanism=PLAIN
  spring.kafka.properties.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";
  # Optional schema registry
  schema.registry.url=https://<SCHEMA_REGISTRY_URL>
  schema.registry.basic.auth.credentials.source=USER_INFO
  schema.registry.basic.auth.user.info=<SR_API_KEY>:<SR_API_SECRET>
  ```

  When targeting Confluent Cloud, also replace `localhost:9092` with your cluster's bootstrap endpoint (e.g. `pkc-xxxxx.confluent.cloud:9092`); the `SASL_SSL`/`PLAIN` settings apply there, not to an unauthenticated local broker.

## Quick test (creates an order and publishes to Kafka)
- Start your Kafka stack (local Kafka, or set your Confluent Cloud credentials), then run the `orderservice` and execute (inner quotes escaped for cmd.exe):

  ```
  curl -X POST -H "Content-Type: application/json" -d "{\"orderId\":\"o-1\",\"restaurantId\":\"r-1\",\"items\":\"burger,fries\",\"totalCents\":1599}" http://localhost:8081/orders
  ```