
Spring Food Delivery Microservices (Spring Kafka + Confluent)

  • Each service (restaurantservice, orderservice, notificationservice) has:
    • Spring Boot application class
    • spring-kafka integration and a KafkaConfig that supports Confluent Cloud (SASL/SSL) and optional Schema Registry (Avro)
    • Kafka handler classes that use KafkaTemplate<String, Object> and accept Object payloads (schema optional)
  • orderservice includes a small REST endpoint POST /orders that publishes an OrderDto to the order-created topic so you can test the pipeline easily.
  • pom.xml files were updated to include optional Confluent dependencies (kafka-avro-serializer, kafka-schema-registry-client) and the Confluent Maven repository (placeholders kept).
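As a rough illustration, the OrderDto payload accepted by POST /orders can be modeled as a plain value object. The field names below are taken from the sample curl payload in the quick-test section; the repository's actual class may differ.

```java
// Hypothetical sketch of the OrderDto published to the order-created topic.
// Field names mirror the sample curl payload; the repo's real DTO may differ.
public record OrderDto(String orderId, String restaurantId, String items, int totalCents) {}
```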

How Confluent integration is wired (placeholders)

  • The services read the following properties from src/main/resources/application.properties:

    • spring.kafka.bootstrap-servers (e.g. pkc-xxxxx.confluent.cloud:9092)
    • spring.kafka.properties.security.protocol (set to SASL_SSL for Confluent Cloud)
    • spring.kafka.properties.sasl.mechanism (set to PLAIN)
    • spring.kafka.properties.sasl.jaas.config (example pattern below)
  • Optional Schema Registry settings (only needed if you want Avro):

    • schema.registry.url (e.g. https://<SR_HOST>)
    • schema.registry.basic.auth.credentials.source (usually USER_INFO)
    • schema.registry.basic.auth.user.info (e.g. <SR_API_KEY>:<SR_API_SECRET>)

How schema-optional behavior works in the code

  • KafkaConfig will configure producers/consumers with String serializer/deserializer by default.
  • If schema.registry.url is set, KafkaConfig will switch producer/consumer value serializer/deserializer to Confluent's Avro (KafkaAvroSerializer/KafkaAvroDeserializer) and set the schema registry URL and auth properties.
  • Kafka templates and listener container factories are built with Object-typed values to allow either String payloads or Avro objects.
  • Handlers in each service accept Object and use KafkaTemplate<String, Object>.
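The switch described above can be sketched as plain property-map logic. The serializer class names are the standard Kafka/Confluent ones, but the helper itself is illustrative, not the repo's actual KafkaConfig:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative helper mirroring the schema-optional behavior described above:
// String serialization by default, Avro + Schema Registry when a URL is configured.
public class ValueSerdeSelector {
    static final String STRING_SERIALIZER =
            "org.apache.kafka.common.serialization.StringSerializer";
    static final String AVRO_SERIALIZER =
            "io.confluent.kafka.serializers.KafkaAvroSerializer";

    /** Producer value-serializer settings; schemaRegistryUrl may be null or blank. */
    public static Map<String, Object> producerValueProps(String schemaRegistryUrl) {
        Map<String, Object> props = new HashMap<>();
        if (schemaRegistryUrl == null || schemaRegistryUrl.isBlank()) {
            // Default path: plain String payloads, no registry involved.
            props.put("value.serializer", STRING_SERIALIZER);
        } else {
            // Avro path: Confluent serializer plus the registry endpoint.
            props.put("value.serializer", AVRO_SERIALIZER);
            props.put("schema.registry.url", schemaRegistryUrl);
        }
        return props;
    }
}
```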

Where to put your Confluent credentials

  • Option A (file): Edit the application.properties in each service and replace the <API_KEY>, <API_SECRET>, and schema registry placeholders.
  • Option B (env vars): Run the service with system properties or environment variable expansion (recommended in production).
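For Option B, Spring's standard `${...}` property placeholders let application.properties resolve values from the environment at startup. The variable names below are illustrative, not ones defined by the repo:

```properties
# application.properties pulls credentials from the environment at startup
spring.kafka.properties.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="${KAFKA_API_KEY}" password="${KAFKA_API_SECRET}";
```

Then set the variables before launching (cmd.exe):

```bat
set KAFKA_API_KEY=<API_KEY>
set KAFKA_API_SECRET=<API_SECRET>
mvnw.cmd spring-boot:run
```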

Windows (cmd.exe) build & run commands

  • Build a single service (from the service directory):
    cd \spring-food-delivery-microservices\orderservice
    mvnw.cmd -DskipTests package
  • Run the packaged JAR (from the service's target directory):
    cd \spring-food-delivery-microservices\orderservice\target
    java -jar orderservice-0.0.1-SNAPSHOT.jar
  • Or run with Maven in-place (useful during development):
    cd \spring-food-delivery-microservices\orderservice
    mvnw.cmd spring-boot:run

Sample application.properties snippets (placeholders)

  • In orderservice/src/main/resources/application.properties (already present):
    spring.kafka.bootstrap-servers=localhost:9092
    spring.kafka.properties.security.protocol=SASL_SSL
    spring.kafka.properties.sasl.mechanism=PLAIN
    spring.kafka.properties.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";

    # Optional schema registry
    schema.registry.url=https://<SCHEMA_REGISTRY_URL>
    schema.registry.basic.auth.credentials.source=USER_INFO
    schema.registry.basic.auth.user.info=<SR_API_KEY>:<SR_API_SECRET>

Quick test (creates an order and publishes to Kafka)

  • Start your Kafka stack (local Kafka or set Confluent Cloud credentials). Then run the orderservice and execute:
curl -X POST -H "Content-Type: application/json" -d "{\"orderId\":\"o-1\",\"restaurantId\":\"r-1\",\"items\":\"burger,fries\",\"totalCents\":1599}" http://localhost:8081/orders
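One way to confirm the event landed is Kafka's bundled console consumer. This assumes a local broker and a standard Kafka installation; the script path varies by install, and Confluent Cloud additionally needs the SASL_SSL client properties passed via a config file:

```bat
bin\windows\kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic order-created --from-beginning
```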

About

This repository contains a microservices-based food delivery backend using Spring Boot and Spring Kafka, with optional Confluent Cloud/Schema Registry integration.
