William
Event-Driven Microservices with Apache Kafka and Spring Boot: A Practical Guide

Introduction

In modern microservices architecture, event-driven systems play a crucial role in enabling scalable, decoupled, and efficient communication between services. Apache Kafka has emerged as a leading distributed event-streaming platform, widely used for real-time data processing.

In this article, we will explore how to build an event-driven microservices system using Java Spring Boot and Apache Kafka, with a practical example of decoupling order processing from inventory management.

Why Event-Driven Microservices?

Traditional REST-based communication introduces tight coupling between services, making them harder to scale and maintain. Event-driven microservices solve this problem by allowing services to communicate asynchronously using events.

Benefits of Event-Driven Architecture:

  • Decoupling: Services interact via events, reducing direct dependencies.
  • Scalability: Independent services can scale as needed.
  • Resilience: Failures in one service do not immediately impact others.
  • Flexibility: Easily extend systems by adding new consumers.
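
The decoupling idea can be made concrete with a tiny in-memory sketch in plain Java. This is an illustration of the publish/subscribe pattern only, not how Kafka itself works: the producer knows only the topic name, and new consumers can be attached without touching it.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Minimal in-memory illustration of publish/subscribe decoupling.
// Producers only know the topic name; consumers register independently.
public class EventBus {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    public void publish(String topic, String event) {
        // Every registered consumer receives the event; the publisher
        // does not know how many there are, or what they do with it.
        subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(event));
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        // Adding a new consumer requires no change on the producer side.
        bus.subscribe("order-topic", e -> System.out.println("inventory saw: " + e));
        bus.subscribe("order-topic", e -> System.out.println("billing saw: " + e));
        bus.publish("order-topic", "order-123");
    }
}
```

Kafka adds persistence, partitioning, and replication on top of this basic idea, which is what makes the pattern production-grade.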

Setting Up Apache Kafka

Before implementing our example, make sure you have Apache Kafka installed and running.

  1. Download and Extract Kafka:
  Head over to [Download Apache Kafka](https://kafka.apache.org/downloads) and extract the archive to a location of your choice. For this article, I created a folder called software on the local C: drive on Windows and renamed kafka_2.13-3.0.0 to just kafka.
  2. Start Zookeeper and the Kafka Broker:

   START ZOOKEEPER
   * Open a command prompt or terminal and cd into C:\software\kafka\bin\windows
   * Then run: zookeeper-server-start.bat ..\..\config\zookeeper.properties

   START THE KAFKA BROKER
   * Open another command prompt or terminal and cd into C:\software\kafka\bin\windows
   * Then run: kafka-server-start.bat ..\..\config\server.properties

Experiment with Kafka

  • Create a topic - with Zookeeper and the Kafka broker still running, open another command prompt instance at the same path and run this command to create a Kafka topic: kafka-topics.bat --bootstrap-server localhost:9092 --create --topic order-topic --partitions 3 --replication-factor 1
    • List topics - run this command to list Kafka topics: kafka-topics.bat --bootstrap-server localhost:9092 --list
    • Describe a topic - run this command to describe a Kafka topic: kafka-topics.bat --bootstrap-server localhost:9092 --describe --topic order-topic
    • Publish messages - open another command prompt instance and run this to start a console producer: kafka-console-producer.bat --bootstrap-server localhost:9092 --topic order-topic
    • Subscribe/consume messages - open another command prompt instance and run this to consume messages from the topic: kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic order-topic --from-beginning

Implementing the Practical Example

Let's implement a scenario where an Order Service processes orders, and an Inventory Service listens to order events to update stock levels.

Create Spring Boot project

  • Head over to Spring Starter template
  • Create a Spring Boot Kafka Producer (Order Service)
  • add the dependency spring web and kafka
 <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
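
Spring Boot auto-configures the KafkaTemplate and listener container from application.properties. A minimal configuration sketch for this example (producer settings belong in the Order Service, consumer settings in the Inventory Service; localhost:9092 matches the broker started earlier, and the string serializers shown are the defaults made explicit):

```properties
spring.kafka.bootstrap-servers=localhost:9092

# Order Service (producer)
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

# Inventory Service (consumer); the group id is also set on the @KafkaListener annotation
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.auto-offset-reset=earliest
```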


Create a Kafka Producer:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderProducer {
    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendOrderEvent(String message) {
        kafkaTemplate.send("order-topic", message);
    }
}

Expose an API to trigger order processing:

import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/orders")
public class OrderController {
    private final OrderProducer orderProducer;

    public OrderController(OrderProducer orderProducer) {
        this.orderProducer = orderProducer;
    }

    @PostMapping("/create")
    public String createOrder(@RequestBody String orderDetails) {
        orderProducer.sendOrderEvent(orderDetails);
        return "Order event sent!";
    }
}
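
The controller accepts the order details as a raw JSON string and forwards them unchanged. For illustration, a tiny hypothetical helper (not part of the service above) shows the flat payload shape the curl example later in this guide sends; a production service would use a typed Order class and a JSON library such as Jackson instead.

```java
// Hypothetical helper: builds the flat JSON payload used in this guide's examples.
public class OrderPayload {
    public static String toJson(String orderId, String productId, String quantity) {
        // Produces e.g. {"orderId": "123", "productId": "456", "quantity": "2"}
        return String.format(
            "{\"orderId\": \"%s\", \"productId\": \"%s\", \"quantity\": \"%s\"}",
            orderId, productId, quantity);
    }

    public static void main(String[] args) {
        System.out.println(toJson("123", "456", "2"));
    }
}
```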

2. Create a Spring Boot Kafka Consumer (Inventory Service)

Create a Kafka Consumer:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class InventoryConsumer {
    @KafkaListener(topics = "order-topic", groupId = "inventory-group")
    public void listenOrderEvents(ConsumerRecord<String, String> record) {
        System.out.println("Received Order Event: " + record.value());
        updateInventory(record.value());
    }

    private void updateInventory(String orderDetails) {
        // Simulate inventory update logic
        System.out.println("Inventory updated for order: " + orderDetails);
    }
}
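
Kafka delivers messages at-least-once by default, so the same order event can reach the listener more than once (for example after a consumer rebalance). A hedged sketch of how the inventory update could be made idempotent, using an in-memory set of processed order IDs; the extractOrderId helper is hypothetical, and a real service would persist this state rather than keep it in memory.

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: deduplicate order events so a redelivered message does not
// decrement stock twice. In-memory only; a real service would persist
// processed IDs (ideally in the same transaction as the stock update).
public class IdempotentInventory {
    private final Set<String> processed = ConcurrentHashMap.newKeySet();
    private static final Pattern ORDER_ID =
        Pattern.compile("\"orderId\"\\s*:\\s*\"([^\"]+)\"");

    /** Returns true if the event was applied, false if it was a duplicate. */
    public boolean updateInventory(String orderDetails) {
        String orderId = extractOrderId(orderDetails);
        if (!processed.add(orderId)) {
            return false; // already handled this order
        }
        System.out.println("Inventory updated for order: " + orderId);
        return true;
    }

    // Hypothetical helper: pulls orderId out of the JSON payload with a regex;
    // a real service would parse it with a JSON library such as Jackson.
    private String extractOrderId(String orderDetails) {
        Matcher m = ORDER_ID.matcher(orderDetails);
        return m.find() ? m.group(1) : orderDetails;
    }
}
```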

Running the Application

  1. Start Kafka and Zookeeper (if not already running).
  2. Run the Order Service: Start the producer microservice.
  3. Run the Inventory Service: Start the consumer microservice.
  4. Trigger an Order Event:
   curl -X POST http://localhost:8080/orders/create -H "Content-Type: application/json" -d '{"orderId": "123", "productId": "456", "quantity": "2"}'
  5. Check the Consumer Logs: You should see the order event being processed and the inventory updated.

Conclusion

Event-driven microservices with Apache Kafka provide a powerful way to build decoupled, scalable systems. In this guide, we demonstrated how to implement a simple order processing system using Kafka producers and consumers in Spring Boot.

By adopting event-driven design, you can create robust, loosely coupled microservices that scale efficiently while ensuring smooth communication between components.

If you find the article interesting and valuable, please drop a like and a comment. See you in the next one, keep building!
Follow me on my LinkedIn handle.

Top comments (1)

Patrick

ZooKeeper is marked as deprecated since the 3.5.0 release. ZooKeeper is planned to be removed in Apache Kafka 4.0. So why are you still using it?