DivyanshuLohani

Choosing Redis Caching Over Kafka for a Grocery Delivery App

Why Consider Kafka for the App?

As a Django developer, my experience primarily lies in developing monolithic services. With a limited user base, I never felt the need to explore microservices. However, when designing scalable systems, Kafka often emerges as a preferred solution due to its ability to handle high-throughput data pipelines and enable seamless communication in distributed architectures.

Monolithic vs. Microservice Architectures

Is a Monolithic Architecture Sufficient for This App?

Yes, a monolithic architecture is sufficient for this app. The target user base is expected to generate only a few hundred thousand requests, which does not demand the complexity of microservices. A monolithic setup ensures simplicity, easier maintenance, and faster development—all key factors for a project of this scale.

Why Not Use Kafka?

Kafka is incredibly powerful in microservice architectures, where it enables event-driven designs: clients connect via WebSockets, listen for events, and respond in real time. Since my app doesn't require event-based handling logic, I couldn't utilize its full potential. The app relies on polling to retrieve data from the server, which makes Kafka's event-driven capabilities unnecessary for this use case. Introducing Kafka would also have added complexity the app does not need at this stage, such as managing producers and consumers.
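
To make the contrast concrete, here is a minimal sketch of the polling model. Every name in it is illustrative, not from the actual app; `fetch_order_status` stands in for the HTTP call the client would make.

```python
# Minimal sketch of the polling model used instead of event-driven push.
# `fetch_order_status` is a hypothetical stand-in for the client's HTTP call.
import time

def poll_order_status(order_id, fetch_order_status, interval=5.0, max_polls=60):
    """Repeatedly ask the server for order status until it is terminal."""
    for _ in range(max_polls):
        status = fetch_order_status(order_id)
        if status in ("delivered", "cancelled"):
            return status
        time.sleep(interval)  # wait before asking again
    return "timeout"
```

The trade-off is simplicity over latency: the client learns about changes only at the next poll, but no broker, producer, or consumer infrastructure is needed.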

Event-Driven System

Why I Chose Django Caching and Redis

Django’s Caching System

Django’s built-in caching framework is robust and ready to use with libraries like django-redis. It integrates seamlessly with Django views and the ORM, making it ideal for improving performance. Specifically, it helps optimize:

  • Product Data: Frequently accessed product details are cached for faster retrieval.
  • Product Listings: Category pages and search results load faster by reducing database queries.
  • Driver Location Updates: location data is stored directly in the cache without ever touching the database.
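
The use cases above follow the cache-aside pattern. The sketch below shows it with a tiny in-memory stand-in so it runs without a Redis server; in the real app, `cache` would be `django.core.cache.cache` backed by Redis, and the key names and TTLs shown are assumptions.

```python
# Cache-aside sketch for product data and driver locations.
# InMemoryCache mimics the subset of Django's cache API used here;
# the real app would import `cache` from django.core.cache instead.
import time

class InMemoryCache:
    """Minimal stand-in exposing Django-style get/set with a timeout."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        value, expires = self._store.get(key, (None, 0))
        return value if time.time() < expires else None

    def set(self, key, value, timeout=300):
        self._store[key] = (value, time.time() + timeout)

cache = InMemoryCache()

def get_product(product_id, fetch_from_db):
    """Serve product data from the cache, falling back to the database."""
    key = f"product:{product_id}"
    data = cache.get(key)
    if data is None:
        data = fetch_from_db(product_id)
        cache.set(key, data, timeout=900)  # assumed 15-minute TTL
    return data

def update_driver_location(driver_id, lat, lng):
    """Store the latest driver location in the cache only -- no DB write."""
    cache.set(f"driver:{driver_id}:location", {"lat": lat, "lng": lng},
              timeout=300)
```

A second read of the same product never reaches the database until the TTL expires, which is where most of the performance win comes from.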

How Is Redis Used for Caching?

Redis was chosen as the caching backend for its speed and efficiency. Here’s how it’s used:

  • Order Data Storage: Order IDs act as keys, and values contain all relevant order data such as the order ID, location of the order, and the driver location (if assigned).
  • Reduced Database Load: By caching frequent queries, Redis minimizes the need for repeated database reads and writes.
  • Enhanced Performance: Data is served directly from Redis, improving response times and user experience.
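
The order-storage scheme above can be sketched as follows. A plain dict stands in for Redis so the example is self-contained; in the app these would be `cache.set` / `cache.get` calls, and the key format and field names are illustrative assumptions.

```python
# Sketch of the order-caching scheme: the order ID forms the key, and the
# value bundles everything needed to show order status, including the
# driver's location once one is assigned. A dict stands in for Redis.
redis_store = {}

def cache_order(order_id, pickup_location, driver_location=None):
    """Write the full order snapshot under a single key."""
    redis_store[f"order:{order_id}"] = {
        "order_id": order_id,
        "location": pickup_location,
        "driver_location": driver_location,  # None until a driver is assigned
    }

def assign_driver(order_id, driver_location):
    """Update only the driver location on an already-cached order."""
    redis_store[f"order:{order_id}"]["driver_location"] = driver_location

def get_order(order_id):
    """Polled by the client to refresh order status -- no database read."""
    return redis_store.get(f"order:{order_id}")
```

Because the client polls `get_order`, every status refresh is served straight from the cache and the database is never consulted on the hot path.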

Redis Usage

What About Future Scalability?

While Redis and Django caching meet the current needs, the system is designed to scale as the user base grows. Future enhancements could include:

  • Django Channels: Adding real-time communication features using WebSockets.
  • Background Workers: Employing tools like Celery for handling background tasks.

Is Kafka Still Relevant?

Kafka is a robust and scalable solution tailored for large-scale, event-driven systems and microservice architectures. It excels at managing high-throughput data pipelines and ensures reliable communication across distributed systems. For this grocery delivery app, however, Redis was chosen due to its simplicity and alignment with the project’s requirements, which focus on faster development and efficient caching in a monolithic setup. If the app's user base grows significantly or its architecture shifts towards microservices, transitioning to Kafka or adopting a hybrid solution can be explored to handle increased complexity and scale effectively.

Conclusion

This case study demonstrates why Redis caching was chosen over Kafka for the grocery delivery app. The decision was driven by the app’s scale, monolithic architecture, and performance needs, ensuring a seamless and efficient experience for users without unnecessary complexity.
