Mehmet Ali Tilgen

What is Celery?


As is well known, HTTP is a protocol based on a request-response cycle between client and server. When developing web applications, managing this cycle as efficiently as possible is a critical goal: the ideal is to give the user a meaningful and accurate response as soon as possible. However, this request-response cycle does not always run smoothly; users sometimes experience delays, incorrect responses, or errors due to system load. This is where task queuing tools like Celery come in, helping to increase performance and improve user experience in web applications by handling heavy work in the background.

To understand this better, let’s take a simple example. Let’s say we have a function that analyzes the photos uploaded by users and then sends the analysis results via email:

from django.http import JsonResponse

def photo_analysis_view(request):
    user = request.user
    photo_analysis = analyze_photo(user=user)  # runs the (slow) analysis synchronously
    send_photo_analysis_email(photo_analysis=photo_analysis, user=user)  # blocks until the email is sent
    return JsonResponse({"message": "Your photo analysis has been sent to your email."})


When we analyze the execution of this function, we need to remember that Python processes code line by line. So the send_photo_analysis_email function will not run until analyze_photo is complete. Let’s assume that send_photo_analysis_email takes 5 minutes on average. In this case, the user’s browser would have to wait 5 minutes for a response, resulting in a very bad user experience.
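
To make the blocking behavior concrete, here is a minimal, self-contained sketch (not part of the demo itself) in which time.sleep stands in for the slow email step:

import time

def slow_email_step():
    time.sleep(300)  # stands in for an email send that takes ~5 minutes

start = time.time()
slow_email_step()  # Python blocks on this call; nothing below runs until it returns
print(f"Response ready after {time.time() - start:.0f} seconds")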

This is where Celery comes in.

What is Celery?
Celery is a popular open-source library that enables asynchronous task management and processing in Python programs. It is used to handle long-running or workload-intensive tasks efficiently and to process work in parallel. The main goal of Celery is to distribute the workload to dedicated workers so that the main program does not have to wait, improving system performance. In this way, we can execute time-consuming operations that are not directly part of the request-response cycle in the background, and run them as discrete, concurrent jobs by assigning them to one or more workers.

Celery Architecture and Operating Principle
Celery’s architecture, as an implementation of a distributed messaging system, consists of three basic components: a producer or application that creates and sends messages, a broker or queue that stores messages and forwards them, and consumers or workers that perform specific operations based on the messages they receive.


This architecture plays an important role in managing and distributing tasks in web applications. The message sender (producer) is usually a web application or a block of code that implements a specific business logic. This application generates the workload as messages and adds these messages to a queue through a broker (usually Redis or RabbitMQ). The broker reliably stores these messages and forwards them to the appropriate consumers (workers) for processing.

Workers are independent units that receive these messages and perform the specified operations in the background. These can be time-consuming tasks such as sending emails, processing data, analyzing large files or interacting with external APIs. Workers can process multiple messages at the same time, allowing the workload to be handled in parallel.

The interaction between these components allows the workload to be managed and distributed in a centralized way. Thus, the main application can respond quickly to user requests, while the complex and intensive work in the background is handled by the workers. With this structure, Celery offers both a scalable and flexible solution, providing great advantages in terms of performance optimization and user experience in web applications.
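
To illustrate how the three roles interact, here is a minimal, self-contained sketch; the module name, the add task and the Redis URLs are illustrative, not part of the demo below. A result backend is optional, but it lets the producer read back the worker's return value:

from celery import Celery

# Broker and result backend both point at Redis in this sketch.
app = Celery("arch_demo", broker="redis://localhost:6379/0", backend="redis://localhost:6379/1")

@app.task
def add(x, y):
    return x + y

# Producer side: .delay() serializes the call into a message and hands it to the broker.
result = add.delay(2, 3)       # returns an AsyncResult immediately, without waiting

# A worker started separately (e.g. `celery -A <module> worker`) picks the message up
# from the broker, runs the task, and stores the return value in the result backend.
print(result.ready())          # False until a worker has finished the task
print(result.get(timeout=10))  # 5, read back from the result backend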

Let’s walk through a step-by-step demo that turns the photo_analysis_view function into a background task using Celery.

A Simple Demo

In this demo, we will take a Django web application that analyzes photos uploaded by users and sends the analysis results via email, and move the long-running email-sending step into the background with Celery to improve the user experience.

Redis Setup

First, Celery needs a broker. We run Redis with Docker:

$ docker container run --rm -p 7055:6379 -d --name celery_demo_broker redis:alpine


This command runs Redis and exposes it on host port 7055 (mapped to the container’s default port 6379).
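
As an optional sanity check, you can ping the container from Python with the redis client (it is pulled in by celery[redis] in the next step):

import redis

# Host and port match the docker -p 7055:6379 mapping above.
broker = redis.Redis(host="localhost", port=7055, db=0)
print(broker.ping())  # True if the Redis container is reachable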

Installing Python Dependencies

$ pip install "celery[redis]"


Celery Configuration and Defining Tasks

We configure Celery by creating a main.py file and defining our task in it:

from celery import Celery

# The broker URL points at the Redis container we started on port 7055.
app = Celery('celery_demo', broker='redis://localhost:7055/0')

@app.task
def send_photo_analysis_email_task(photo_analysis, user_email):
    # Task arguments are serialized (JSON by default), so we pass the plain email
    # address rather than a Django User object.
    print(f"Sending email to {user_email} with analysis result: {photo_analysis}")
    return f"Email sent to {user_email} with analysis result."
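
Before wiring the task into the view, note that it can be queued from any Python code once a worker is running; the argument values below are made up for illustration. .delay() is the shorthand, while .apply_async() exposes extra options such as countdown:

from main import send_photo_analysis_email_task

# .delay() serializes the call into a message and puts it on the Redis queue.
send_photo_analysis_email_task.delay(photo_analysis="blurry, low light", user_email="user@example.com")

# .apply_async() takes the same arguments plus scheduling options;
# countdown=60 asks the worker to wait 60 seconds before executing.
send_photo_analysis_email_task.apply_async(
    kwargs={"photo_analysis": "blurry, low light", "user_email": "user@example.com"},
    countdown=60,
)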

Updating the Django View

In your Django application, you can now update the photo analysis view so that it hands the email-sending step off to the background. To do this, we call the Celery task we defined in main.py:

from django.http import JsonResponse

from main import send_photo_analysis_email_task

def photo_analysis_view(request):
    user = request.user
    photo_analysis = analyze_photo(user=user)
    # .delay() queues the task in Redis and returns immediately instead of blocking.
    send_photo_analysis_email_task.delay(photo_analysis=photo_analysis, user_email=user.email)
    return JsonResponse({"message": "Your photo analysis has been sent to your email."})
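
An optional refinement, not part of the original snippet: .delay() returns an AsyncResult, so the view can also hand the task id back to the client, which is useful if you later want to poll the task’s status (this requires configuring a result backend on the Celery app):

from django.http import JsonResponse

from main import send_photo_analysis_email_task

def photo_analysis_view(request):
    user = request.user
    photo_analysis = analyze_photo(user=user)  # same helper as above
    result = send_photo_analysis_email_task.delay(
        photo_analysis=photo_analysis, user_email=user.email
    )
    # result.id can be stored or returned so the client can check progress later.
    return JsonResponse({
        "message": "Your photo analysis will be emailed to you.",
        "task_id": result.id,
    })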

Starting Celery Worker

Celery workers are independent processes that take the tasks you define and run them. To start a worker, we use the following command:

$ celery --app main.app worker --loglevel=info


This command starts a worker using the Celery application named app defined in main.py. The worker listens for tasks sent to Redis and executes them.
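
As a quick, optional check that the worker is up and has registered our task, you can query it from a Python shell through the broker (this assumes main.py is importable from where you run the shell):

from main import app

# control.inspect() asks live workers for information over the broker;
# registered() returns a mapping of worker name -> registered task names
# (or None if no worker responds).
print(app.control.inspect().registered())
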
In this demo, we saw how to handle a time-consuming email-sending process in the background using Django and Celery. Redis serves as the broker, and Celery workers process the queued tasks in the background, allowing the main application to return a quick response to the user.

We have covered Celery, a library that has earned its place in the Python world. Don’t forget to follow our articles.
