Have you ever had a backend system that needed to perform time-consuming tasks, such as sending emails, processing data, or generating reports?
These tasks can take a long time to complete and tie up server resources, which can lead to slow response times and a poor user experience.
This is where task queuing comes in. Task queuing is the practice of offloading time-consuming tasks to a separate system, where they can be executed asynchronously in the background. This frees up server resources and allows your web application or backend system to continue responding quickly to user requests.
Celery is a popular Python library for task queuing that makes it easy to set up and manage a distributed task queue. It provides a simple yet powerful way to manage the execution of asynchronous tasks and can integrate with a wide variety of other Python libraries and frameworks.
In this article, we'll discuss what Celery is and write a simple task queue system using Celery & Django.
Let's jump into it!
What is Task Queuing
Task queuing is a concept that has been around for many years, and it has become an important technique for managing large-scale distributed systems. In a distributed system, there are many tasks that need to be performed, and these tasks can be time-consuming and require a lot of resources. By using task queuing, tasks can be distributed across multiple workers or servers, allowing them to be executed concurrently and efficiently.
One of the benefits of using a task queuing system is that it can improve the performance and scalability of your system. For example, if you have a web application that needs to generate a report for each user, you can use a task queue to distribute the report generation tasks across multiple workers. This can significantly reduce the time it takes to generate reports and make your application more responsive.
Task queuing also provides a way to handle errors and retries. If a task fails, it can be retried automatically, ensuring that it eventually completes successfully. This can be particularly useful when working with unreliable resources or when performing complex calculations that may fail due to errors or resource constraints.
What is Celery & How it Works
Celery is a popular Python-based task queuing system that has been around since 2009. It is designed to be easy to use, flexible, and scalable, making it a popular choice for both small and large-scale applications.
Celery works by using a combination of a message broker and a worker pool. The message broker is responsible for storing the tasks and messages that are sent between the workers, while the worker pool is responsible for executing the tasks.
When you define a task in Celery, you create a Python function and decorate it with the @celery.task decorator. When this function is called, Celery adds the task to the message broker, which can then be picked up by a worker and executed. Celery supports a variety of message brokers, including RabbitMQ, Redis, and Amazon SQS, allowing you to choose the one that best suits your needs.
Celery also provides a variety of features for monitoring and managing your tasks. For example, you can configure the maximum number of retries for a task, set a time limit for how long a task can run, and monitor the progress of your tasks using Celery's built-in monitoring tools or third-party tools like Flower.
By using Celery, you can greatly improve the performance and scalability of your Python applications and ensure that time-consuming tasks are executed efficiently and reliably in the background.
Hands-on Task Queuing using Django & Celery
In this section, we'll create a simple task queue using Django, Celery & RabbitMQ. Note that we need RabbitMQ as a message broker to send tasks from Django to Celery.
If you don't have RabbitMQ installed, you can install it from the official RabbitMQ website, and make sure that the RabbitMQ service is running.
First, create an empty folder, create a virtual env named venv and activate it:
python -m venv venv
venv\Scripts\activate
Note that on Mac/Linux you activate the venv with source venv/bin/activate instead.
Next, install Django & Celery:
pip install django celery
After that, create a new Django project:
django-admin startproject main .
In the main directory of the main project, create a file named celery.py and add the following code (note that the from __future__ import must be the first statement in the file):
main/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings')
app = Celery('main')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
The previous code sets up a Celery instance with the configuration defined in the Django settings file and enables Celery to discover and execute tasks defined in different Django apps:
Imports the future tools that allow the use of unicode literals in the code.
Imports the os module, which provides operating-system-dependent functionality such as reading environment variables.
Imports the Celery class from the celery module.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings') sets the DJANGO_SETTINGS_MODULE environment variable to main.settings, which is the location of the Django settings file.
Creates an instance of the Celery object named main.
Loads the Celery configuration from the Django settings file, reading only the settings prefixed with CELERY_ (the namespace argument). This keeps Celery settings separate from other Django settings.
Discovers and imports all of the tasks defined in the tasks.py files within each installed Django app, enabling the Celery app to find and execute tasks defined in different apps.
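One related step worth adding, which Celery's Django integration guide recommends: import this app in main/__init__.py so that it is loaded whenever Django starts, ensuring that @shared_task binds to it:

```python
# main/__init__.py
from __future__ import absolute_import, unicode_literals

# Load the Celery app on Django startup so shared_task uses it.
from .celery import app as celery_app

__all__ = ('celery_app',)
```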
Now let's create a new Django app inside the main project:
python manage.py startapp app1
Before we forget, let's register this app into the main Django project:
main/settings.py
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'app1',
]
Inside the app directory, let's create a Python file tasks.py, which contains the tasks:
app1/tasks.py
from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
Imports the future tools that allow the use of unicode literals in the code.
Imports the shared_task decorator from the celery module, which is used to create a task that can be executed asynchronously using Celery.
Creates a simple task function that adds 2 values. We also use the @shared_task decorator to wrap this function and turn it into an async Celery task.
Finally, let's run the Celery worker:
Windows: celery -A main worker -l info --pool=solo
Mac/Linux: celery -A main worker --loglevel=info
You will see the worker start up, connect to RabbitMQ, and list app1.tasks.add among the registered tasks.
Now let's trigger the task from the Django shell. Open a new terminal, start the shell with python manage.py shell, and run the following code:
from app1.tasks import add
add.delay(4, 4)
add.apply_async((3, 3), countdown=20)
We import the add function, which will send the task to Celery via RabbitMQ.
We use the delay function to add 2 values.
We use apply_async with the same task, but wait 20 seconds before the message is delivered to Celery.
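Note that delay() and apply_async() return immediately with an AsyncResult; with only a broker configured, the tasks' return values are discarded. If you want to read results back with result.get(), you also need a result backend. One minimal option (an assumption for this sketch, not required by the tutorial) is the 'rpc://' backend, which sends results back over RabbitMQ:

```python
# main/settings.py (addition)
# Picked up by config_from_object(..., namespace='CELERY') in main/celery.py.
CELERY_RESULT_BACKEND = 'rpc://'
```

With this in place, add.delay(4, 4).get(timeout=10) in the Django shell returns 8 once the worker has processed the task.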
If we take a look at the RabbitMQ management interface, we'll see that it has received the 2 messages. Here is our result: the worker terminal logs both tasks succeeding, with the second one running about 20 seconds later.
You can find the code used here.
Conclusion
In summary, Celery is a powerful and flexible task queuing system for Python that can help you manage and distribute tasks across multiple workers or servers. By using Celery, you can improve the performance and scalability of your applications, handle errors and retries, and monitor and manage your tasks with ease.