
Handling Asynchronous Tasks in Django with Celery and Redis

Author: Aslany Rahim | Published: November 27, 2025
Don't let your users wait. Learn how to offload time-consuming tasks like email sending and image processing to background workers using Celery and Redis.

Speed is a feature. In a standard Django request-response cycle, the user has to wait for the server to finish everything before the page loads. If your view sends a welcome email, generates a PDF report, or resizes an uploaded image, the user stares at a spinner the whole time.

If that process takes 10 seconds, the user leaves.

To solve this, we use an asynchronous task queue. The web server accepts the request, hands the heavy job to a background worker, and responds immediately. The de facto standard stack for this in Python is Celery with Redis as the message broker.

The Architecture

  1. Django (Producer): Adds a task (e.g., "Send Email") to the Queue.
  2. Redis (Broker): A fast in-memory store that holds the list of tasks to be done.
  3. Celery (Consumer): A separate worker process that watches Redis, picks up tasks, and executes them.

Step 1: Installation

You need a Redis server running (use Docker for local dev) and the Python packages.

pip install celery redis
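If you don't have Redis installed locally, a minimal docker-compose.yml like the one below (the service name, image tag, and port mapping are illustrative choices, not requirements) gets a broker running with `docker compose up -d`:

```yaml
services:
  redis:
    image: redis:7-alpine   # small official Redis image
    ports:
      - "6379:6379"         # expose the default Redis port to the host
```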

Step 2: Configuring Celery

Create a file named celery.py inside your main project folder (next to settings.py).

import os
from celery import Celery

# Set the default Django settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')

# Load task modules from all registered Django app configs.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks in all installed apps
app.autodiscover_tasks()

In the __init__.py in the same folder, import the app so Celery is loaded whenever Django starts:

from .celery import app as celery_app

__all__ = ('celery_app',)

Then, add the Redis URL to your settings.py:

CELERY_BROKER_URL = 'redis://localhost:6379/0'
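A few more CELERY_-prefixed settings are commonly configured alongside the broker URL; the prefix works because of the namespace='CELERY' argument passed to config_from_object in celery.py. The values below are illustrative defaults, not requirements:

```python
# settings.py (optional extras; values are examples)
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'  # store task results in Redis too
CELERY_TASK_SERIALIZER = 'json'                     # serialize task arguments as JSON
CELERY_ACCEPT_CONTENT = ['json']                    # reject other content types
CELERY_TIMEZONE = 'UTC'                             # keep scheduling timezone explicit
```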

Step 3: Creating a Task

In any of your Django apps, create a tasks.py file.

from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_welcome_email_task(user_email):
    send_mail(
        'Welcome!',
        'Thanks for signing up.',
        '[email protected]',
        [user_email],
        fail_silently=False,
    )
    return "Email Sent"

Step 4: Calling the Task

This is the magic part. Instead of calling the function normally, you call .delay().

# views.py
from django.http import HttpResponse

from .tasks import send_welcome_email_task

def register(request):
    # ... registration logic that creates `user` ...

    # .delay() returns immediately; the user does not wait for the email.
    # Note: pass the email string, not the User object -- task arguments
    # must be serializable.
    send_welcome_email_task.delay(user.email)

    return HttpResponse("Registration successful!")

Running the Worker

For this to work in development, you need two terminal windows open (with Redis already running).

Terminal 1 (Django):

python manage.py runserver

Terminal 2 (Celery Worker):

celery -A myproject worker --loglevel=info

Conclusion

By moving the heavy lifting to the background, you keep your application snappy and responsive. Celery also handles retries, periodic scheduling (cron-style jobs via Celery Beat), and distributing work across multiple machines, making it an essential tool for scaling Django.
