Asynchronous tasks in Django with Django Q

Learn how to use Django Q, the task queue, with the Redis broker to offload long-running tasks in your Django applications.

To follow along you’ll need:

  • a Heroku account if you want to use their Redis add-on
  • the Heroku CLI installed on your system
  • a newer version of Python, ideally 3.6 or 3.7
  • Git

Deployment on Heroku is optional, and you can use your own Redis instance if you’ve already got one locally.

Setting up the project

And now let’s get to work! To start off we’re going to create a new Python virtual environment along with a Django installation:

mkdir django-q-django && cd $_
python3 -m venv venv
source venv/bin/activate
pip install django

Next up we’re going to create a new Django project from a template:

django-admin startproject \
    --template \
    --name=Procfile \
    --extension=py,example django_q_django .

If you’re wondering what I’m doing here, this is a template of mine. I’ve got a link in the resources with a tutorial for creating your own Django project template.

Now let’s install the dependencies with pip:

pip install -r ./requirements/dev.txt

We also need to provide some environment variables for our project:

mv .env.example .env

and finally we’re going to run Django migrations:

python manage.py makemigrations
python manage.py migrate

at this point you should be able to run the development server:

python manage.py runserver

Now, before moving to Django Q, let’s see what problem it is meant to solve.

Asynchronous tasks in Django with Django Q: the problem with synchronous code

The main issue for Python and Django is that they’re synchronous. It’s not a bad thing per se, and there are a lot of ways to circumvent it.

Python, on which Django is built, is single threaded by nature. Single threaded means that the language interpreter can only run your code in sequence.

The practical implication is that any view in a Django application can get stuck if one or more operations take too long to complete.

To demonstrate the concept let’s create a new Django application inside our project:

django-admin startapp demo_app

In this app we’re going to define a view which returns a simple JSON response:

# demo_app/views.py

from django.http import JsonResponse

def index(request):
    json_payload = {"message": "Hello world!"}
    return JsonResponse(json_payload)

And let’s also create the corresponding url:

# demo_app/urls.py

from django.urls import path
from .views import index

urlpatterns = [
    path("demo-app/", index),
]

Don’t forget to wire up the url for the new app:

# django_q_django/urls.py

from django.contrib import admin
from django.urls import path, include
from .settings.base import ADMIN_URL

urlpatterns = [
    path(ADMIN_URL, admin.site.urls),
    # the new url
    path("", include("demo_app.urls")),
]

And finally activate the app:

# django_q_django/settings/base.py

INSTALLED_APPS = [
    # omitted for brevity
    "demo_app",
]

Now to simulate a blocking event in the view we’re going to use sleep from the time module, part of the Python standard library:

from django.http import JsonResponse
from time import sleep

def index(request):
    json_payload = {"message": "Hello world!"}
    sleep(10)
    return JsonResponse(json_payload)

Run the development server, head over to http://127.0.0.1:8000/demo-app/, and you’ll see the view hang for 10 seconds before returning a response to the user.

Now, this is a delay created on purpose, but in a real application the block could happen for a number of reasons:

  • I/O operations taking too long
  • network delay
  • interactions with the file system

Even if it’s a contrived example you can see why it’s crucial to offload long running tasks in a web application.

Django Q was born with this goal in mind. In the next sections we’ll finally put our hands on it.

Wait, how about asynchronous Django?

We’re left with a Django project, a Django application, and a view that remains stuck for 10 seconds!

It’s not that Django and Python don’t scale. There are a number of ways to get around single threading.

For Python there is asyncio. Django, instead, moved to async only recently: the implementation is in its infancy, and there is still no support for async views.

Things are going to change in the future. I suggest keeping an eye on Andrew Godwin, because he’s the lead person for the async story in Django.

But the need for third party queues won’t go away anytime soon, even once Django goes 100% async. It’s still a nice skill to have.

Preparing the Heroku app and the Redis instance

In this section we’ll prepare the Heroku project. I’m using Heroku here because you may want to deploy to production later, and also because they offer the Redis add-on for free.

If you’re new to Redis: it’s an in-memory database that can be used as a cache and as a message broker.

A message broker is more or less like a post office box: it takes messages, holds them in a queue, and folks from around the city can retrieve these messages later.
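
The post office analogy can be sketched with a plain Python deque standing in for the broker (a toy model — Redis adds persistence, networking, and much more on top):

```python
# A toy "post office box": producers drop messages in, a consumer
# picks them up later, in order. Redis plays this role for Django Q.
from collections import deque

box = deque()

# producers (e.g. web views) enqueue messages and move on
box.append("task-1")
box.append("task-2")

# a consumer (e.g. a worker) drains the box whenever it's ready
processed = []
while box:
    processed.append(box.popleft())

print(processed)  # ['task-1', 'task-2'] — first in, first out
```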

If you’re interested in how Django Q uses brokers check out this page.

Still in the project folder initialize a Git repo:

git init

Then create a new Heroku app. I’m going to add two add-ons:

  • heroku-postgresql, which is more robust than the default SQLite for production
  • heroku-redis, which will give us the Redis instance

If you haven’t got the Heroku CLI and a Heroku account, go create the account, install the CLI, and come back later.

Otherwise follow along with me and create the app:

heroku create --addons=heroku-postgresql,heroku-redis

Once done give Heroku a couple of minutes and then run:

heroku config:get REDIS_URL

This command will reveal REDIS_URL, an environment variable with the credentials for the Redis instance.

Take note of it and head over to the next section!

Asynchronous tasks in Django with Django Q: installing and running Django Q

Let’s install Django Q and the Redis client library (the client is needed by the Redis broker for Django Q):

pip install django-q redis

Once done activate Django Q in the list of installed apps:

INSTALLED_APPS = [
    # omitted for brevity
    # add Django Q
    "django_q",
]

Now reveal the Redis Heroku credentials:

heroku config:get REDIS_URL

You should see a string of the form redis://h:password@host:port. Before the @ you’ll find the password, after the @ there’s the host, and at the end the port (9059 in my case). Note that the credentials will be different for you, don’t use mine!

(Needless to say, by the time you read this article these credentials will be gone.)

Now configure Django Q in django_q_django/settings/base.py. Fill host, port, and password with your credentials:

Q_CLUSTER = {
    'name': 'django_q_django',
    'workers': 8,
    'recycle': 500,
    'timeout': 60,
    'compress': True,
    'save_limit': 250,
    'queue_limit': 500,
    'cpu_affinity': 1,
    'label': 'Django Q',
    'redis': {
        'host': '',
        'port': 9059,
        'password': 'p948710311f252a334c3b21cabe0bd63f943f68f0824cd41932781e7793c785bf',
        'db': 0,
    }
}

You might wonder why I’m not using REDIS_URL as it is. The reason is that Django Q wants credentials in a dictionary.
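
If you'd rather keep REDIS_URL as a single environment variable, you could split it into the pieces that dictionary wants with the standard library. A sketch (the URL below is a made-up example, not real credentials):

```python
# Split a redis:// URL into the host/port/password dictionary
# that Django Q's 'redis' setting expects.
from urllib.parse import urlparse

def redis_url_to_dict(url):
    parsed = urlparse(url)
    return {
        "host": parsed.hostname,
        "port": parsed.port,
        "password": parsed.password,
        "db": 0,
    }

conf = redis_url_to_dict("redis://h:secret@example.compute.amazonaws.com:9059")
print(conf["port"])  # 9059
```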

This was a limitation of Django Q: I opened a pull request which got merged, and now you can use a Redis URL directly:

Q_CLUSTER = {
    'name': 'django_q_django',
    # omitted for brevity
    'label': 'Django Q',
    'redis': 'redis://h:password@host:port'
}

(When running the project in production you may want to switch to environment variables. See the base configuration to learn how to use env.)

Once you’re done run the migrations (Django Q needs to create its tables in the database):

python manage.py migrate

At this point you’re ready to run the Django Q cluster with:

python manage.py qcluster

If everything goes well you should see the cluster start and connect to Redis.

Well done! In the next section we’ll create our first asynchronous task.

What is a Django Q cluster? Check this out.

Asynchronous tasks in Django with Django Q: async_task

Worth doing a quick recap of what we covered so far:

  • we created a Django project
  • we created a Django application
  • we installed Django Q and the Redis client
  • we created a Heroku project and a Redis instance
  • finally we configured Django Q

To test that Django Q could connect to Redis we launched:

python manage.py qcluster

With the project in place let’s finally see an example of Django Q in action. Remember your view?

# demo_app/views.py

from django.http import JsonResponse
from time import sleep

def index(request):
    json_payload = {"message": "Hello world!"}
    sleep(10)
    return JsonResponse(json_payload)

Remove the time import and create a new file in demo_app/ called services.py (the name of this file is totally up to you).

In this new module we’re going to define a function, sleep_and_print:

# demo_app/services.py

from time import sleep

def sleep_and_print(secs):
    sleep(secs)
    print("Task ran!")

In the view instead we’ll borrow async_task from Django Q:

from django.http import JsonResponse
from django_q.tasks import async_task

def index(request):
    json_payload = {"message": "hello world!"}
    return JsonResponse(json_payload)

async_task is the principal function you’ll use with Django Q. It takes at least one argument: the dotted path of the function you want to enqueue:

# example

async_task("demo_app.services.sleep_and_print")
The second group of arguments is whatever arguments the function itself takes. sleep_and_print in our example takes one argument, the seconds to wait before printing. For async_task that means:

# example

async_task("demo_app.services.sleep_and_print", 10)
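
The dotted-path string works because a worker can import the module and look up the function at run time. Here is a simplified standard-library sketch of that resolution step (not Django Q's actual code; os.path.join is just a stand-in function):

```python
# Turn a dotted path like "package.module.function" back into a callable,
# roughly what a task worker does when it picks up a queued task.
from importlib import import_module

def resolve(dotted_path):
    module_path, _, func_name = dotted_path.rpartition(".")
    module = import_module(module_path)
    return getattr(module, func_name)

join = resolve("os.path.join")
print(join("a", "b"))  # "a/b" on POSIX systems
```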

That’s enough to enqueue a task. Let’s now mix our view with async_task.

Asynchronous tasks in Django with Django Q: enqueue your first task

Back to our view: with async_task imported, call it right before the return statement:

from django.http import JsonResponse
from django_q.tasks import async_task

def index(request):
    json_payload = {"message": "hello world!"}
    # enqueue the task
    async_task("demo_app.services.sleep_and_print", 10)
    return JsonResponse(json_payload)

Now run the cluster:

python manage.py qcluster

Run the Django server:

python manage.py runserver

And finally make a call to your view, either from the browser at http://127.0.0.1:8000/demo-app/ or from the terminal:

curl http://127.0.0.1:8000/demo-app/
Now you should notice a couple of things. The Django dev server should log:

13:55:42 [Q] INFO Enqueued 1

The Django Q cluster should log something along these lines:

13:55:42 [Q] INFO Process-1:1 processing [juliet-mountain-august-alaska]

And after that you should see:

Task ran!

Here’s my terminal:


What happened here is that:

  1. the Django view responded immediately to the request
  2. Django Q saved the task (just a reference) in Redis
  3. Django Q ran the task

With this “architecture” the view does not remain stuck anymore. Brilliant.
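
The three steps above can be modeled with nothing but the standard library: a thread playing the role of the cluster, a Queue playing the role of Redis. This is a toy model of the pattern, not how Django Q is implemented:

```python
# Toy model of the enqueue/process flow: the "view" responds immediately,
# a background "worker" thread runs the task later.
import queue
import threading

broker = queue.Queue()  # stands in for Redis
done = []

def worker():
    # the "cluster": pull tasks off the broker and run them
    while True:
        func, args = broker.get()
        if func is None:  # sentinel: shut down
            break
        done.append(func(*args))

t = threading.Thread(target=worker)
t.start()

def view():
    # the "view": enqueue a task and return right away
    broker.put((sum, ([1, 2, 3],)))
    return {"message": "hello world!"}

response = view()         # returns immediately, task still pending
broker.put((None, None))  # tell the worker to stop
t.join()
print(response, done)     # {'message': 'hello world!'} [6]
```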

Think about the use cases for this pattern. You can:

  • safely interact with I/O
  • crunch data in the background
  • safely move out API calls from your views

and much more.


Asynchronous tasks in Django with Django Q: what’s next?

In addition to async_task, Django Q has the ability to schedule a task. A practical use case is “do X every N days”, much like a cron job. Check the documentation to learn more.
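
As a taste, a daily schedule for our sleep_and_print function might look like this. This is a sketch based on the documented schedule API; it needs a configured Django project to run, and the demo_app.services path is an assumption — use the dotted path to wherever you defined sleep_and_print:

```python
from django_q.tasks import schedule
from django_q.models import Schedule

# run sleep_and_print(5) once a day
schedule(
    "demo_app.services.sleep_and_print",
    5,
    schedule_type=Schedule.DAILY,
)
```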

Django Q supports other brokers in addition to Redis. Again, the docs are your friend.

Another neat feature of Django Q is the admin integration. Create a superuser for your Django project, log in to the admin, and you’ll find all your tasks and schedules there.

If you don’t need other brokers than Redis, django-rq might be a lightweight alternative to Django Q.

Asynchronous tasks in Django with Django Q: why not Celery?

Fun fact: Celery was created by a friend of mine. We were in high school together. Despite that, I don’t have much experience with Celery itself, but I’ve always heard a lot of people complaining about it.

Check this out for a better perspective.

BONUS. Asynchronous tasks in Django with Django Q: deploy to Heroku

Note: these instructions assume you’re using the same Django project template from the beginning.

Note: keep in mind that free Heroku apps sleep after inactivity. Your cluster might stop running.

Before deploying to Heroku make sure to add a worker in the Heroku Procfile for starting the cluster:

release: python manage.py migrate
worker: python manage.py qcluster
web: gunicorn -w 3 django_q_django.wsgi --log-file -

Next up, update the requirements in requirements/base.txt to add django-q and redis:

echo -e '\n' >> requirements/base.txt
pip freeze | grep -E '^django-q|^redis' >> requirements/base.txt

Add the environment variables for Redis to your .env file, to keep things cleaner in production (these are the same credentials you got from Heroku):

REDIS_HOST=your-redis-host
REDIS_PORT=9059
REDIS_PASSWORD=your-redis-password
Now configure Django Q in django_q_django/settings/base.py to use the environment variables:

Q_CLUSTER = {
    "name": "django_q_django",
    "workers": 8,
    "recycle": 500,
    "timeout": 60,
    "compress": True,
    "save_limit": 250,
    "queue_limit": 500,
    "cpu_affinity": 1,
    "label": "Django Q",
    "redis": {
        "host": env.str("REDIS_HOST", "default host"),
        "port": env.int("REDIS_PORT", 6379),
        "password": env.str("REDIS_PASSWORD", "default password"),
        "db": 0,
    }
}
To set the new environment variables on Heroku run:

heroku config:set REDIS_HOST=your-redis-host REDIS_PORT=9059 REDIS_PASSWORD=your-redis-password
Wait for the result. If everything goes well commit all the things and push to Heroku:

git add .
git commit -m "Deploying Django to Heroku"
git push heroku master

Wait for the deployment. Then enable the worker on the app with:

heroku ps:scale worker=1

This will enable the worker capability and the Django Q cluster will start. To confirm that everything works run:

heroku logs -t

This opens the logs for your app. Now in another terminal make a call to your deployed view (replace the domain with your app’s Heroku URL):

curl https://your-app-name.herokuapp.com/demo-app/
In the logs you should see the incoming request and the task being enqueued and processed.

It’s a sign that the view is responding fine and the asynchronous task ran successfully.

Congrats on finishing this tutorial!

Thanks for reading and stay tuned!

