Django with Docker and Celery
How to containerize a Django application with Docker and Celery
This post is basically a mixture of these two posts from testdriven.io: Dockerizing Django with Postgres, Gunicorn, and Nginx and Handling Periodic Tasks in Django with Celery and Docker. You should check them out if you want more details on what is happening.
In a previous post we saw how to use Docker to containerize a small Django application. In this post we will add Celery into the mix. The code files for this post can be found at demo_django_docker_celery.
In order to improve user experience, long-running processes should be run outside the normal HTTP request/response flow in a background process. Thus, the goal of this post is to develop a Django application that works in conjunction with Celery to handle long-running processes outside the normal request/response cycle. In particular, we will develop an application with the following cycle:
- The user starts a new task via a POST request to the server
- The server creates a new task, adds it to the queue, and returns the task's id back to the client.
- The client continues to poll the server to check the status of the task while the task itself is running in the background.
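Stripped of Django and Celery, the cycle above boils down to a simple submit/poll pattern. The following is only a minimal, self-contained sketch of that pattern, in which a thread stands in for the Celery worker and a plain dict stands in for the Redis result backend:

```python
import threading
import time
import uuid

# In the real application the results live in Redis and the work runs in a
# Celery worker; here a dict and a thread stand in for them.
TASKS = {}

def long_running_task(task_id):
    TASKS[task_id] = "STARTED"
    time.sleep(0.1)  # pretend this is expensive work
    TASKS[task_id] = "SUCCESS"

def submit_task():
    """Server side of the POST request: enqueue the work, return the task id."""
    task_id = str(uuid.uuid4())
    TASKS[task_id] = "PENDING"
    threading.Thread(target=long_running_task, args=(task_id,)).start()
    return task_id

def poll(task_id):
    """Server side of the status endpoint the client keeps polling."""
    return TASKS[task_id]

if __name__ == "__main__":
    tid = submit_task()
    while poll(tid) != "SUCCESS":  # the client polls while the task runs
        time.sleep(0.05)
    print(poll(tid))  # SUCCESS
```

The key point is that `submit_task` returns immediately with an id; the caller never blocks on the work itself.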
To get started, let's check the docker/docker-compose versions.
docker --version
docker-compose --version
On my machine, these two return the following information
Docker version 20.10.10, build b485636
docker-compose version 1.25.0, build unknown
We will extend the Django project from the previous post by adding a new application called my_app. The app directory structure now looks like the following
app/
Dockerfile
hello_world_django
manage.py
my_app
requirements.txt
templates
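For reference, the actual code lives in the linked repository; the sketch below only illustrates the typical shape of such an app, and the names in it are assumptions. my_app would contain a tasks.py with the long-running task and a views.py that enqueues it and reports its status:

```python
# my_app/tasks.py -- the long-running job, executed by the Celery worker
import time

from celery import shared_task

@shared_task
def long_running_task(duration):
    time.sleep(duration)  # stand-in for the actual expensive work
    return "done"

# my_app/views.py -- start a task via POST, poll its status via GET
from celery.result import AsyncResult
from django.http import JsonResponse

def run_task(request):
    if request.method == "POST":
        task = long_running_task.delay(10)  # enqueue; returns immediately
        return JsonResponse({"task_id": task.id}, status=202)

def get_status(request, task_id):
    result = AsyncResult(task_id)  # look up the state in the result backend
    return JsonResponse({"task_id": task_id, "status": result.status})
```

The view returns 202 Accepted together with the task id, and the client then hits the status endpoint until `result.status` reports SUCCESS.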
That is, we added two new directories, my_app and templates. Furthermore, we need to update the requirements.txt file to
Django==3.0.7
mysqlclient==2.0.3
redis==3.5.3
celery==4.4.7
amqp==2.6.1
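Note that the celery worker --app=hello_world_django command we will use in the compose file expects the project package to expose a Celery application. A typical bootstrap, sketched here under the assumption of the standard Django project layout, lives in hello_world_django/celery.py and reads the broker and backend URLs from the environment:

```python
# hello_world_django/celery.py
import os

from celery import Celery

# Make sure the Django settings are loaded before the app is configured
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "hello_world_django.settings")

app = Celery("hello_world_django")
# Broker and result backend come from the environment (see .env.dev below)
app.conf.broker_url = os.environ.get("CELERY_BROKER_URL", "redis://redis:6379/0")
app.conf.result_backend = os.environ.get("CELERY_BACKEND_URL", "redis://redis:6379/0")
# Pick up tasks.py modules from all installed apps
app.autodiscover_tasks()
```

Typically hello_world_django/__init__.py also imports this app object, so that it is loaded whenever Django starts.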
The next changes concern the docker-compose.yml file, in order to account for the new containers. This is shown below
version: '3.3'
services:
web:
build: ./app
container_name: my_django_app_django_container
command: python manage.py runserver 0.0.0.0:8000
volumes:
- ./app/:/usr/src/app/
ports:
- 8000:8000
env_file:
- ./.env.dev
depends_on:
- db
db:
image: mysql:5.7
container_name: my_django_app_mysql_container
ports:
- '3306:3306'
environment:
MYSQL_DB: 'django_app_demo'
MYSQL_PASSWORD: 'password'
MYSQL_ROOT_PASSWORD: 'password'
celery:
container_name: my_django_app_celery_container
build: ./app
command: celery worker --app=hello_world_django --loglevel=info
volumes:
- ./app/:/usr/src/app/
env_file:
- ./.env.dev
depends_on:
- web
- redis
redis:
image: redis:6-alpine
container_name: my_django_app_redis_container
Finally, we need to update the .env.dev file
DEBUG=1
SECRET_KEY=foo
DJANGO_ALLOWED_HOSTS=127.0.0.1 0.0.0.0 localhost [::1]
SQL_ENGINE=django.db.backends.mysql
SQL_DATABASE=django_app_demo
SQL_USER=root
SQL_PASSWORD=password
SQL_HOST=db
SQL_PORT=3306
CELERY_BROKER_URL=redis://redis:6379/0
CELERY_BACKEND_URL=redis://redis:6379/0
Notice that SQL_HOST holds the name of the MySQL service. Also, the variables related to Celery are postfixed with _URL. We can now build and run the containers
docker-compose up --build
However, the above will fail as we have not yet created the database. To do so, log into the MySQL container using
sudo docker exec -it my_django_app_mysql_container /bin/bash
and, whilst in the container, connect to the MySQL server with
mysql -u root -p
At the MySQL prompt, create the database django_app_demo and grant access to the user root, for instance with
CREATE DATABASE django_app_demo;
GRANT ALL PRIVILEGES ON django_app_demo.* TO 'root'@'%';
Once this is done, we need to run the Django migrations. To do so, log into the web application container
docker exec -it my_django_app_django_container /bin/bash
and, once inside, apply the migrations with
python manage.py migrate
We are now ready to go. Navigate to http://127.0.0.1:8000/ to view the application. Launch a task and poll the application for the result. A container's logs can be viewed with
docker logs my_django_app_django_container