As I mentioned in my earlier post, I was working on writing a blog about linking docker containers running Django and PostgreSQL.
Docker container linking is not a new thing. It has been used widely by a lot of people out there. However, when I wanted to link a Django and a PostgreSQL container, I could only find posts that explained the linking using some existing Django code.
What I was actually looking for was the modifications to be made to settings.py, the way to correctly run a postgres container, and how to link the two correctly. Since I couldn't find such a post, I wrote one myself in the hope that it will be useful for someone who is in the same situation as I was.
In case you don't want to walk through the code and would rather directly test Docker container linking with Django and PostgreSQL by pulling images from Docker Hub, click here.
This post doesn't have any boilerplate code and walks you through the steps right from creating a Django project to creating the containers and finally linking them.
First we will create a Django project and see if things work well on the local system. Follow the steps below to create a Django project and an app:
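A minimal sketch of those steps, assuming the project is named docker_links and the app api (the names used throughout this post):

```shell
# Install Django and the PostgreSQL driver
pip install django psycopg2

# Create a project named docker_links and an app named api inside it
django-admin startproject docker_links
cd docker_links
python manage.py startapp api
```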
Modify docker_links/settings.py to have the below information. Make sure that the DATABASES dictionary has settings that reflect the configuration on your system!
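As a sketch, a PostgreSQL-backed DATABASES section might look like this (the user and password are placeholders; adjust them to your local setup):

```python
# docker_links/settings.py -- database configuration (credentials are placeholders)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'docker_links',   # database created later in this post
        'USER': 'postgres',       # placeholder user
        'PASSWORD': 'postgres',   # placeholder password
        'HOST': 'localhost',      # inside a linked container this becomes the link alias, e.g. 'db'
        'PORT': '5432',
    }
}
```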
To ensure that requests to api are redirected to the api app, modify the project's URL configuration (urls.py) accordingly. Have api/models.py contain very basic code like below:
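For illustration, a very basic model could look like this (the model and field names here are my own, not necessarily the original ones):

```python
# api/models.py -- a minimal model for testing the API
from django.db import models


class Message(models.Model):
    # A single text field is enough to exercise GET requests
    text = models.CharField(max_length=200)
```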
And apply migrations:
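Assuming an app named api, the migrations are created and applied with the usual commands:

```shell
python manage.py makemigrations api
python manage.py migrate
```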
Next, have the below code in place:
That is it. You can now test
GET requests by running the Django dev server:
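For example (the /api/ URL is an assumption based on the routing described above):

```shell
python manage.py runserver
# then send GET requests to http://localhost:8000/api/
```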
I prefer using the Postman REST client to test REST APIs.
If everything works well, make sure to have a requirements.txt file created using the below command:
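That command is typically pip freeze:

```shell
pip freeze > requirements.txt
```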
requirements.txt should be in the same directory as the Django code (inside docker_links), since the Dockerfile below copies that directory into the image and installs the dependencies from it.
Now that we have everything working on the host, which has Django and PostgreSQL configured on it, we will move towards dockerizing the setup and linking the containers running Django and PostgreSQL.
Create a Dockerfile like below:
```
FROM ubuntu:latest
RUN apt-get update && apt-get -y upgrade
RUN apt-get -y install python-pip
RUN apt-get -y install python-dev postgresql-server-dev-all
RUN mkdir /code
WORKDIR /code
ADD docker_links /code
RUN pip install -r requirements.txt
```
and build an image out of it:
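A sketch of the build command; the image tag docker_links_django is my own choice, so pick any name and use it consistently in the later commands:

```shell
# Run from the directory containing the Dockerfile
docker build -t docker_links_django .
```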
Alternatively, you can also download the image created using the above Dockerfile from Docker Hub.
Going through the Dockerfile line by line: the python-pip package is required to install the project dependencies from the earlier created requirements.txt. The python-dev and postgresql-server-dev-all packages are required to make sure that psycopg2 has its build dependencies preinstalled. Then we copy the code from the host system to the container and install the dependencies from requirements.txt.
Start the PostgreSQL container:
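For example, using the official postgres image; the container name db is my own choice and is reused in the later commands:

```shell
docker run -d --name db postgres
```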
Create a database called
docker_links in the postgres container:
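One way to do this, assuming the postgres client tools are installed on the host and the container is named db (the -h switch points at the container's IP address):

```shell
# Find the IP address of the postgres container
DB_IP=$(docker inspect -f '{{.NetworkSettings.IPAddress}}' db)
# Create the database that the Django settings expect
createdb -h "$DB_IP" -U postgres docker_links
```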
The above command should work without specifying the host with the -h switch; I still need to figure out why it doesn't.
If you're going to pull the image from Docker Hub,
Create a throwaway container to apply migrations from our Django app:
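A sketch of such a container, assuming a postgres container named db and an image tagged docker_links_django (both names are my own choices). The --link db:db flag makes the postgres container reachable under the hostname db, so the HOST entry in settings.py should be set to db when running inside containers:

```shell
docker run --rm --link db:db docker_links_django python manage.py migrate
```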
And finally create a container that would serve our REST APIs:
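Again assuming the container name db and image tag docker_links_django; -p 8000:8000 exposes the dev server on the host:

```shell
docker run -d --link db:db -p 8000:8000 docker_links_django \
    python manage.py runserver 0.0.0.0:8000
```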
You can now test the API on localhost using the Postman REST client.
For any suggestions on improving the above workflow, please ping me on Twitter.