A Guide to Setting Up Centralized Logging in Django


Sripathi Krishnan

29 May 2017


Logging is one of the most crucial parts of any application. Proper logging can save a developer and support team countless hours and headaches while debugging. And it is not only developers who benefit: system admins and security teams also have a lot to gain from logging. A well-designed logging system is an important asset for any project, and Django’s built-in logging support makes it easy to set one up for your application.


Django applications can easily distribute logs to multiple destinations. In an enterprise setup, your logs are likely spread across multiple systems, and you would need to log in to each of them to pull the logs.


This is where centralized logging shines. The main benefits are:

  1. All log data is stored in a single place; there is no need to pull data from multiple servers.
  2. Log data can be searched, exported, and archived without affecting the original log files.
  3. Log data can be parsed, which helps when your logs arrive in multiple formats such as JSON, XML, and plain text.

How to set it up?

There are many services that can collect your logs. In this post, we will focus on:

  1. Logentries
  2. ELK Stack (Elasticsearch, Logstash, and Kibana)

Using Logentries

  1. Create a Logentries account.
  2. Choose Python under Libraries.
  3. Create a log set.
  4. Install the Python logging package for Logentries: pip install logentries
  5. Generate a log token.

In your settings.py file (preferably your production settings), add the following:

LOGENTRIES_TOKEN = 'your-logentries-token'

LOGGING = {
    # ...
    'handlers': {
        # ...
        'logentries': {
            'level': 'INFO',
            'token': LOGENTRIES_TOKEN,
            'class': 'logentries.LogentriesHandler',
        },
    },
    'loggers': {
        # ...
        'app_name': {
            # 'file' is a file handler defined elsewhere that accepts
            # only ERROR and higher; 'logentries' ships INFO and higher
            'handlers': ['file', 'logentries'],
            'level': 'DEBUG',
        },
    },
}
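With the configuration in place, application code emits logs through the standard logging module. A minimal sketch, where `app_name` matches the logger configured above and `create_order`/`order_id` are hypothetical names used only for illustration:

```python
import logging

# Fetch the logger configured under 'app_name' in LOGGING. Records at
# INFO and above are forwarded to every handler attached to it,
# including the Logentries handler, with no extra code.
logger = logging.getLogger('app_name')

def create_order(order_id):
    # Hypothetical business function, for illustration only.
    logger.info("Creating order %s", order_id)
```

Because the handler is wired up in settings, the application code stays agnostic of where the logs end up.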


Using ELK Stack

The ELK Stack comprises three products: Elasticsearch, Logstash, and Kibana. All three are open-source products by Elastic.


Elasticsearch is a distributed RESTful search engine built for the cloud and based on Lucene.


Logstash is an open-source, server-side data processing pipeline that receives data from a multitude of sources simultaneously, transforms it, and then sends it to your target stash.


Kibana is a browser-based analytics and search dashboard for Elasticsearch and is your window into the Elastic Stack.


Apart from collecting the log data, you can also generate graphical dashboards and reports using Kibana.
We will use an Amazon EC2 instance running Ubuntu 16.04 to collect logs from a local Django app. The ELK stack will run inside Docker.



Setup EC2 Machine with ELK

  1. Launch an Amazon EC2 instance with Ubuntu 16.04 (with at least 3 GB of RAM)
  2. SSH into the machine
    sudo ssh -i ~/.ssh/ ubuntu@
  3. Add GPG key for official Docker repository
    sudo apt-key adv --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys 58118E89F3A912897C070ADBF76221572C52609D
  4. Add Docker repository to APT sources
    sudo apt-add-repository 'deb https://apt.dockerproject.org/repo ubuntu-xenial main'
  5. Update the package database with the Docker packages
    sudo apt-get update
  6. Check that installation source is the Docker repo instead of the default Ubuntu 16.04 repo
    apt-cache policy docker-engine

    The output should be similar to

    Installed: (none)
    Candidate: 17.05.0~ce-0~ubuntu-xenial
    Version table:
       17.05.0~ce-0~ubuntu-xenial 500
          500 https://apt.dockerproject.org/repo ubuntu-xenial/main amd64 Packages
       17.04.0~ce-0~ubuntu-xenial 500
          500 https://apt.dockerproject.org/repo ubuntu-xenial/main amd64 Packages
       17.03.1~ce-0~ubuntu-xenial 500
          500 https://apt.dockerproject.org/repo ubuntu-xenial/main amd64 Packages
  7. Notice that docker-engine is not installed, but the candidate for installation is from the Docker repository for Ubuntu 16.04. The docker-engine version number might be different.
  8. Install Docker
    sudo apt-get install -y docker-engine
  9. Docker should now be installed, the daemon started, and the process enabled to start on boot. Check that it’s running
    sudo systemctl status docker

    The output should be similar to

    ● docker.service - Docker Application Container Engine
       Loaded: loaded (/lib/systemd/system/docker.service; enabled; vendor preset: enabled)
       Active: active (running) since Mon 2017-05-29 00:20:54 UTC; 32s ago
       Docs: https://docs.docker.com
    Main PID: 3223 (dockerd)
       CGroup: /system.slice/docker.service
             ├─3223 /usr/bin/dockerd -H fd://
             └─3228 docker-containerd -l unix:///var/run/docker/libcontainerd/docker-containerd.sock --metrics-interval=0 --start-timeout 2m --state-dir /var/run/docker/li
  10. Docker requires root privileges by default. To avoid prefixing every docker command with sudo, add your user to the docker group
    sudo usermod -aG docker $(whoami)
  11. Log out from the machine and SSH in again
  12. Install Docker compose
    sudo curl -o /usr/local/bin/docker-compose -L "https://github.com/docker/compose/releases/download/1.11.2/docker-compose-$(uname -s)-$(uname -m)"
  13. Next, we’ll set the permissions:
    sudo chmod +x /usr/local/bin/docker-compose
  14. Verify if the installation was successful by checking the version
    docker-compose -v

    Output should be

    docker-compose version 1.11.2, build dfed245
  15. Instead of installing Elasticsearch, Logstash, and Kibana separately, we’ll use the docker-elk GitHub repo.
  16. Set up the Docker containers for ELK
    git clone https://github.com/deviantony/docker-elk
  17. Launch the stack
    cd docker-elk
    docker-compose up -d
  18. After some time, check the status of your containers
    docker ps

    The output should be similar to

    CONTAINER ID        IMAGE                     COMMAND                  CREATED             STATUS              PORTS                                            NAMES
    8575dcd40d91        dockerelk_kibana          "/bin/sh -c /usr/l..."   10 hours ago        Up 10 hours         0.0.0.0:5601->5601/tcp                           dockerelk_kibana_1
    ef994edf6107        dockerelk_logstash        "/usr/local/bin/do..."   10 hours ago        Up 10 hours         5044/tcp, 0.0.0.0:5000->5000/tcp, 9600/tcp       dockerelk_logstash_1
    22032232767e        dockerelk_elasticsearch   "/bin/bash bin/es-..."   10 hours ago        Up 10 hours         0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp   dockerelk_elasticsearch_1
  19. Now test whether Kibana is running by visiting <ec2-ip>:5601 in your browser

NOTE: If Elasticsearch fails with an out-of-memory error, you need to increase vm.max_map_count. Apply it immediately with
    sudo sysctl -w vm.max_map_count=262144
and make it persistent across reboots by adding vm.max_map_count = 262144 to /etc/sysctl.conf.

Using ELK in Django

  1. Install the Logstash handler with pip install python-logstash, then in your settings.py file, add
    LOGGING = {
        # ...
        'handlers': {
            # ...
            'logstash': {
                'level': 'INFO',
                'class': 'logstash.TCPLogstashHandler',
                'host': '',  # IP/hostname of the EC2 instance
                'port': 5000,
                'version': 1,
                'message_type': 'logstash',
                'fqdn': True,
                'tags': ['meetnotes'],
            },
        },
        'loggers': {
            # ...
            'app_name': {
                # 'file' is a file handler defined elsewhere
                'handlers': ['file', 'logstash'],
                'level': 'DEBUG',
            },
        },
    }
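python-logstash serializes each record to JSON before shipping it, and any keys passed through the standard `extra` argument of the logging call become structured fields on the event, which makes them searchable in Kibana. A small sketch; the field names here are examples, not anything the handler requires:

```python
import logging

logger = logging.getLogger('app_name')

# Keys in 'extra' are attached to the log record, and python-logstash
# includes them as fields in the JSON event sent to Logstash.
logger.info('user signed in', extra={'user_id': 7, 'plan': 'free'})
```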


Things to Note

Centralized logging is easy to set up, and it will certainly pay off when you have issues to debug. So don’t wait for the moment when you catch yourself thinking that life would have been easier if you had logs.
