Scheduling emails with Celery in Django

June 5, 2013
3 mins

After a long journey with Django, you come to a place where you feel the need to get some asynchronous tasks done without any human supervision. Some tasks need to be scheduled to run once at a particular time or after a delay, and some have to run periodically, like cron jobs. One such task is sending emails based on specific triggers.

HackerEarth sends a major chunk of its emails to recruiters and participants after a contest, or when a participant hits the finish-test button. We used to do this with crontab. Things have changed, and scaling such a process is time- and resource-intensive. Also, polling the database from a cron job to check for pending tasks is not a good idea, at least not for tasks that have to run only once in their lifetime.

Django-Celery

Here, Django-Celery comes to the rescue. Celery gets asynchronous tasks done and also supports scheduling of tasks. Integrating Celery with a Django codebase is easy enough; you just need to be patient enough to go through the steps given on the official Celery site. Celery requires a way to send and receive messages, typically provided by a separate service called a message broker. We use the default broker, RabbitMQ, for this. A worker fetches tasks from the queue and runs them asynchronously at the time they were scheduled. To run the worker as a daemon in production, you will have to download the Celery init scripts, which are available on GitHub.
This is the configuration we used to run Celery in our project:
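Roughly, a django-celery setup of that era looks like the sketch below; the broker URL and result backend shown are illustrative defaults, not necessarily the exact values we used:

```python
# settings.py -- minimal django-celery configuration (values are illustrative)
import djcelery
djcelery.setup_loader()

# Assumes INSTALLED_APPS is defined above as a tuple
INSTALLED_APPS += ('djcelery',)

# RabbitMQ as the message broker
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

# Keep task state/results in the Django database via django-celery
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
```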

Another Problem

After linking triggers to send emails once the contest was over or the participant had finished the test prematurely, everything worked properly. We could now easily schedule an asynchronous task for any time. But we found there was no way to check whether a particular task associated with some model instance had already been scheduled. This matters when there is more than one trigger for the same task, which can easily happen in a fairly complicated system. To handle it, we stored the task_id along with the model instance in the database using Django's generic ContentType framework. So here is the hack that we came up with:
Generic ModelTask

This model stores the information of the scheduled task (task_id, name) and of the model instance to which the task is associated.
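A minimal sketch of such a model (the field names here are illustrative):

```python
# models.py -- generic ModelTask linking a Celery task to a model instance
from django.db import models
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes import generic


class ModelTask(models.Model):
    task_id = models.CharField(max_length=255, db_index=True)
    name = models.CharField(max_length=255)

    # Generic relation to the associated model instance
    content_type = models.ForeignKey(ContentType)
    object_id = models.PositiveIntegerField()
    content_object = generic.GenericForeignKey('content_type', 'object_id')
```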

A custom overridden task decorator ‘model_task’

It overrides the methods ‘apply_async’ and ‘AsyncResult’, and attaches a new method, ‘exists_for’.
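One way to build such a decorator is to wrap Celery's own task decorator so that scheduling a task also records a ModelTask row. The sketch below is illustrative, not the exact implementation; it assumes the model instance is passed as the first argument to the task:

```python
# decorators.py -- illustrative sketch of the model_task decorator
from celery import task as celery_task
from celery.result import AsyncResult
from django.contrib.contenttypes.models import ContentType

from .models import ModelTask


def model_task(**options):
    def decorator(func):
        wrapped = celery_task(**options)(func)
        original_apply_async = wrapped.apply_async

        def apply_async(args=None, kwargs=None, **opts):
            # Assumes the model instance is the first positional argument
            instance = args[0]
            result = original_apply_async(args=args, kwargs=kwargs, **opts)
            ModelTask.objects.create(
                task_id=result.task_id,
                name=wrapped.name,
                content_type=ContentType.objects.get_for_model(instance),
                object_id=instance.pk,
            )
            return result

        def exists_for(instance):
            # Has a task of this name already been scheduled for this instance?
            return ModelTask.objects.filter(
                name=wrapped.name,
                content_type=ContentType.objects.get_for_model(instance),
                object_id=instance.pk,
            ).exists()

        def async_result(instance):
            record = ModelTask.objects.get(
                name=wrapped.name,
                content_type=ContentType.objects.get_for_model(instance),
                object_id=instance.pk,
            )
            return AsyncResult(record.task_id)

        wrapped.apply_async = apply_async
        wrapped.exists_for = exists_for
        wrapped.AsyncResult = async_result
        return wrapped
    return decorator
```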

That’s it.

The Use Case

Participation Model

This model contains the information of a User participating in an Event.
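A stripped-down version of such a model (the fields are illustrative) might look like:

```python
# models.py -- illustrative Participation model
from django.db import models
from django.contrib.auth.models import User


class Event(models.Model):
    name = models.CharField(max_length=255)
    end_time = models.DateTimeField()


class Participation(models.Model):
    user = models.ForeignKey(User)
    event = models.ForeignKey(Event)
```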

Task for sending an email to the participant:

```python
@model_task()
def send_email_on_participation_complete(participation):
    # code for sending the email
    ...
```
Scheduling the task:
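For example, to schedule the email for a later time (the eta below is illustrative; in practice it would come from the event's end time or the finish-test trigger):

```python
from datetime import datetime, timedelta

# Schedule the email to go out an hour from now (illustrative eta)
send_email_on_participation_complete.apply_async(
    args=[participation],
    eta=datetime.utcnow() + timedelta(hours=1),
)
```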

Check if the task associated with a participation object was already scheduled:
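Assuming exists_for takes the model instance, as in the decorator sketch above:

```python
# Only schedule if no task is already pending for this participation object
if not send_email_on_participation_complete.exists_for(participation):
    send_email_on_participation_complete.apply_async(args=[participation])
```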

Get the AsyncResult object:
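Assuming the overridden AsyncResult takes the model instance rather than a raw task_id, as in the sketch above, this is handy for inspecting or revoking the scheduled task:

```python
result = send_email_on_participation_complete.AsyncResult(participation)
result.state     # e.g. 'PENDING', 'STARTED', 'SUCCESS'
result.revoke()  # cancel the scheduled email if it is no longer needed
```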

All this replaced the cron jobs, custom scripts, and some manual work with a robust task (email) scheduling mechanism. It also opened the door to many other kinds of tasks on top of the Django-Celery architecture we set up. This will certainly make us more efficient and help us focus on other core products while asynchronous tasks run on their own.

P.S. I am an undergraduate student at IIT Roorkee. You can reach out to me at shubham@hackerearth.com with any suggestions, bugs, or improvements. You can also find me @ShubhamJain.

This post was originally written for the HackerEarth Engineering blog by Shubham Jain, Summer Intern 2013 @HackerEarth.


