
Django-Celery-SQS-AWSEB

Deploying a Django application with Celery and Redis as the broker on AWS Elastic Beanstalk.

UPDATE: An ElastiCache Redis instance on AWS can cost much more than SQS, so I have switched from Redis to SQS. SQS also includes 1 million free requests every month.

CAVEAT: The one thing the SQS Celery broker lacks is a result backend, which is not yet available for SQS. This basically means you cannot store the results returned from your tasks anywhere:

    result = some_task.delay()
    result.get()  # raises error -> Result backend 'sqs' not found.
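One way to live without a result backend is to have each task persist its own output. Here is a minimal sketch of that pattern; the names (`RESULTS`, `add_and_store`) are hypothetical, and in a real Django app the dict would be a model save:

```python
# Fire-and-forget pattern for SQS: since there is no result backend,
# the task body persists its own output instead of returning it.
RESULTS = {}  # stand-in for a real store, e.g. a Django model table


def add_and_store(task_id, x, y):
    """Body of a Celery task: compute, then save the result yourself."""
    RESULTS[task_id] = x + y  # in a real app: SomeModel.objects.create(...)
    # nothing meaningful to fetch via result.get()


add_and_store("job-1", 2, 3)
print(RESULTS["job-1"])  # -> 5
```

The caller then looks the result up by its own key (here `"job-1"`) instead of calling `result.get()`.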

Getting Started

To get started, you first need to launch an Elastic Beanstalk environment inside a VPC. I assume you already have a VPC; if not, create one.

NOTE: AWS RDS won't create a Subnet Group until you have subnets in two availability zones. Go to VPC Console > Subnets and create two new subnets in availability zones different from your default subnets. If you have any problems with the IPv4 CIDR block, enter 10.0.2.0/24 and 10.0.3.0/24. If you have another range, for example 35.23.2.0/24, use that instead, but these CIDR blocks must not overlap with the existing ones.
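You can verify the no-overlap requirement before touching the console with nothing but Python's standard library; the CIDR values below are just the example ranges from the note above:

```python
# Check that candidate subnet CIDR blocks do not overlap the existing ones,
# using only the standard library's ipaddress module.
import ipaddress

existing = [ipaddress.ip_network("10.0.0.0/24"), ipaddress.ip_network("10.0.1.0/24")]
candidates = [ipaddress.ip_network("10.0.2.0/24"), ipaddress.ip_network("10.0.3.0/24")]

for new in candidates:
    # network.overlaps() is True when two ranges share any addresses
    assert not any(new.overlaps(old) for old in existing), f"{new} overlaps an existing subnet"

print("no overlaps")
```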

Now run the following commands for launching EB Environment inside newly created VPC.

  1. eb create test -db.engine postgres -db.user rootuser -db.pass rootpass --vpc --vpc.dbsubnets xxxxxxxxxxx,xxxxxxxxxxx,xxxxxxxxxxxxx,xxxxxxxxxxxx

    --vpc.dbsubnets -> the RDS subnets that you just created

  2. You will be prompted for some details about the VPC you just created, such as the VPC ID, which you can get from the VPC Console. Choose to assign a public IP address. For the EC2 instance subnet, enter the private subnet (defaults to 10.0.1.0/24), and for the EB subnet groups, enter the public subnet (defaults to 10.0.0.0/24).

  3. Assign an external/public load balancer, and for the security group, enter the default VPC security group newly created by your VPC, found under VPC Console > Security Groups.

  4. The EB Env should take about 5 to 10 minutes to finish and start.

  5. If you haven't set the WSGIPath setting, run eb config, find WSGIPath in the editor that appears, and replace the path with your Django project's wsgi.py file, e.g. WSGIPath: boilerplate/wsgi.py. Save the file and close the editor; the environment will start updating.

  6. Run eb status and copy the CNAME into your ALLOWED_HOSTS setting.

  7. Deploy the application with eb deploy, then run eb open.
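For step 6, a sketch of pulling the CNAME from an environment variable instead of hard-coding it in settings.py; the variable name `EB_CNAME` is my own assumption, not something EB sets for you:

```python
# settings.py sketch: read the EB CNAME from an environment variable.
# EB_CNAME is a hypothetical variable you would define yourself via
# EB Console > Configuration > Software.
import os

ALLOWED_HOSTS = ["127.0.0.1", "localhost"]

eb_cname = os.getenv("EB_CNAME")  # e.g. "test.ap-south-1.elasticbeanstalk.com"
if eb_cname:
    ALLOWED_HOSTS.append(eb_cname)
```

This way the same settings file works locally and on EB without edits after each environment rebuild.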

SQS

Head to the AWS SQS (Simple Queue Service) console and Create New Queue.

  1. Enter a name for the queue.

  2. A Standard Queue should work for most use cases.

  3. In Configure Queue you can keep the default settings; the Use SSE option (server-side encryption) is optional.

  4. Click Create Queue.

Copy the queue name and paste it into the CELERY_DEFAULT_QUEUE setting in your settings.py file. You can also use your Elastic Beanstalk environment variables to keep the queue name out of source control, as I did in my settings.

With this done, you also need an AWS IAM user with a role or group that has permission to access the Simple Queue Service. You can create one easily with programmatic access and use the generated credentials in your environment.

Define two environment variables, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, via EB Console > Configuration > Software.
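A small fail-fast helper can save you a confusing broker error later if one of those variables is missing. This is an optional sketch, not part of the repo; `missing_aws_env` is a name I made up:

```python
# Optional startup sanity check: report which required AWS variables
# are absent from the environment so you can fail with a clear message
# instead of an opaque boto/kombu connection error.
import os


def missing_aws_env():
    """Return the names of required AWS variables that are not set."""
    required = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")
    return [name for name in required if not os.getenv(name)]


# e.g. at the top of celeryapp.py:
#   if missing_aws_env():
#       raise RuntimeError(f"Missing env vars: {missing_aws_env()}")
```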

Create a file named celeryapp.py alongside your settings.py file with the same code this repo contains. You may need to change a few things, e.g.

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'boilerplate.settings')

boilerplate is my project name; yours might be different, so change it everywhere.
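Note that `os.environ.setdefault` only writes the value when the variable is not already defined, so an explicit `DJANGO_SETTINGS_MODULE` set in the EB environment always wins over the value hard-coded in celeryapp.py:

```python
# os.environ.setdefault leaves an existing value untouched, so the
# environment (e.g. EB configuration) can override the in-file default.
import os

os.environ.pop("DJANGO_SETTINGS_MODULE", None)  # start clean for the demo

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "boilerplate.settings")
print(os.environ["DJANGO_SETTINGS_MODULE"])  # -> boilerplate.settings

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "other.settings")
print(os.environ["DJANGO_SETTINGS_MODULE"])  # still boilerplate.settings
```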

In your settings define some more Celery configuration settings:

    CELERY_DEFAULT_QUEUE = os.getenv('CELERY_DEFAULT_QUEUE')  # name of your SQS queue

    AWS_ACCESS_KEY_ID = os.getenv('AWS_ACCESS_KEY_ID')
    AWS_SECRET_ACCESS_KEY = os.getenv('AWS_SECRET_ACCESS_KEY')

    CELERY_RESULT_BACKEND = None  # disable the result backend, since it is not supported with SQS

    BROKER_TRANSPORT_OPTIONS = {
        'polling_interval': 20,
        'region': 'ap-south-1',
    }
    BROKER_URL = "sqs://"

The BROKER_URL here is just "sqs://"; Celery does the rest itself by collecting the credentials needed to access this SQS queue from the environment variables we set above.
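If you ever embed the credentials in the broker URL instead of relying on the bare "sqs://", they must be percent-encoded first, because AWS secret keys often contain '/' and '+', which break URL parsing. A sketch using only the standard library (the key values below are made up):

```python
# Percent-encode AWS credentials before embedding them in an sqs:// URL.
# Real secret keys frequently contain '/' and '+', which are not URL-safe.
from urllib.parse import quote

access_key = "AKIAEXAMPLE"   # hypothetical values, never commit real keys
secret_key = "abc/def+ghi"

broker_url = "sqs://{}:{}@".format(
    quote(access_key, safe=""),
    quote(secret_key, safe=""),
)
print(broker_url)  # -> sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@
```

Sticking with the plain "sqs://" URL and environment variables, as this guide does, sidesteps the problem entirely.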

NOTE: I first named the file celery.py, which caused errors despite the absolute_import at the top of the file, so I finally renamed it to celeryapp.py and that worked.

You also need the .ebextensions folder from this repo to run the Celery worker, and don't forget to install and use a requirements.txt file the same as mine.

We use the gevent pool in the Celery worker by passing an extra argument:

    celery -A boilerplate worker -l info -P gevent --app=boilerplate.celeryapp:app

--app gives the location of our Celery app, since we renamed the file from celery.py to celeryapp.py.

I have tested everything and my repo works fine in an Elastic Beanstalk environment. If you face any problems, check out the repo code and adjust accordingly in case I missed something in the instructions, or open an issue.

django-celery-redis-awseb's People

Contributors: alexmhack, dependabot[bot]


django-celery-redis-awseb's Issues

Celery beat - Periodic tasks not running.

Hi, thank you very much for sharing this repo as it has helped me get celery set up with SQS for a project I am working on.

I wanted to add django-celery-beat to this setup, however, and I can't seem to get it to work. I create a periodic task in the admin and set it to run on an interval, but the task never seems to be sent to the queue. I am able to run tasks manually in the admin by selecting them and choosing the Run selected tasks option, which completes the task successfully.

All my other non-periodic tasks are sent to the queue successfully, so I think it is just an issue with the celery beat config.

In celery_configuration.txt I have uncommented the code, and I have also uncommented the code in 02_packages.config that you included, but I am not entirely sure I have set this up properly.

Here is how I have modified the celery_configuration.txt file:
celery_configuration.txt

And here is the 02_packages.config file:

container_commands:
  01_create_celery_beat_configuration_file:
    command: "cat .ebextensions/files/celery_configuration.txt > /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh && chmod 744 /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh && sed -i 's/\r$//' /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
  02_start_celery_beat:
    command: "/usr/local/bin/supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-beat"
    leader_only: true
  03_start_celery_worker:
    command: "/usr/local/bin/supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-worker"

Am I missing something or set up something incorrectly?
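A side note on the configuration above: the `sed -i 's/\r$//'` in the first container command strips Windows CRLF line endings from the generated script, which would otherwise break the shell's shebang handling. The same transform, sketched in Python:

```python
# Equivalent of the shell step `sed -i 's/\r$//'`: remove the carriage
# return that Windows editors add before each newline, so the resulting
# shell script runs cleanly on Linux.
def strip_crlf(text: str) -> str:
    return text.replace("\r\n", "\n")


print(strip_crlf("#!/bin/sh\r\nset -e\r\n"))
```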
