I have a Django (REST framework) API currently hosted on AWS Elastic Beanstalk, using the single Docker container platform in a web server environment, with the load balancing and auto scaling that EB provides.
I want to move work into background tasks, because the API handles many requests per second and that is causing many 5xx HTTP errors. After some research I found that Celery with SQS could do this for me, but I don't know how to build that scenario (I can run the API with Celery workers locally, but I don't know how to do it on AWS EB, or even in docker-compose).
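For reference, the local Celery wiring I have in mind looks roughly like the sketch below. This is an assumption about the setup, not my exact code: the project name, queue prefix, and region are placeholders, and the `sqs://` broker URL relies on AWS credentials coming from the environment or an IAM role.

```python
# celery_app.py -- minimal sketch; "myproject", the region, and the
# queue prefix are placeholders, not values from my real project.
from celery import Celery

app = Celery("myproject")
app.conf.update(
    broker_url="sqs://",            # credentials taken from env vars / IAM role
    broker_transport_options={
        "region": "us-east-1",          # placeholder region
        "queue_name_prefix": "myproject-",
    },
)

@app.task
def process_request(payload):
    # placeholder task: the slow work moved out of the request/response cycle
    ...
```

Locally I start a worker with `celery -A celery_app worker`, and the Django views enqueue work with `process_request.delay(...)`.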
Should I use an Elastic Beanstalk worker environment, docker-compose, or both?
Can I put Django and Celery together in a docker-compose setup and deploy it to an EB web environment, or do I need to change the environment tier in that case?
What changes do I need to make to my current project (Docker + Django + EB web tier) to run background tasks?
Note: the container image needs some extra configuration to connect to a legacy Oracle 11g DB (Instant Client 11g + cx_Oracle 5.3).
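The Oracle-related part of my image looks roughly like this sketch. The base image, paths, and zip file names are assumptions; the Instant Client 11.2 zips have to be downloaded from Oracle manually (they require accepting a license) and placed next to the Dockerfile.

```dockerfile
# Sketch only: zip names and paths are placeholders for my actual setup.
FROM python:2.7

# libaio is required at runtime by the Instant Client shared libraries
RUN apt-get update && apt-get install -y libaio1 unzip

COPY instantclient-basic-linux.x64-11.2.0.4.0.zip /tmp/
COPY instantclient-sdk-linux.x64-11.2.0.4.0.zip /tmp/
RUN unzip /tmp/instantclient-basic-*.zip -d /opt/oracle && \
    unzip /tmp/instantclient-sdk-*.zip -d /opt/oracle && \
    ln -s /opt/oracle/instantclient_11_2/libclntsh.so.11.1 \
          /opt/oracle/instantclient_11_2/libclntsh.so

ENV ORACLE_HOME=/opt/oracle/instantclient_11_2
ENV LD_LIBRARY_PATH=$ORACLE_HOME

RUN pip install cx_Oracle==5.3
```

Whatever the answer is (worker tier, docker-compose, or both), the Celery worker container would need this same Oracle setup, since the tasks also talk to the DB.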