Setting up both Celery and Flask inside docker-compose
The issue I needed to resolve was moving a heavy task into the background and then running it periodically or asynchronously. Most importantly, the solution had to support Python.
Celery seemed like a perfect fit for my issue. However, a new problem arose: I needed the task to be triggered by Flask, packaged with Docker, and run by docker-compose, and it was hard to find resources covering all of these pieces at the same time.
So let's get started!
This article draws on the following resources:
- Asynchronous Tasks with Flask and Redis Queue
- The web-service for extracting dominant color from images.
- Using Celery with Flask
- Flask + Celery tutorial (Mandarin resource)
- Task scheduling with Celery (Mandarin resource)
App Structure
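The exact layout is not critical; a structure consistent with the files discussed in this article (requirements.txt is my own assumption) could look like:

```
.
├── docker-compose.yml
├── Dockerfile
├── requirements.txt
├── app.py          # Flask app that triggers the task and checks its status
└── celery_app.py   # Celery instance plus the asynchronous / scheduled tasks
```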
Set up Redis and the Celery worker in docker-compose; the breaking point is setting the Celery worker's app name correctly, as shown in the sketch below.
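A minimal docker-compose.yml consistent with this setup might look like the following sketch. The service names (redis, web, worker), the environment variable names, and the module name celery_app are my assumptions; the crucial part is that the worker's -A argument points at the module that actually holds the Celery instance.

```yaml
version: "3"

services:
  redis:
    image: redis:6-alpine

  web:
    build: .
    command: python app.py
    ports:
      - "5000:5000"
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/0
    depends_on:
      - redis

  worker:
    build: .
    # The breaking point: -A must name the module that holds the Celery app
    # (celery_app.py here), otherwise the worker cannot find the registered tasks.
    command: celery -A celery_app.celery worker --loglevel=info
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/0
    depends_on:
      - redis
```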
Here is what the Dockerfile in this example looks like.
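A minimal Dockerfile sketch, assuming the web and worker services are built from the same image and the dependencies live in a requirements.txt:

```dockerfile
FROM python:3.9-slim

WORKDIR /usr/src/app

# Install Python dependencies first to take advantage of layer caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code (app.py, celery_app.py, ...)
COPY . .

EXPOSE 5000

# Default command; docker-compose overrides it for the worker service
CMD ["python", "app.py"]
```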
Initialize the Celery worker and create the asynchronous or scheduled tasks in celery_app.py.
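A sketch of celery_app.py under the assumptions above, showing one asynchronous task plus an optional beat schedule; the task name heavy_task and the 30-second interval are illustrative only.

```python
import os
import time

from celery import Celery

# Broker / backend URLs come from docker-compose; fall back to localhost for local runs
broker_url = os.environ.get("CELERY_BROKER_URL", "redis://localhost:6379/0")
backend_url = os.environ.get("CELERY_RESULT_BACKEND", "redis://localhost:6379/0")

celery = Celery("celery_app", broker=broker_url, backend=backend_url)


@celery.task(name="celery_app.heavy_task")
def heavy_task(seconds):
    """Simulate a long-running job that should not block Flask."""
    time.sleep(seconds)
    return f"slept for {seconds} seconds"


# Optional: run the task periodically with Celery beat (every 30 seconds here)
celery.conf.beat_schedule = {
    "run-heavy-task-periodically": {
        "task": "celery_app.heavy_task",
        "schedule": 30.0,
        "args": (5,),
    }
}
```

Note that the beat schedule only fires if a beat process is running, for example by adding -B to the worker command or by adding a separate `celery -A celery_app.celery beat` service to docker-compose.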
Run the task from Flask and check its status in app.py.
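A corresponding sketch of app.py; the /run-task and /status/<task_id> routes are my assumptions, and the point is simply to enqueue the job with .delay() and look it up later with AsyncResult.

```python
from flask import Flask, jsonify, url_for

from celery_app import celery, heavy_task

app = Flask(__name__)


@app.route("/run-task")
def run_task():
    # Enqueue the job on the Celery worker instead of blocking the request
    result = heavy_task.delay(10)
    return jsonify({"task_id": result.id,
                    "status_url": url_for("task_status", task_id=result.id)})


@app.route("/status/<task_id>")
def task_status(task_id):
    # Look up the task state in the Redis result backend
    result = celery.AsyncResult(task_id)
    return jsonify({"task_id": task_id,
                    "state": result.state,
                    "result": str(result.result) if result.ready() else None})


if __name__ == "__main__":
    # Bind to 0.0.0.0 so the dev server is reachable from outside the container
    app.run(host="0.0.0.0", port=5000)
```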
After finishing all of the settings, we can run the command below in the terminal to build the containers and start them.
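Assuming the docker-compose.yml above sits at the project root, something like this builds the images and starts Redis, Flask, and the Celery worker together:

```sh
docker-compose up --build -d

# follow the worker's logs to confirm it registered the tasks
docker-compose logs -f worker
```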
Conclusion
The bottleneck in this case was that, at first, I could run Flask, Redis, and Celery separately in docker-compose, but when I tried to run them all with a single command, it failed. It took a lot of trial and error to find the breaking point: the correct command to run Celery inside docker-compose. Everything became clear after getting through this bottleneck.
Thanks for reading my first article! Leave a message under my Facebook post if you have any further questions. See you next time.