A working example of serving a machine-learning model with FastAPI and Celery, from this article. This fork includes a docker-compose file taken from here, modified to use only Redis; the RabbitMQ broker from the original example is dropped.
Run docker-compose build,
then docker-compose up,
then navigate to http://localhost:8000/docs. Example data for inference is in test_client.py.
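As a rough illustration of calling the service, here is a minimal client sketch. The endpoint path and feature names below are placeholders, not taken from this repo: the real example payload is in test_client.py, and the actual routes are listed at http://localhost:8000/docs.

```python
# Hypothetical client sketch; endpoint path and feature names are assumptions.
import json
from urllib import request

API_URL = "http://localhost:8000/predict"  # assumed route; confirm in /docs

def build_request(features: dict, url: str = API_URL) -> request.Request:
    """Package a feature dict as a JSON POST request for the model server."""
    body = json.dumps(features).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To send it once the stack is up:
#   with request.urlopen(build_request({"feature_a": 1.0})) as resp:
#       print(json.load(resp))
```

Substitute the payload from test_client.py for the placeholder features before sending a real request.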
To build the app image separately:
docker build -t ml-app .
Note: the requirements list is long, and I had to unpin several of the versions to get the image to build successfully.
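If you want to run the image on its own after building, something like the following should work; the port is assumed from the /docs URL above, and note the app still needs a reachable Redis broker (normally provided by docker-compose).

```shell
# Build the image (command from above), then run it standalone.
# -p 8000:8000 is an assumption based on the /docs URL; Redis must be
# reachable separately, e.g. via a shared Docker network.
docker build -t ml-app .
docker run --rm -p 8000:8000 ml-app
```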
The model is trained on a dataset from Kaggle. I also had to update the training notebook, since the dataset on Kaggle contains some malformed columns.