Dim Sum Classifier – from Data to App part 2

In the previous post, we saw how to acquire, process and clean the data, and train an image classifier to identify some yummy dim sums.

In this post, we shall complete the loop by developing the web app using starlette (a framework similar to flask, but with support for asynchronous IO), and setting up and automating the deployment of our web app with GitHub, Docker containers and Render.

The very helpful fast.ai course team and community have given us a quick start with the following resources:

Uploading the model

In part 1, we used learn.export to export a file named export.pkl (no surprises there). We will need to host it on a cloud service separately. One reason is that model files are typically large (ours is 98 MB) and do not work well with GitHub unless you use extensions like Git LFS. Another benefit is that we do not need to redeploy our Docker image when we update the model file.

For this app, we shall use Google Drive as the hosting service. As per the deployment guide, we shall use this Google Drive Direct Link Generator. This lets the model file be downloaded directly, bypassing the intermediate download prompts.
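The generator essentially rewrites the standard Drive sharing link into Drive's direct-download endpoint. A minimal sketch of that transformation (the function name is made up here; the URL format reflects what the generator produces for links of the form .../file/d/&lt;FILE_ID&gt;/view):

```python
import re

def gdrive_direct_link(share_url: str) -> str:
    """Convert a Google Drive sharing URL into a direct-download URL.

    Illustrative only: mirrors what the Direct Link Generator outputs.
    """
    match = re.search(r"/file/d/([\w-]+)", share_url)
    if not match:
        raise ValueError("Unrecognised Google Drive share link")
    return f"https://drive.google.com/uc?export=download&id={match.group(1)}"

# Example with a made-up file id:
url = gdrive_direct_link("https://drive.google.com/file/d/1AbC_xyz/view?usp=sharing")
# -> "https://drive.google.com/uc?export=download&id=1AbC_xyz"
```

The resulting URL is what goes into export_file_url in the next section.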

Customizing and testing the web app

Using the aforementioned quick-start template, we need to customize the application for our classifier. Below are some of the notable changes:

  1. Update the model’s direct link in export_file_url

This will be the link generated previously.

  2. Update class labels

You can get this list from the classes attribute of ImageDataBunch (the data.classes line in the previous post).

To provide a nicer output with Chinese characters, the labels were updated accordingly as below.

classes = ['char siu sou (叉烧酥)', 'chee cheong fun (猪肠粉)', 'har gow (虾饺)', 'lo bak go (萝卜糕)', 'siu mai (烧卖)']

  3. Change the prediction to use the numerical class index

A corresponding change is made to the line below so that the numerical class index is used to reference the labels above.

prediction = classes[int(learn.predict(img)[1])]
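In fastai, learn.predict returns a tuple of (predicted category, class-index tensor, probabilities), so element [1] is the index into our classes list. A hypothetical stand-in for that lookup, runnable without fastai:

```python
def label_for(prediction_idx: int, classes: list) -> str:
    """Map the class index returned by learn.predict(img)[1] to a display label."""
    return classes[int(prediction_idx)]

classes = ['char siu sou (叉烧酥)', 'chee cheong fun (猪肠粉)', 'har gow (虾饺)',
           'lo bak go (萝卜糕)', 'siu mai (烧卖)']

label_for(2, classes)  # 'har gow (虾饺)'
```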

Before running the app locally, we need to ensure the dependencies are installed by executing the command:

pip install -r requirements.txt

If fastai is already set up in your environment, you only need to install the remaining packages such as starlette and uvicorn.

Run the app locally by executing the command below and open localhost:5000 in your browser:

python app/server.py serve
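The serve argument works because server.py only starts the uvicorn server when it is passed on the command line. A simplified sketch of that guard (the uvicorn call and port 5000 follow the quick-start template; the helper function is my own naming):

```python
import sys
# import uvicorn                                   # installed via requirements.txt
# from starlette.applications import Starlette
# app = Starlette()                                # routes omitted for brevity

def should_serve(argv: list) -> bool:
    """Return True when the CLI asks us to start the web server."""
    return "serve" in argv

if __name__ == "__main__" and should_serve(sys.argv):
    # uvicorn.run(app=app, host="0.0.0.0", port=5000, log_level="info")
    pass
```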

You can see the outputs of the local instance below.

Once tested, make sure you push your changes to GitHub for deployment later. Remember to remove the model file from the models/ folder.

Setting up your Render account and deploying the app

From the Render Deployment guide, click on the Sign up link and proceed to sign up for an account. You do not need a credit card.

Using the Render service gives us benefits like cloud hosting for the application and CI/CD capabilities such as automatically rebuilding and deploying Docker containers upon commits to the target branch.

After verifying your email and logging into the account, select New Web Service from the Services page.

On the next page, click on Connect GitHub to grant Render access.

After signing into your GitHub account, you will need to choose your account and select your target repository. In this case, we choose the personal account and grant Render access to only the dimsum_classifier_fastai repository.

After linking your repository, set up the web service configuration. In this case the name is dimsum_fastai (this will be your subdomain) and the Environment selected is Docker, which will use the Dockerfile within the repository to build the container image. Ensure that you choose the correct branch; here it is master. Without credit card information, only the Starter plan can be chosen.
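For reference, the Dockerfile in the quick-start template looks roughly like the following (the base image, system packages and port are assumptions; check the actual file in the repository):

```dockerfile
FROM python:3.7-slim

RUN apt-get update && apt-get install -y git python3-dev gcc \
    && rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --upgrade -r requirements.txt

COPY app app/

# The app listens on port 5000; Render routes external traffic to it
EXPOSE 5000
CMD ["python", "app/server.py", "serve"]
```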

After clicking on Create Web Service, you will see a command-line screen showing the build process (building the Docker container, installing dependencies, pushing the image to the container registry, setting up the web service, etc.). Wait for the process to finish (you will see the status turn to ‘live’).

Once live, you will be able to access the app via https://<subdomain name>.onrender.com. In our case, it will be https://dimsum-fastai.onrender.com (note that the underscore has been changed to hyphen).

Below are the images when accessing the live link.

Stop (Suspend) & Delete Web Service

To take down the web service after testing, you can delete or suspend the web service. In your service, click on Settings and scroll to the bottom. There you can select either Delete Web Service or Suspend Web Service. Please see the images below.


Across these 2 posts, we demonstrated the entire life cycle from data acquisition to application and model deployment. Some elements can be improved to make the app more 12-factor compliant, such as managing configurations like the model URL in the environment, and converting the prediction functionality into a separate REST API.

Code links can be found below: