
Deploying a Fashion-MNIST web app with Flask and Docker

Reading Time: 10 minutes

Note: folder with all relevant code.

Disclaimer: This post doesn’t contain anything revolutionary. You can find top-notch (way better than mine) material on Docker deployments pretty much anywhere these days. Nevertheless, one of my life mottos is “you don’t learn it until you do it”. Reading a cool post is not enough. I had to test it myself. This is what this post is all about: training a simple model, making it available via a web app, and shipping it in a Docker container. Enjoy.

Introduction

I have a confession to make. I have never really used Docker.

I say “really”, as I did actually productionize a couple of models using Azure Container Instances in Databricks during my tenure at Mash. Nevertheless, I did that via the azureml Python SDK, so I technically never got my hands dirty with Dockerfiles, and never quite got the chance to open the black box that Docker was to me. This had to change, so a couple of days ago I decided it was time to

  1. learn what all this fuss around Docker was about,
  2. and to deploy a toy ML model in Flask on top of Docker.

The first goal (learning) was achieved mainly by watching this true gem on YouTube. Don’t be intimidated by the 2-hour-long video. It is incredibly clear and well structured, and at some point you will just want to keep going. It is, by far, the best resource for complete Docker beginners I was able to find. Several other posts also proved useful in different ways.

Action plan

I achieved the second goal (deployment) by putting together a Fashion-MNIST image classifier in Keras and making it available via a Flask web application. The action plan for this part is the following:

  1. Train and save a Fashion-MNIST model in a Jupyter notebook on a Windows machine (my laptop – no GPU).
  2. Write a Flask app that loads the saved model and exposes a predict endpoint on localhost (http://127.0.0.1:5000/predict).
  3. Test the local endpoint directly on Windows.
  4. Create and build a Docker image wrapping the model and the web app.
  5. Run a container on top of the image, on Windows, and test that the endpoint works as it did in #3.
  6. Push the image to Docker Hub.
  7. Spin up a Linux EC2 instance on AWS.
  8. Pull the image from Docker Hub.
  9. Run a container on top of the image, on Linux, and test that the endpoint works. If this is the case, we have managed to profit from the Docker magic! From Windows to Linux in 2 lines of code.

Docker magic

Let’s dive in and check out each step one by one.

Train the model

I copy-pasted almost the entire code from fchollet’s Keras MNIST example, editing it a bit and creating a personal notebook. The process consists of loading and preprocessing the Fashion-MNIST dataset, training a simple CNN, and saving the model to disk.
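The notebook boils down to something like the sketch below. The exact architecture here is an illustrative simple CNN in the spirit of fchollet’s example, not necessarily the one I trained, and I fit on a tiny subset for a single epoch just to keep the sketch quick; the real run uses the full 60k training images for several epochs.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Load Fashion-MNIST and preprocess: scale pixels to [0, 1], add a channel axis.
(x_train, y_train), _ = keras.datasets.fashion_mnist.load_data()
x_train = np.expand_dims(x_train.astype("float32") / 255, -1)

# A simple CNN: two conv blocks, dropout, and a 10-way softmax head.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=(3, 3), activation="relu"),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Conv2D(64, kernel_size=(3, 3), activation="relu"),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam", metrics=["accuracy"])

# Tiny subset / single epoch to keep this sketch fast; the notebook
# trains on the full dataset.
model.fit(x_train[:512], y_train[:512], batch_size=128, epochs=1, verbose=0)

# Save the model to disk, so app.py can later reload it by folder name.
model.save("fashion_mnist")
```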

Write the Flask app

This is quite standard too. Nothing fancy, just an app.py file in which we load the model from disk and implement the inference logic. A predict function gets invoked each time we hit the http://127.0.0.1:5000/predict endpoint with a POST request.

from flask import Flask, request
import numpy as np
from tensorflow import keras

app = Flask(__name__)

# Map the network's output index to the Fashion-MNIST class name.
id2class = {0: "T-shirt/top",
            1: "Trouser",
            2: "Pullover",
            3: "Dress",
            4: "Coat",
            5: "Sandal",
            6: "Shirt",
            7: "Sneaker",
            8: "Bag",
            9: "Ankle boot"}

# Load the trained Keras model saved by the notebook.
model = keras.models.load_model("fashion_mnist")

@app.route('/predict', methods=['POST'])
def predict():
    # Expect a JSON payload with a 28x28 "image" field.
    parameters = request.get_json(force=True)
    im = np.array(parameters['image'])
    im = im.astype("float32") / 255    # same scaling used at training time
    im = np.expand_dims(im, -1)[None]  # shape (1, 28, 28, 1)
    return id2class[np.argmax(model.predict(im))]

if __name__ == '__main__':
    app.run(host='0.0.0.0')

Test the local endpoint on Windows

This is just to make sure we got everything right within the Flask app, and to avoid unnecessary headaches once inside Docker. We simply run the app in a terminal (CMD or PowerShell on Windows) and ping the endpoint with a POST request (from within the Jupyter notebook, in my case). If everything is fine, the cell should print “Prediction = <GARMENT CLASS>” (Shirt, for me).

python app.py

Write a Dockerfile and build a Docker image

Now we can move to Dockerizing our web application. In a nutshell, this means grabbing an empty box (Docker image) and filling it with all the instructions and materials Docker needs to run the app (python packages, model artifacts, Flask code). The instructions part is covered by a so-called Dockerfile. That’s a plain text file containing a sequence of commands you’d run if you had to set up the Fashion-MNIST application on a new machine. I won’t get into the details of those commands as they are covered in detail in the resources I have shared in the introduction. If you are familiar with the Unix command line you’ll immediately grasp what is going on though. Below, the Dockerfile I put together for my use case.

FROM ubuntu:18.04
LABEL maintainer="Francesco Pochetti"

# Update the base system.
RUN apt -qq -y update \
    && apt -qq -y upgrade

# Install Python 3.7 and pip, then make them the defaults.
RUN apt -y install python3.7 python3-pip
RUN python3.7 -m pip install --upgrade pip
RUN ln -s /usr/bin/python3.7 /usr/bin/python
RUN ln -s /usr/bin/pip3 /usr/bin/pip

# Sanity checks: confirm the interpreter and pip resolve as expected.
RUN python --version && which pip

# Copy the app into the image and install its dependencies.
COPY . /app
WORKDIR /app
RUN pip3 install -r requirements.txt

# Launch the Flask app when the container starts.
ENTRYPOINT ["python"]
CMD ["app.py"]

A couple of quick notes. The COPY . /app instruction copies everything in my local current Windows directory (the one I will run the docker build command from) into the /app folder inside the Docker image. This amounts to app.py, requirements.txt, and the fashion_mnist folder containing the Keras model’s artifacts. requirements.txt lists the python packages needed to execute the code inside app.py. Once the Dockerfile is ready, we can build the image, which means running the commands inside the Dockerfile to create a template with a frozen python environment. From a Windows PowerShell terminal, that is a one-liner (on repeated builds, Docker often hits its layer cache, which speeds things up considerably).
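For reference, a plausible requirements.txt for this app. I’m not pinning versions here, and these are simply the three packages app.py actually imports; your own file may pin exact versions for reproducibility:

```
flask
numpy
tensorflow
```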

docker build -t fashion_mnist .

Once the image builds successfully, we can check it was properly created by invoking the docker images command, which lists all the images stored on our system.

docker images

Run a Docker container on Windows to test the local endpoint

The next step is to spin up a container on top of the newly built image and try pinging the inference endpoint again. We ran this same test before. The key difference this time is that the Flask web app will be running inside the Docker container and NOT on the container’s host, i.e. my Windows laptop. For that to work, we have to map port 5000 on the container to port 5000 on the host, so that a request to http://127.0.0.1:5000/predict on Windows gets redirected to Docker. After executing docker run, we send a POST request to http://127.0.0.1:5000/predict and verify we (hopefully) get the same prediction as before. If that’s the case, we can celebrate. Some Docker magic just happened before our eyes!

docker run -p 5000:5000 --name=fmn fashion_mnist

This is all cool and exciting, but… why do we need that at all?

Here is one good reason. The Fashion-MNIST application is a true success and you are asked to put it in production on AWS for other people to use. You tell yourself the task is easy: it boils down to spinning up an EC2 instance and making sure your code runs over there as it did on your laptop. And that’s where the problems start. Without Docker, you’d have to reproduce your exact Windows environment on a (very likely) Linux machine. Good luck with that. Docker instead lets you download the Fashion-MNIST image we just built onto EC2 and run it inside a container. The entire python environment we put together on Windows gets recreated on Linux without typing a single line of code. Let’s see if this is actually the case and test it ourselves on AWS.

Push the image to Docker Hub

The first step in moving the code off my laptop (onto AWS) is to upload the fashion_mnist Docker image to Docker Hub, a platform where developers share their images with the public. First, create an account on Docker Hub, then fetch the ID of the image you want to push (you can run docker images to get that), then execute the following 3 commands

docker login --username=YOUR_USERNAME
docker tag IMAGE_ID YOUR_USERNAME/IMAGE_NAME_YOU_CHOOSE:TAG_YOU_CHOOSE
docker push YOUR_USERNAME/IMAGE_NAME_YOU_CHOOSE

Running them in Windows PowerShell results in my first Docker image being uploaded to Docker Hub (here). Now we can fire up a Linux machine on AWS and see if we manage to run the Fashion-MNIST classifier.

Spin up a Linux-based EC2 instance

This is just for testing purposes, so I spun up a tiny Linux t2.nano and installed Docker on it, following these instructions.
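Assuming an Amazon Linux 2 AMI, the Docker setup sketched in those instructions boils down to something like the following (run on the instance over SSH; the usermod line just lets ec2-user run docker without sudo after re-login):

```
sudo yum update -y
sudo amazon-linux-extras install docker -y
sudo service docker start
sudo usermod -a -G docker ec2-user
```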

Then, on the same instance, I wrapped a few lines of code into a fashion_mnist.py file, to be able to quickly ping the http://127.0.0.1:5000/predict endpoint once the app is up and running. The code is basically just a copy-paste of the last cell of this notebook, where im is a 28×28 nested list (a numpy array converted to plain Python lists) representing the same Shirt we previously used to test the app locally.

import requests
import json
im = [[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 24, 126, 81, 142, 119, 133, 0, 0, 0, 0, 0, 0, 2, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 44, 96, 103, 138, 95, 174, 201, 146, 126, 118, 99, 38, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 14, 118, 119, 155, 122, 149, 158, 136, 71, 57, 113, 75, 110, 150, 172, 11, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 138, 102, 87, 85, 82, 119, 87, 107, 63, 44, 102, 49, 72, 94, 81, 155, 14, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 30, 140, 87, 51, 90, 119, 76, 107, 82, 102, 138, 160, 126, 71, 93, 121, 183, 155, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 39, 130, 127, 150, 108, 109, 112, 72, 70, 98, 117, 89, 52, 90, 126, 116, 104, 107, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 82, 103, 122, 124, 81, 35, 150, 75, 99, 108, 70, 109, 94, 140, 104, 154, 142, 133, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 56, 112, 117, 123, 196, 98, 65, 82, 119, 44, 77, 142, 52, 80, 102, 141, 140, 124, 2, 0, 0, 0, 0], [0, 0, 0, 0, 0, 89, 130, 140, 93, 75, 122, 154, 130, 118, 108, 38, 141, 141, 81, 131, 124, 138, 147, 16, 0, 0, 0, 0], [0, 0, 0, 0, 0, 154, 123, 177, 196, 68, 99, 122, 102, 43, 117, 126, 63, 75, 67, 114, 183, 158, 154, 8, 0, 0, 0, 0], [0, 0, 0, 0, 0, 81, 93, 109, 216, 61, 113, 85, 149, 137, 167, 104, 122, 170, 114, 149, 234, 116, 128, 40, 0, 0, 0, 0], [0, 0, 0, 0, 0, 135, 87, 141, 221, 76, 110, 116, 118, 80, 127, 119, 223, 142, 58, 112, 230, 116, 219, 62, 0, 0, 0, 0], [0, 0, 0, 0, 0, 145, 145, 119, 186, 141, 82, 108, 103, 79, 105, 72, 114, 28, 87, 118, 210, 158, 140, 110, 0, 0, 0, 0], [0, 0, 0, 0, 3, 90, 89, 187, 178, 57, 45, 196, 108, 93, 172, 102, 133, 121, 153, 144, 220, 103, 155, 107, 0, 0, 0, 0], [0, 0, 0, 0, 5, 135, 104, 149, 164, 173, 127, 66, 107, 123, 56, 71, 140, 118, 94, 71, 204, 147, 149, 114, 0, 0, 0, 0], [0, 0, 0, 0, 19, 160, 112, 170, 209, 123, 165, 133, 98, 160, 109, 81, 172, 56, 130, 124, 210, 131, 142, 119, 0, 0, 0, 0], [0, 0, 0, 0, 43, 137, 153, 173, 62, 81, 90, 165, 178, 62, 118, 15, 133, 182, 20, 147, 216, 182, 191, 156, 0, 0, 0, 0], [0, 0, 0, 0, 20, 114, 138, 155, 121, 124, 
124, 82, 85, 150, 73, 98, 77, 68, 95, 77, 99, 179, 156, 99, 0, 0, 0, 0], [0, 0, 0, 0, 34, 118, 136, 211, 59, 90, 107, 130, 175, 108, 137, 124, 178, 212, 109, 107, 138, 237, 232, 138, 0, 0, 0, 0], [0, 0, 0, 0, 70, 191, 177, 200, 137, 124, 160, 119, 53, 31, 103, 165, 179, 66, 59, 98, 116, 238, 165, 181, 0, 0, 0, 0], [0, 0, 0, 0, 31, 110, 218, 184, 119, 161, 81, 141, 149, 126, 141, 59, 100, 84, 167, 177, 93, 254, 142, 107, 17, 0, 0, 0], [0, 0, 0, 0, 59, 105, 207, 212, 52, 128, 44, 201, 128, 84, 158, 132, 175, 110, 151, 87, 114, 219, 147, 158, 20, 0, 0, 0], [0, 0, 0, 0, 58, 131, 204, 196, 84, 247, 130, 63, 109, 147, 61, 52, 153, 107, 85, 133, 102, 205, 159, 119, 52, 0, 0, 0], [0, 0, 0, 0, 118, 155, 219, 151, 128, 110, 188, 141, 126, 124, 154, 76, 147, 123, 89, 186, 117, 150, 211, 132, 45, 0, 0, 0], [0, 0, 0, 0, 95, 126, 164, 72, 8, 44, 54, 91, 197, 68, 113, 116, 86, 177, 17, 87, 11, 53, 177, 94, 59, 0, 0, 0], [0, 0, 0, 0, 80, 76, 195, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 39, 170, 174, 70, 0, 0, 0], [0, 0, 0, 0, 93, 138, 168, 1, 0, 3, 1, 0, 0, 0, 0, 0, 0, 1, 0, 3, 0, 71, 219, 114, 76, 0, 0, 0], [0, 0, 0, 0, 0, 28, 11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 71, 7, 5, 0, 0, 0]]
data = {'image': im}
URL = 'http://127.0.0.1:5000/predict'

def predict():
    # POST the image to the Flask endpoint and print the predicted class.
    result = requests.post(URL, json.dumps(data))
    print("Prediction = " + result.text)

if __name__ == '__main__':
    predict()

We are ready to run some Docker magic.

Run the fashion_mnist Docker container on EC2

Now we just need to execute the following line, which first pulls the frapochetti/fashion_mnist:firsttry Docker image from Docker Hub and then runs a container (with the Flask app inside) on top of it (make sure Docker is running on the machine first, by invoking sudo service docker start).

sudo docker run -p 5000:5000 --name=fmn -it frapochetti/fashion_mnist:firsttry

Then, in a second separate terminal run

python fashion_mnist.py

which hopefully returns Prediction = Shirt.

Great! Docker kept its promise. We were able to run a web application entirely developed on Windows, with Flask and TensorFlow dependencies, on a Linux machine by literally executing a couple of lines of code. In a matter of minutes, we were up and running. Quite amazing.

Here you go, happy Dockerization everybody!
