docker + tensorflow_serving deployment of a TensorFlow model (environment preparation)

TensorFlow model deployment: provide an online model prediction service using flask + docker + tensorflow_serving.

Install flask, uwsgi, nginx, docker, tensorflow_serving

flask: lightweight web framework, easy to use.
nginx: handles high concurrency well.
apache: known for its stability.

sudo pip3 install flask
sudo apt-get install nginx
sudo pip3 install uwsgi
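To make flask's role concrete, here is a minimal sketch of the kind of service this series builds. The `/predict` route and payload shape are placeholders of my own, not the final code of this tutorial:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # In the finished service this handler would forward the input to
    # tensorflow_serving and return the model's prediction; here it
    # simply echoes the JSON body back.
    data = request.get_json(force=True)
    return jsonify({"inputs": data})

# In production this app runs behind uwsgi + nginx rather than with
# flask's built-in development server (app.run).
```

During development you can run it with `flask run`; uwsgi and nginx sit in front of it in production.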

docker: makes the model easy to deploy without affecting other environments
tensorflow_serving: the official model-serving system
(You could expose the API directly through tensorflow_serving, but here we still put flask in front.)

curl -fsSL | sudo apt-key add -
sudo apt-get update
sudo apt-get install docker-ce
docker ps #Check docker installation success
sudo su #Switch to root, ignore if already root
docker pull tensorflow/serving
#The ubuntu here is the blogger's username. Please change it to your own, or you can skip the next two steps
mkdir /home/ubuntu/tensorflow_serving 
cd /home/ubuntu/tensorflow_serving 

#clone tensorflow/serving
git clone
#tensorflow_serving ships with a toy test model (half_plus_two); if this runs, your environment is set up correctly.
#The source path below assumes the serving repo was cloned into /home/ubuntu/tensorflow_serving as above.
docker run -d -p 8500:8500 -p 8501:8501 --mount \
type=bind,source=/home/ubuntu/tensorflow_serving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu,target=/models/half_plus_two \
-e MODEL_NAME=half_plus_two -t --name ner tensorflow/serving
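Once the container is up, tensorflow_serving exposes a REST endpoint on the container's port 8501 at `/v1/models/half_plus_two:predict` (publish that port with `-p 8501:8501` if you want to curl it). The sketch below shows the JSON body that endpoint expects and what the toy model computes (y = 0.5x + 2); the helper function names are my own, not part of tensorflow_serving:

```python
import json

def predict_request(instances):
    # Body for: POST http://localhost:8501/v1/models/half_plus_two:predict
    # tensorflow_serving's REST API expects {"instances": [...]}.
    return json.dumps({"instances": list(instances)})

def half_plus_two(instances):
    # What the bundled toy model computes: y = 0.5 * x + 2.
    return [0.5 * x + 2.0 for x in instances]
```

For example, sending `{"instances": [1.0, 2.0, 5.0]}` to the endpoint should return predictions of 2.5, 3.0, and 4.5.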
#If no error occurs, docker prints the new container's ID (a long hex string).

#Run docker ps
docker ps
#The following results will appear.
CONTAINER ID        IMAGE                COMMAND                    CREATED             STATUS              PORTS                              NAMES
003ec8a7b0b4        tensorflow/serving   "/usr/bin/tf_serving..."   3 minutes ago       Up 3 minutes        0.0.0.0:8500->8500/tcp, 8501/tcp   ner

If all of this worked, your environment is configured. Next comes writing the flask code and exporting the model in pb (SavedModel) format, and finally deploying the model online.


Posted on Fri, 30 Aug 2019 21:44:48 -0700 by sentback