TensorFlow model deployment: serve online model predictions using flask + docker + tensorflow_serving.
flask: a lightweight web framework that is easy to use.
nginx: handles high concurrency well.
```bash
sudo pip3 install flask
sudo apt-get install nginx
sudo pip3 install uwsgi
```
docker: makes the model easy to deploy without affecting other environments.
tensorflow_serving: Google's official model serving system.
(You could expose the prediction API directly through tensorflow_serving, but here flask is still used as the front end; a minimal sketch follows below.)
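A minimal sketch of the flask front end this setup uses. The `/predict` route, port 5000, and the request payload shape are placeholders for illustration, not the project's final code:

```python
# Minimal flask front end (illustrative; route name and port are placeholders).
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()  # e.g. {"instances": [1.0, 2.0, 5.0]}
    # In the real service, this is where the request would be forwarded
    # to tensorflow_serving (gRPC on port 8500 or REST on port 8501).
    return jsonify({"received": payload})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```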
```bash
# Install docker
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
# Add the docker repository (apt-get cannot find docker-ce without it)
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
sudo apt-get update
sudo apt-get install docker-ce
docker ps   # check that docker installed successfully
sudo su     # switch to root; skip if you are already root
docker pull tensorflow/serving

# "ubuntu" here is the author's username; change it to your own,
# or skip the next two steps
mkdir /home/ubuntu/tensorflow_serving
cd /home/ubuntu/tensorflow_serving

# clone tensorflow/serving
git clone https://github.com/tensorflow/serving

# Serve a test model that ships with tensorflow/serving; if this succeeds,
# your environment has been deployed successfully.
# (The source path below follows from the clone location above.)
docker run -d -p 8500:8500 --mount \
type=bind,\
source=/home/ubuntu/tensorflow_serving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu,\
target=/models/half_plus_two \
-e MODEL_NAME=half_plus_two \
-t --name ner tensorflow/serving

# If no error occurs, a container id similar to the following is printed
003ec8a7b0b4fbf53159d0e1fe46162f35b2ab4707ec8782b331fbb33f39dc57

# Run docker ps
docker ps

# The following results will appear
CONTAINER ID   IMAGE                COMMAND                    CREATED         STATUS         PORTS                              NAMES
003ec8a7b0b4   tensorflow/serving   "/usr/bin/tf_serving..."   3 minutes ago   Up 3 minutes   0.0.0.0:8500->8500/tcp, 8501/tcp   ner
```
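To verify that the container actually serves predictions, you can query it over gRPC on the mapped port 8500. A minimal sketch, assuming `grpcio` and `tensorflow-serving-api` are installed via pip and that the half_plus_two test model exposes its default signature (input `x`, output `y`):

```python
# Query the half_plus_two test model over gRPC (port 8500 mapped above).
# Assumes: pip3 install grpcio tensorflow tensorflow-serving-api
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "half_plus_two"            # must match MODEL_NAME
request.model_spec.signature_name = "serving_default"
request.inputs["x"].CopyFrom(
    tf.make_tensor_proto([1.0, 2.0, 5.0], dtype=tf.float32))

response = stub.Predict(request, 10.0)               # 10-second timeout
print(response.outputs["y"])                         # y = 0.5*x + 2 -> 2.5, 3.0, 4.5
```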
If this runs successfully, your environment is fully configured. The next steps are to write the flask code and export your own model to the SavedModel (pb) format, which completes the online deployment.
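tensorflow_serving loads models from a version-numbered directory of SavedModel (pb) files. A minimal export sketch, assuming TF 2.x; the toy Keras model and the export path below are placeholders for your own model:

```python
# Export a trained model to the layout tensorflow_serving expects:
# /models/<model_name>/<version>/saved_model.pb plus a variables/ directory.
# The model and export path below are placeholders for illustration.
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

export_dir = "/home/ubuntu/models/ner/1"   # "1" is the model version number
tf.saved_model.save(model, export_dir)
```

The version directory matters: tensorflow_serving watches the model directory and serves the highest version it finds, so mount the parent directory (here /home/ubuntu/models/ner) to /models/<model_name> in the docker run command, just as with the half_plus_two test model above.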