TensorFlow Serving slim configuration walkthrough
Posted by 帅气的小王子
There are very few blog posts on TensorFlow Serving in Chinese, and not many elsewhere either; the main resource is the official documentation, which is fragmented and somewhat redundant, so I have consolidated everything into one article.
Previously, the way I used TensorFlow in production was to start a thrift service that loaded the model and handled requests continuously. That approach struggled with large batches of data and put heavy pressure on the machine, so I looked into deploying with TensorFlow Serving instead.
The service runs the inception v3 image-classification model from slim. I fine-tuned it via transfer learning to get a new classification model, which is used to predict labels or extract features for single images or batches of images.
This article assumes you have TensorFlow installed and have already trained your own inception v3 model via transfer learning with the slim code; it covers how to deploy that model on TensorFlow Serving.
TensorFlow version: r1.3, Python version: 2.7.8, no GPU
Step 1: Install TensorFlow Serving r1.3
Follow the official setup documentation: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/g3doc/setup.md
Prerequisites:
Install bazel
pip install tensorflow-serving-api
pip install grpcio
Installation steps:
1. git clone --recurse-submodules https://github.com/tensorflow/serving
cd serving
2. cd tensorflow
./configure
You have bazel 0.7.0- (@non-git) installed.
Please specify the location of python. [Default is /data0/home/wang16/local/python/bin/python]:
Found possible Python library paths:
/data0/home/wang16/local/python/lib/python2.7/site-packages/
/data0/install/caffe/python/
/data0/home/wang16/simba/trunk/src/content_analysis/
/data0/home/wang16/local/python/lib/python2.7/site-packages
Please input the desired Python library path to use. Default is [/data0/home/wang16/local/python/lib/python2.7/site-packages/]
Do you wish to build TensorFlow with jemalloc as malloc support? [Y/n]:
jemalloc as malloc support will be enabled for TensorFlow.
Do you wish to build TensorFlow with Google Cloud Platform support? [Y/n]: n
No Google Cloud Platform support will be enabled for TensorFlow.
Do you wish to build TensorFlow with Hadoop File System support? [Y/n]: n
No Hadoop File System support will be enabled for TensorFlow.
Do you wish to build TensorFlow with Amazon S3 File System support? [Y/n]: n
No Amazon S3 File System support will be enabled for TensorFlow.
Do you wish to build TensorFlow with XLA JIT support? [y/N]:
No XLA JIT support will be enabled for TensorFlow.
Do you wish to build TensorFlow with GDR support? [y/N]:
No GDR support will be enabled for TensorFlow.
Do you wish to build TensorFlow with VERBS support? [y/N]:
No VERBS support will be enabled for TensorFlow.
Do you wish to build TensorFlow with OpenCL support? [y/N]:
No OpenCL support will be enabled for TensorFlow.
Do you wish to build TensorFlow with CUDA support? [y/N]:
No CUDA support will be enabled for TensorFlow.
Do you wish to build TensorFlow with MPI support? [y/N]:
No MPI support will be enabled for TensorFlow.
Please specify optimization flags to use during compilation when bazel option "--config=opt" is specified [Default is -march=native]:
Add "--config=mkl" to your bazel command to build with MKL support.
Please note that MKL on MacOS or windows is still not supported.
If you would like to use a local MKL instead of downloading, please set the environment variable "TF_MKL_ROOT" every time before build.
Configuration finished
cd ..
3. bazel build -c opt --copt=-msse4.1 --copt=-msse4.2 --copt=-mavx --copt=-mavx2 --copt=-mfma --copt=-O3 tensorflow_serving/... (this step takes quite a while)
On CentOS this step may fail with: fatal error: stropts.h: No such file or directory
The fix is to create an empty file named stropts.h under /usr/include.
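For example, as root:
touch /usr/include/stropts.h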
4. bazel test -c opt tensorflow_serving/...
Step 2: Modify the code to support the new model
References: https://gyang274.github.io/docker-tensorflow-serving-slim/0x02b00.slim.inception.v4.html and https://github.com/gyang274/docker-tensorflow-serving-slim
Reference: https://www.tensorflow.org/serving/serving_inception
This is the hardest part: the official example code is sparse. Fortunately I found a closely related writeup to imitate, which got me through.
1. Back in the serving top-level directory, create a directory for the fine-tuned checkpoint:
mkdir -p tf_checkpoints/slim/my_inception_v3
Copy the checkpoint produced by your transfer learning into that directory.
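After copying, the directory typically contains the artifacts of a slim training run, for example (the step number and exact file set are illustrative):
tf_checkpoints/slim/my_inception_v3/
  checkpoint
  model.ckpt-10000.data-00000-of-00001
  model.ckpt-10000.index
  model.ckpt-10000.meta
  labels.txt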
2. Following the example, write the model-export code: tensorflow_serving/example/my_inception_v3_saved_model.py
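A minimal sketch of what such an exporter can look like, modeled on tensorflow_serving/example/inception_saved_model.py from the same release. NUM_CLASSES, the flag names, and the one-class-name-per-line labels.txt format are my assumptions, not necessarily what the original script used:

import os

import tensorflow as tf

from nets import inception                        # from tf_models/research/slim
from preprocessing import inception_preprocessing

slim = tf.contrib.slim

tf.app.flags.DEFINE_string('checkpoint_dir', '', 'Directory with the fine-tuned checkpoint.')
tf.app.flags.DEFINE_string('output_dir', '', 'Directory to write the SavedModel to.')
tf.app.flags.DEFINE_integer('model_version', 1, 'Version number for serving.')
tf.app.flags.DEFINE_integer('image_size', 299, 'Input size expected by inception v3.')
FLAGS = tf.app.flags.FLAGS

NUM_CLASSES = 5  # placeholder: the number of classes in your fine-tuned model


def preprocess_image(jpeg):
  # Decode one serialized JPEG and apply slim's eval-time inception preprocessing.
  image = tf.image.decode_jpeg(jpeg, channels=3)
  return inception_preprocessing.preprocess_image(
      image, FLAGS.image_size, FLAGS.image_size, is_training=False)


def main(_):
  with tf.Graph().as_default():
    # The serving input is a batch of serialized JPEG strings.
    jpegs = tf.placeholder(tf.string, shape=[None])
    images = tf.map_fn(preprocess_image, jpegs, dtype=tf.float32)

    with slim.arg_scope(inception.inception_v3_arg_scope()):
      logits, end_points = inception.inception_v3(
          images, num_classes=NUM_CLASSES, is_training=False)

    # Per-class probabilities sorted descending, plus the 1x1x2048
    # pre-logit features exposed by slim's inception_v3 end_points.
    scores, indices = tf.nn.top_k(tf.nn.softmax(logits), k=NUM_CLASSES)
    prelogits = end_points['PreLogits']

    # Map predicted class indices to label strings; assumes a labels.txt
    # with one class name per line, in label order, kept with the checkpoint.
    label_lines = open(os.path.join(FLAGS.checkpoint_dir, 'labels.txt')).readlines()
    classes = tf.gather(tf.constant(label_lines), indices)

    saver = tf.train.Saver()
    with tf.Session() as sess:
      saver.restore(sess, tf.train.latest_checkpoint(FLAGS.checkpoint_dir))

      # Export a SavedModel under output_dir/<model_version>/.
      export_path = os.path.join(FLAGS.output_dir, str(FLAGS.model_version))
      builder = tf.saved_model.builder.SavedModelBuilder(export_path)
      signature = tf.saved_model.signature_def_utils.build_signature_def(
          inputs={'images': tf.saved_model.utils.build_tensor_info(jpegs)},
          outputs={
              'classes': tf.saved_model.utils.build_tensor_info(classes),
              'scores': tf.saved_model.utils.build_tensor_info(scores),
              'prelogits': tf.saved_model.utils.build_tensor_info(prelogits),
          },
          method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)
      builder.add_meta_graph_and_variables(
          sess, [tf.saved_model.tag_constants.SERVING],
          signature_def_map={'predict_images': signature})
      builder.save()


if __name__ == '__main__':
  tf.app.run()

Note that reading labels.txt with readlines() keeps the trailing newline on each label, which would explain why the class strings in the response further below end in \n.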
Following the example, write the client code: tensorflow_serving/example/my_inception_v3_client.py
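A minimal sketch of the client, modeled on tensorflow_serving/example/inception_client.py from the same release (the flag names are mine):

from grpc.beta import implementations
import tensorflow as tf

from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2

tf.app.flags.DEFINE_string('server', 'localhost:9000', 'PredictionService host:port.')
tf.app.flags.DEFINE_string('image', '', 'Path to a JPEG image.')
FLAGS = tf.app.flags.FLAGS


def main(_):
  host, port = FLAGS.server.split(':')
  channel = implementations.insecure_channel(host, int(port))
  stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

  with open(FLAGS.image, 'rb') as f:
    data = f.read()

  # Build a PredictRequest; the model name must match --model_name on the
  # server, and the signature name must match the exporter above.
  request = predict_pb2.PredictRequest()
  request.model_spec.name = 'my_inception_v3'
  request.model_spec.signature_name = 'predict_images'
  request.inputs['images'].CopyFrom(
      tf.contrib.util.make_tensor_proto(data, shape=[1]))

  result = stub.Predict(request, 10.0)  # 10-second timeout
  print(result)


if __name__ == '__main__':
  tf.app.run()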
Following the reference documents, modify tensorflow_serving/example/BUILD to add targets for the two new scripts.
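In outline, two py_binary targets are added alongside the existing inception examples. The deps on the serving API protos mirror the existing inception_client target; the @tf_models_slim labels assume the external repository registered in the workspace.bzl step below, and the exact slim target names depend on slim's own BUILD file:

py_binary(
    name = "my_inception_v3_saved_model",
    srcs = ["my_inception_v3_saved_model.py"],
    deps = [
        "@org_tensorflow//tensorflow:tensorflow_py",
        "@tf_models_slim//:nets",
        "@tf_models_slim//:preprocessing",
    ],
)

py_binary(
    name = "my_inception_v3_client",
    srcs = ["my_inception_v3_client.py"],
    deps = [
        "//tensorflow_serving/apis:predict_proto_py_pb2",
        "//tensorflow_serving/apis:prediction_service_proto_py_pb2",
        "@org_tensorflow//tensorflow:tensorflow_py",
    ],
)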
Following the reference documents, modify tensorflow_serving/workspace.bzl so the slim code is visible to the serving build.
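Conceptually, the slim sources are registered as an external repository so the BUILD deps above can refer to them. A rough sketch; the repository name is mine, and depending on your checkout you may need new_local_repository or a WORKSPACE file inside the slim directory:

# Added inside tf_serving_workspace() in tensorflow_serving/workspace.bzl.
# "tf_models_slim" is an illustrative name; it must match the
# @tf_models_slim labels used in the example BUILD file above.
native.local_repository(
    name = "tf_models_slim",
    path = "tf_models/research/slim",
)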
Modify /tf_models/research/slim/BUILD: comment out every "//tensorflow" dependency (inside the serving workspace, TensorFlow is pulled in as the external repository @org_tensorflow, so the local //tensorflow labels would not resolve).
Compile:
bazel build -c opt --copt=-msse4.1 --copt=-msse4.2 --copt=-mavx --copt=-mavx2 --copt=-mfma --copt=-O3 tensorflow_serving/...
Export the model:
bazel-bin/tensorflow_serving/example/my_inception_v3_saved_model \
  --checkpoint_dir=tf_checkpoints/slim/my_inception_v3 \
  --output_dir=tf_servables/slim/my_inception_v3 \
  --model_version=1 \
  --image_size=299
The exported model lands in a numbered version directory, tf_servables/slim/my_inception_v3/1/, containing saved_model.pb plus a variables/ subdirectory (the standard SavedModel layout).
Step 3: Deploy the service
1. Start the model server:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --model_name=my_inception_v3 --model_base_path=$PWD/tf_servables/slim/my_inception_v3 --port=9000
2. Test the service:
python my_inception_v3_client.py
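With the flag names from the client sketch above (my naming, not necessarily the original script's), the invocation would look like:
python my_inception_v3_client.py --server=localhost:9000 --image=/path/to/test.jpg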
3. The response:
outputs {
  key: "classes"
  value {
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 5
      }
    }
    string_val: "BeautifulSignt\n"
    string_val: "Building\n"
    string_val: "Cartoon\n"
    string_val: "Society\n"
    string_val: "Trip\n"
  }
}
outputs {
  key: "prelogits"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 1
      }
      dim {
        size: 1
      }
      dim {
        size: 2048
      }
    }
    float_val: 0.29251652956
    float_val: 0.283589184284
    float_val: 0.0797990858555
    float_val: 0.427444338799
    float_val: 0.147782608867
    ...
    float_val: 0.784989655018
    float_val: 0.0607083588839
    float_val: 0.256205767393
    float_val: 0.177307203412
  }
}
outputs {
  key: "scores"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 5
      }
    }
    float_val: 0.995890200138
    float_val: 0.00202985620126
    float_val: 0.000434114364907
    float_val: 0.000334182434017
    float_val: 0.00018374362844
  }
}
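Rather than printing the whole PredictResponse, the client can pull the tensors out directly: outputs is a map from output name to TensorProto, and the flattened values live in its repeated float_val / string_val fields. A small sketch:

# Assuming `result` is the PredictResponse returned by stub.Predict(...).
scores = list(result.outputs['scores'].float_val)
labels = [s.strip() for s in result.outputs['classes'].string_val]
print(zip(labels, scores))  # e.g. [('BeautifulSignt', 0.9959...), ...]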