TensorFlow Serving directory structure explained
tf_serving/
|-- WORKSPACE
|-- tensorflow_serving/
|   |-- BUILD
|   |-- workspace.bzl
|   `-- example/
|       |-- BUILD
|       |-- imagenet_lsvrc_2015_synsets.txt
|       |-- imagenet_metadata.txt
|       |-- inception_client.cc
|       |-- inception_client.py
|       |-- inception_k8s.yaml
|       |-- inception_saved_model.py
|       |-- mnist_client.py
|       |-- mnist_input_data.py
|       `-- mnist_saved_model.py
|-- tensorflow/
|   |-- BUILD
|   |-- WORKSPACE
|   `-- tensorflow/
|       |-- BUILD
|       `-- workspace.bzl
`-- tf_models/
    |-- WORKSPACE
    |-- official/
    |-- tutorials/
    `-- research/
        `-- inception/
            |-- WORKSPACE
            `-- inception/
                |-- BUILD
                |-- inception_train.py
                |-- inception_model.py
                |-- inception_eval.py
                `-- inception_distributed_train.py
Interpretation:
The project folder is named "tf_serving", and the WORKSPACE file sits at the root of the "tf_serving" project folder.
*************************************************************************
tf_serving/WORKSPACE explained:
*************************************************************************
# Declare the workspace name, matching the project name
workspace(name = "tf_serving")

# Declare the name and path of the local repository 'tensorflow'
local_repository(
    name = "org_tensorflow",
    path = "tensorflow",
)

# Declare io_bazel_rules_closure
http_archive(……)

# Add the new TensorFlow Serving dependencies from workspace.bzl
load("//tensorflow_serving:workspace.bzl", "tf_serving_workspace")
tf_serving_workspace()

# Require a minimum Bazel version
load("@org_tensorflow//tensorflow:workspace.bzl", "check_version")
check_version("0.5.4")
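The http_archive(……) call is elided above. For orientation only, a WORKSPACE http_archive declaration generally has the following shape; the name io_bazel_rules_closure comes from the original, but the urls, sha256, and strip_prefix values below are placeholders, not the values actually pinned by TensorFlow Serving:

# Illustrative sketch only; all attribute values are placeholders.
http_archive(
    name = "io_bazel_rules_closure",
    urls = ["https://example.com/rules_closure/archive/COMMIT.tar.gz"],  # placeholder URL
    sha256 = "PLACEHOLDER_SHA256",                                       # placeholder checksum
    strip_prefix = "rules_closure-COMMIT",                               # placeholder prefix
)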
*************************************************************************
tf_serving/tensorflow_serving/workspace.bzl explained:
*************************************************************************
# TensorFlow Serving external dependencies that get loaded into the WORKSPACE file
load('@org_tensorflow//tensorflow:workspace.bzl', 'tf_workspace')

# All of TensorFlow Serving's external dependencies live here.
# workspace_dir is the absolute path to the TensorFlow Serving repo; if it is
# linked in as a submodule, the path should take the form '__workspace_dir__ + "serving"'
def tf_serving_workspace():
    native.new_local_repository(
        name = "inception_model",
        path = "tf_models/research/inception",
        build_file = "tf_models/research/inception/inception/BUILD",
    )

    tf_workspace(path_prefix = "", tf_repo_name = "org_tensorflow")

    # gRPC dependencies
    native.bind(
        name = "libssl",
        actual = "@boringssl//:ssl",
    )

    native.bind(
        name = "zlib",
        actual = "@zlib_archive//:zlib",
    )
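What native.bind buys you: it creates an alias under //external, so downstream BUILD files can depend on the bound name instead of the concrete external-repository label. The cc_library below is a hypothetical consumer written only to illustrate the pattern; its name and source file are made up:

# Hypothetical consumer target, for illustration only.
cc_library(
    name = "my_grpc_client",          # made-up target name
    srcs = ["my_grpc_client.cc"],     # made-up source file
    deps = [
        "//external:libssl",  # resolves to @boringssl//:ssl via the bind() above
        "//external:zlib",    # resolves to @zlib_archive//:zlib via the bind() above
    ],
)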
*************************************************************************
tf_serving/tensorflow_serving/BUILD explained:
*************************************************************************
# TensorFlow Serving description
package(
    default_visibility = ["//tensorflow_serving:internal"],
)

licenses(["notice"])  # open-source license marker

exports_files(["LICENSE"])

package_group(
    name = "internal",
    packages = [
        "//tensorflow_serving/...",
    ],
)

filegroup(
    name = "all_files",
    srcs = glob(
        ["**/*"],
        exclude = [
            "**/METADATA",
            "**/OWNERS",
            "g3doc/sitemap.md",
        ],
    ),
)
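The package_group plus default_visibility combination means targets in this tree are visible only to packages matching //tensorflow_serving/..., unless an individual target overrides its visibility. A hypothetical BUILD file outside that tree illustrates the effect; the target and dependency names below are assumed for the example:

# Hypothetical BUILD file in a package outside //tensorflow_serving/...
cc_library(
    name = "uses_serving_internals",       # made-up target name
    srcs = ["uses_serving_internals.cc"],  # made-up source file
    deps = [
        # This would fail with a visibility error: the dependency inherits
        # default_visibility = ["//tensorflow_serving:internal"], and this
        # package is not covered by "//tensorflow_serving/...".
        "//tensorflow_serving/core:servable_handle",  # assumed internal target, for illustration
    ],
)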
*************************************************************************
tf_serving/tensorflow/tensorflow/workspace.bzl explained:
*************************************************************************
# TensorFlow external dependencies that can be loaded in a WORKSPACE file
load(……)

def _is_windows():
    ……

def _get_env_var():
    ……

# Parse the Bazel version string from 'native.bazel_version'
def _parse_bazel_version():
    ……

# Check that the required version of Bazel is being used
def check_version():
    ……

# Temporary workaround to support including TensorFlow as a submodule
def _temp_workaround_http_archive_impl():
    ……

# Execute a command with the given arguments and call 'fail' if it exits with a non-zero code
def _execute_and_check_ret_code():
    ……

# Apply a patch file at the root of the repository
def _apply_patch():
    ……

# Download the repository and apply a patch at its root
def _patched_http_archive_impl():
    ……

# If TensorFlow is linked in as a submodule, path_prefix is no longer used;
# tf_repo_name is still under consideration
def tf_workspace():
    ……
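The version-check helpers are only summarized above. As a minimal sketch of the idea, and not the actual implementation (the real workspace.bzl also strips pre-release suffixes and handles a missing native.bazel_version), the check parses the running Bazel's version into a comparable tuple and fails if it is older than the required release:

# Sketch only; helper names carry a _sketch suffix to mark them as hypothetical.
def _parse_bazel_version_sketch(version_string):
    # "0.5.4" -> (0, 5, 4), compared lexicographically below
    return tuple([int(part) for part in version_string.split(".")])

def check_version_sketch(required_version):
    current = _parse_bazel_version_sketch(native.bazel_version)
    required = _parse_bazel_version_sketch(required_version)
    if current < required:
        fail("Bazel %s or newer is required, found %s" %
             (required_version, native.bazel_version))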
*************************************************************************
tf_serving/tensorflow/tensorflow/BUILD explained:
*************************************************************************
package(default_visibility = [":internal"]) licenses(["notice"]) exports_files([ "LICENSE", "ACKNOWLEDGMENTS", # leakr文件用于//third_party/cloud_tpu "leakr_badwords.dic", "leakr_badfiles.dic", ]) load("//tensorflow:tensorflow.bzl", "tf_cc_shared_object") load("//tensorflow/core:platform/default/build_config.bzl", "tf_additional_binary_deps", ) # 各种config setting config_setting() package_group() filegroup() py_library() filegroup( name = "all_opensource_files", data = [ ":all_files", "//tensorflow/c:all_files", "//tensorflow/cc:all_files", ……], visibility = [':__subpackages__'], ) load("//third_party/mkl:build_defs.bzl", "if_mkl", ) filegroup( name = "intel_binary_blob", data = if_mkl( [ "//third_party/mkl:intel_binary_blob", ], ), ) filegroup( name = "docs_src", data = glob(["docs_src/**/*.md"]), ) tf_cc_shared_object( …… )