triton-inference-server fails to start with "Invalid argument: unexpected inference output"

Posted by 修炼之路



Error message

When starting tritonserver, it fails with the following error:

I0625 14:41:46.915214 1 cuda_memory_manager.cc:103] CUDA memory pool is created on device 0 with size 67108864
I0625 14:41:46.978097 1 model_repository_manager.cc:1065] loading: resnet152:1
I0625 14:42:16.968665 1 plan_backend.cc:365] Creating instance resnet152_0_gpu0 on GPU 0 (8.6) using model.plan
E0625 14:42:19.232329 1 model_repository_manager.cc:1242] failed to load 'resnet152' version 1: Invalid argument: unexpected inference output 'output', allowed outputs are: features
I0625 14:42:19.232502 1 server.cc:500] 
+------------------+------+
| Repository Agent | Path |
+------------------+------+
+------------------+------+

I0625 14:42:19.232575 1 server.cc:527] 
+-------------+-----------------------------------------------------------------+--------+
| Backend     | Path                                                            | Config |
+-------------+-----------------------------------------------------------------+--------+
| pytorch     | /opt/tritonserver/backends/pytorch/libtriton_pytorch.so         | {}     |
| tensorflow  | /opt/tritonserver/backends/tensorflow1/libtriton_tensorflow1.so | {}     |
| onnxruntime | /opt/tritonserver/backends/onnxruntime/libtriton_onnxruntime.so | {}     |
| openvino    | /opt/tritonserver/backends/openvino/libtriton_openvino.so       | {}     |
+-------------+-----------------------------------------------------------------+--------+

I0625 14:42:19.232630 1 server.cc:570] 
+-----------+---------+----------------------------------------------------------------------------------------------------+
| Model     | Version | Status                                                                                             |
+-----------+---------+----------------------------------------------------------------------------------------------------+
| resnet152 | 1       | UNAVAILABLE: Invalid argument: unexpected inference output 'output', allowed outputs are: features |
+-----------+---------+----------------------------------------------------------------------------------------------------+

Error cause analysis

The key line is UNAVAILABLE: Invalid argument: unexpected inference output 'output', allowed outputs are: features. This is an output tensor name mismatch: the output name declared in the model configuration file ('output') does not match the actual output name baked into the model ('features'). Triton reads the allowed output names from the serialized TensorRT plan itself, so the names in config.pbtxt must match them exactly.
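Before editing anything, it can help to confirm which names the config declares versus what Triton reported as allowed. The sketch below is a minimal, hypothetical check: it extracts output names from a config.pbtxt with a naive regex (not a real protobuf text-format parser, so it only handles simple configs like the one above) and compares them against the allowed names taken from the error log.

```python
import re

def output_names(config_text):
    # Naive sketch: grab the text after "output [" up to the first "]".
    # Good enough for a flat config like the one above; a real parser
    # would use Triton's ModelConfig protobuf instead.
    out_section = re.search(r"output\s*\[(.*?)\]", config_text, re.S)
    if not out_section:
        return []
    return re.findall(r'name:\s*"([^"]+)"', out_section.group(1))

# The output section from the failing config.pbtxt above.
config = '''
platform: "tensorrt_plan"
max_batch_size: 8
output [
  { name: "output" data_type: TYPE_FP32 dims: [ 2048,1,1 ] }
]
'''

# "allowed outputs are: features" -- copied from the Triton error log.
allowed = {"features"}
mismatched = [n for n in output_names(config) if n not in allowed]
print(mismatched)  # → ['output']
```

Any name printed here is one Triton will reject, which is exactly what the log shows for 'output'.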

Solution

Edit the model's configuration file, located in the Model Repository at models/resnet152/config.pbtxt:

  platform: "tensorrt_plan"
  max_batch_size: 8
  input [
    {
      name: "input"
      data_type: TYPE_FP32
      dims: [3,224,224]
    }
  ]
  output [
    {
      name: "output"
      data_type: TYPE_FP32
      dims: [ 2048,1,1 ]
    }
  ]

Change the name in the output section above from "output" to "features", then restart the server.
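After the fix, the output section of config.pbtxt would read as follows (only the name changes; the data type and dims stay as in the original):

```protobuf
  output [
    {
      name: "features"
      data_type: TYPE_FP32
      dims: [ 2048,1,1 ]
    }
  ]
```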
