ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider']
Posted by AI浩
While running an inference test with the GPU build of onnxruntime, the following error appeared:
Traceback (most recent call last):
  File "D:\cv\ConNext_demo\testonnx.py", line 57, in <module>
    rnet1 = ONNXModel(onnx_model_path)
  File "D:\cv\ConNext_demo\models\onnx.py", line 7, in __init__
    self.onnx_session = onnxruntime.InferenceSession(onnx_path)
  File "D:\ProgramData\Anaconda3\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 335, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "D:\ProgramData\Anaconda3\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 361, in _create_inference_session
    raise ValueError("This ORT build has {} enabled. ".format(available_providers) +
ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)
Solution:
Since ONNX Runtime 1.9, the providers argument must be passed explicitly when creating an InferenceSession. Change
onnxruntime.InferenceSession(onnx_path)
to:
onnxruntime.InferenceSession(onnx_path, providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'])
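A small defensive variant of the fix: hard-coding the provider list will still fail on a machine whose ORT build lacks, say, TensorRT, so it can help to filter the preferred providers against what the installed build actually reports. This is a minimal sketch; the helper name select_providers is hypothetical, and the commented usage assumes onnxruntime is installed.

```python
def select_providers(preferred, available):
    """Keep only the preferred execution providers that the ORT build
    actually supports, preserving the preferred priority order."""
    return [p for p in preferred if p in available]

# Usage sketch (assumes onnxruntime is installed and onnx_path is set):
# import onnxruntime
# providers = select_providers(
#     ["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
#     onnxruntime.get_available_providers(),
# )
# session = onnxruntime.InferenceSession(onnx_path, providers=providers)
```

Because CPUExecutionProvider is present in every ORT build, the filtered list is never empty, so the session falls back to CPU on machines without a GPU.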