Tags: python, pytorch, onnxruntime
I am trying to run inference with onnxruntime-gpu. I installed CUDA, cuDNN, and onnxruntime-gpu on my system and checked that my GPU is compatible (versions listed below).
When I start an inference session, I get the following warning:
>>> import onnxruntime as rt
>>> rt.get_available_providers()
['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider']
>>> rt.InferenceSession("[ PATH TO MODEL .onnx]", providers= ['CUDAExecutionProvider'])
2023-01-31 09:07:03.289984495 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:578 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Please reference https://onnxruntime.ai/docs/reference/execution-providers/CUDA-ExecutionProvider.html#requirements to ensure all dependencies are met.
<onnxruntime.capi.onnxruntime_inference_collection.InferenceSession object at 0x7f740b4af100>
However, if I import torch first, inference runs on my GPU, and once the inference session starts I can see my Python process listed under nvidia-smi:
$ python
Python 3.8.16 (default, Dec 7 2022, 01:12:06)
[GCC 11.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> import onnxruntime as rt
>>> sess = rt.InferenceSession("PATH TO MODEL . onnx", providers=['CUDAExecutionProvider'])
>>>
Does anyone know why this happens? The import order matters: if I import torch after onnxruntime, I get the same warning as when torch is not imported at all.
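For what it's worth, a minimal way to check whether the CUDA provider was actually attached to the session (rather than silently falling back to CPU) is to inspect the session after creating it. This is just a sketch; "model.onnx" is a placeholder path:

import torch  # importing torch first is what makes the CUDA provider load in my case
import onnxruntime as rt

sess = rt.InferenceSession("model.onnx", providers=["CUDAExecutionProvider", "CPUExecutionProvider"])
# get_providers() lists the providers actually attached to the session;
# if it only contains CPUExecutionProvider, the CUDAExecutionProvider failed to initialize
print(sess.get_providers())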
I checked the __init__ of the torch package and found the relevant code that loads libtorch_global_deps.so:
import ctypes
lib_path = '[ path to my .venv38]/lib/python3.8/site-packages/torch/lib/libtorch_global_deps.so'
ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)
$ python
Python 3.8.16 (default, Dec 7 2022, 01:12:06)
[GCC 11.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import ctypes
>>> lib_path = '[ path to my .venv38]/lib/python3.8/site-packages/torch/lib/libtorch_global_deps.so'
>>> ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)
>>> import onnxruntime as rt
>>> sess = rt.InferenceSession("PATH TO MODEL . onnx", providers=['CUDAExecutionProvider'])
>>>
Running the same lines manually before importing onnxruntime also does the trick.
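My guess (and it is only a guess) is that the effect comes from the CUDA runtime libraries that libtorch_global_deps.so links against being loaded with RTLD_GLOBAL. Below is a hedged sketch of preloading them directly, without torch; the exact sonames (libcudart.so.11.0, libcublas.so.11, libcudnn.so.8) and the "model.onnx" path are assumptions and depend on the installed CUDA/cuDNN versions:

import ctypes

# Assumption: preloading the CUDA runtime libraries globally reproduces what
# libtorch_global_deps.so appears to do. The sonames below are examples for
# CUDA 11 / cuDNN 8 and may differ on other installations.
for soname in ("libcudart.so.11.0", "libcublas.so.11", "libcudnn.so.8"):
    try:
        ctypes.CDLL(soname, mode=ctypes.RTLD_GLOBAL)
    except OSError:
        pass  # skip libraries that are not found under this soname

import onnxruntime as rt
sess = rt.InferenceSession("model.onnx", providers=["CUDAExecutionProvider"])  # placeholder path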
The Python packages are installed in a virtual environment.