Post by tin*_*ani

I get ValueError: invalid literal for int() with base 10: 'server' when I run nvidia-detector:

ubuntu@deeplearn-ubuntu:~$ nvidia-detector 
Traceback (most recent call last):
  File "/bin/nvidia-detector", line 8, in <module>
    a = NvidiaDetection(printonly=True, verbose=False)
  File "/usr/lib/python3/dist-packages/NvidiaDetector/nvidiadetector.py", line 73, in __init__
    self.getData()
  File "/usr/lib/python3/dist-packages/NvidiaDetector/nvidiadetector.py", line 163, in getData
    driver_version = self.__get_value_from_name(stripped_package_name)
  File "/usr/lib/python3/dist-packages/NvidiaDetector/nvidiadetector.py", line 92, in __get_value_from_name
    v = int(name)
ValueError: invalid literal for int() with base 10: 'server'
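From the traceback, NvidiaDetector strips a prefix from each NVIDIA package name and passes the remainder to int(), so a variant package such as a "-server" driver leaves a non-numeric suffix behind. The sketch below is a hedged illustration of that failure mode and one tolerant way to parse the version; the function name, the prefix, and the example package names are assumptions for illustration, not the actual NvidiaDetector code.

```python
def get_driver_version(package_name, prefix="nvidia-driver-"):
    """Hypothetical helper: extract the numeric driver version from a
    package name, skipping variant suffixes like 'server' instead of
    crashing the way int(name) does in nvidiadetector.py."""
    stripped = package_name[len(prefix):] if package_name.startswith(prefix) else package_name
    # Keep only the leading component (e.g. '390' from '390-server').
    head = stripped.split("-")[0]
    # int('390-server') would raise ValueError; guard with isdigit().
    return int(head) if head.isdigit() else None

print(get_driver_version("nvidia-driver-390"))         # 390
print(get_driver_version("nvidia-driver-390-server"))  # 390
print(get_driver_version("nvidia-driver-server"))      # None
```

In other words, the crash is a parsing bug triggered by a package name whose stripped remainder is the literal string 'server', not a problem with your GPU or driver installation.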

Tags: nvidia, nvidia-settings
