ubuntu@deeplearn-ubuntu:~$ nvidia-detector
Traceback (most recent call last):
  File "/bin/nvidia-detector", line 8, in <module>
    a = NvidiaDetection(printonly=True, verbose=False)
  File "/usr/lib/python3/dist-packages/NvidiaDetector/nvidiadetector.py", line 73, in __init__
    self.getData()
  File "/usr/lib/python3/dist-packages/NvidiaDetector/nvidiadetector.py", line 163, in getData
    driver_version = self.__get_value_from_name(stripped_package_name)
  File "/usr/lib/python3/dist-packages/NvidiaDetector/nvidiadetector.py", line 92, in __get_value_from_name
    v = int(name)
ValueError: invalid literal for int() with base 10: 'server'
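The traceback suggests `__get_value_from_name` expects the stripped package name to be a bare number, but a `-server` driver variant (e.g. `nvidia-driver-470-server`) leaves the literal token `server`, which `int()` rejects. A minimal sketch of a more tolerant parse (the helper name and package strings here are illustrative assumptions, not the NvidiaDetector API):

```python
# Hypothetical sketch: extract a numeric driver version from an Ubuntu
# NVIDIA package name, tolerating suffixed variants like "-server".
# The crash above happens when code equivalent to int("server") runs.

def get_driver_version(package_name):
    """Return the first integer token in the package name, or None.

    "nvidia-driver-470"        -> 470
    "nvidia-driver-470-server" -> 470  (suffix no longer crashes the parse)
    "nvidia-driver-server"     -> None (no numeric version present)
    """
    for token in package_name.split("-"):
        try:
            return int(token)
        except ValueError:
            continue  # skip non-numeric tokens such as "nvidia" or "server"
    return None

print(get_driver_version("nvidia-driver-470-server"))  # 470
```

In practice the crash is a packaging bug; workarounds reported for it include removing the `-server` driver packages or upgrading `ubuntu-drivers-common` to a fixed version.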