Deep Live Cam Runs On CPU but not GPU #1033
Replies: 1 comment
-
Why has this been closed? I can't find any duplicate anywhere.
________________________________
From: Kenneth Estanislao ***@***.***>
Sent: Monday, March 31, 2025 2:34 AM
To: hacksider/Deep-Live-Cam ***@***.***>
Cc: frankied1905 ***@***.***>; Author ***@***.***>
Subject: Re: [hacksider/Deep-Live-Cam] Deep Live Cam Runs On CPU but not GPU (Discussion #1033)
Closed #1033 as duplicate.
-
Hi,
I'm having trouble running Deep Live Cam on the GPU. It works fine on the CPU, but I've tried two completely different computers and hit exactly the same problem; the details are posted below. I have tried installing cuDNN (it may already have been installed, since doing so had no effect), and my CUDA version is the correct 11.8. The steps to go from CPU to GPU, simply installing CUDA and reinstalling onnxruntime, seem so straightforward that surely I'm not the only one with this issue. I've grown frustrated, and the output below doesn't mean much to me. Can somebody please help me?
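For reference, here is a minimal diagnostic sketch (the DLL names assume CUDA 11.x and cuDNN 8.x and are examples only) that checks whether the installed onnxruntime build exposes the CUDA provider at all, and whether Windows can load the CUDA/cuDNN runtime DLLs; LoadLibrary error 126, as seen in the output below, usually means one of those dependent DLLs is not on PATH:

# Diagnostic sketch, not a fix: see what the installed onnxruntime build exposes.
# The DLL names below assume CUDA 11.x / cuDNN 8.x and are examples only.
import ctypes
import onnxruntime as ort

print("onnxruntime:", ort.__version__, "build:", ort.get_device())
print("available providers:", ort.get_available_providers())
# A working GPU install lists 'CUDAExecutionProvider'; if only 'CPUExecutionProvider'
# appears, the plain "onnxruntime" package is installed instead of "onnxruntime-gpu".

# Error 126 means a dependent DLL could not be found, so try loading each one directly.
for dll in ("cudart64_110.dll", "cublas64_11.dll", "cudnn64_8.dll"):
    try:
        ctypes.WinDLL(dll)
        print(dll, "-> loads OK")
    except OSError as exc:
        print(dll, "-> NOT loadable:", exc)

If the CUDA provider is listed but the DLLs fail to load, the problem is the DLL search path rather than the onnxruntime install.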
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
Install the latest PowerShell for new features and improvements! https://aka.ms/PSWindows
PS C:\Windows\system32> cd c:\users\deep-live-cam
PS C:\users\deep-live-cam> python run.py --execution-provider cuda
2025-03-31 00:00:05.2779722 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
EP Error D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
when using ['CUDAExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
2025-03-31 00:00:05.3926089 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Python310\lib\tkinter_init_.py", line 1921, in call
return self.func(*args)
File "C:\Python310\lib\site-packages\customtkinter\windows\widgets\ctk_button.py", line 554, in _clicked
self._command()
File "C:\users\deep-live-cam\modules\ui.py", line 351, in
command=lambda: webcam_preview(
File "C:\users\deep-live-cam\modules\ui.py", line 792, in webcam_preview
create_webcam_preview(camera_index)
File "C:\users\deep-live-cam\modules\ui.py", line 911, in create_webcam_preview
source_image = get_one_face(cv2.imread(modules.globals.source_path))
File "C:\users\deep-live-cam\modules\face_analyser.py", line 28, in get_one_face
face = get_face_analyser().get(frame)
File "C:\users\deep-live-cam\modules\face_analyser.py", line 22, in get_face_analyser
FACE_ANALYSER = insightface.app.FaceAnalysis(name='buffalo_l', providers=modules.globals.execution_providers)
File "C:\Python310\lib\site-packages\insightface\app\face_analysis.py", line 31, in init
model = model_zoo.get_model(onnx_file, **kwargs)
File "C:\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
model = router.get_model(providers=providers, provider_options=provider_options)
File "C:\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
session = PickableInferenceSession(self.onnx_file, **kwargs)
File "C:\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in init
super().init(model_path, **kwargs)
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 430, in init
raise fallback_error from e
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 425, in init
self._create_inference_session(self._fallback_providers, None)
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
PS C:\users\deep-live-cam>
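The error text itself points at the usual remedy: a CUDA 11.8 toolkit plus a matching cuDNN 8.x, with both bin folders on PATH before onnxruntime tries to load onnxruntime_providers_cuda.dll. As a rough workaround sketch (the install paths below are assumptions about a default layout and will need adjusting), the DLL search path can also be extended from Python before the first onnxruntime import:

import os

# Example install locations; these are assumptions, adjust them to the actual machine.
cuda_bin = r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\bin"
cudnn_bin = r"C:\Program Files\NVIDIA\CUDNN\v8.9\bin"

for d in (cuda_bin, cudnn_bin):
    if os.path.isdir(d):
        os.add_dll_directory(d)                                   # Python 3.8+ DLL resolution
        os.environ["PATH"] = d + os.pathsep + os.environ["PATH"]  # classic PATH lookup as well

import onnxruntime as ort  # import only after the search path is set up
print(ort.get_available_providers())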