Can't get cuda to work. #983
+1, experiencing this issue as well
Ran into the same issue. There seems to be a mismatch somewhere. You can try this (bear in mind that I'm on CUDA 12):
pip install torch torchaudio --index-url https://download.pytorch.org/whl/cu121 --force-reinstall --no-cache-dir
Warning: Could not locate cudnn_ops_infer64_8.dll. Please make sure it is in your library path!
Important: since CTranslate2 4.4.0 only supports up to cuDNN 8.x, make sure to grab the 8.x version compatible with CUDA 12.
I think there's a simpler solution somewhere, but after searching for hours this is what I got.
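The "Could not locate cudnn_ops_infer64_8.dll" warning above usually just means the DLL's folder is not on the search path. A stdlib-only sketch for checking that yourself (the find_dll helper is mine, not part of WhisperX or cuDNN):

```python
import os
import shutil

def find_dll(name, extra_dirs=()):
    """Search PATH plus any extra directories for a DLL; return its path or None."""
    hit = shutil.which(name)  # scans the directories listed in PATH
    if hit:
        return hit
    for d in extra_dirs:
        candidate = os.path.join(d, name)
        if os.path.isfile(candidate):
            return candidate
    return None

if __name__ == "__main__":
    dll = find_dll("cudnn_ops_infer64_8.dll")
    print(dll or "cudnn_ops_infer64_8.dll not found; add its folder to PATH")
```

If this prints the fallback message, pointing it at your cuDNN install directory via `extra_dirs` (or adding that directory to PATH) is the fix the warning is asking for.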
So CUDA 12.1 works? I've been holding off on even trying to install whisperx since it says it supports 11.8 and other versions are YMMV. I'm not going to downgrade my whole env for one component.
This works. FYI, the kernel will die if run from a Jupyter notebook.
Yes, it works. I've been using it with no problem.
Thanks, appreciate it.
Thanks @NefariousC, your solution worked. I have CUDA 12.5 installed on my Windows 11 machine and had a perfectly working WhisperX 3.1 conda environment until I stupidly decided to upgrade to 3.3.1. I ran into one problem after another, eventually deciding to trash the environment and create a new one, but of course the standard install doesn't work: even though you have installed torch 2.0.0, the install of whisperx from PyPI overwrites it with 2.6.0, and that is the CPU version. But your steps worked exactly. I made two slight mods: installed 12.4
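Since the fixes above hinge on matching the wheel index (cu118 vs cu121 vs cu124) to the installed toolkit, here is a minimal, stdlib-only sketch that reads the version out of `nvcc --version` and maps it to the tag used on download.pytorch.org. The helper names are mine, not part of WhisperX or PyTorch:

```python
import re
import subprocess

def cuda_toolkit_version(nvcc_output):
    """Extract the CUDA release number (e.g. '12.4') from `nvcc --version` text."""
    m = re.search(r"release (\d+\.\d+)", nvcc_output)
    return m.group(1) if m else None

def wheel_tag(version):
    """Map a toolkit version like '12.4' to the PyTorch index tag 'cu124'."""
    return "cu" + version.replace(".", "") if version else None

if __name__ == "__main__":
    try:
        out = subprocess.run(["nvcc", "--version"],
                             capture_output=True, text=True).stdout
    except FileNotFoundError:
        out = ""  # nvcc not on PATH
    print(wheel_tag(cuda_toolkit_version(out)) or "no CUDA toolkit found on PATH")
```

For example, a machine reporting `release 12.1` would map to `cu121`, matching the `--index-url https://download.pytorch.org/whl/cu121` used earlier in this thread.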
@zos474 No worries, I was in the same boat as you. I encountered some issues like OP after blindly upgrading WhisperX. After running into errors even after reinstalling, I had to dig into Google to see what this error was about.
I was so confused, because I thought I DID install the CUDA version, per what the setup docs said.
After surfing through forums, I remembered WhisperX uses the faster-whisper backend, which needs CTranslate2. Went back to the WhisperX docs and found this important bit: "Important: GPU execution requires the NVIDIA libraries cuBLAS 11.x and cuDNN 8.x to be installed on the system. Please refer to the CTranslate2 documentation." Looks like I missed that tidbit when I was just yolo-ing the update. Lesson learned for me too: thoroughly read the documentation before updating, I guess 😂.
Hi all. This is a PowerShell implementation of the patch-fix found at m-bain/whisperX#983.
Had a similar dependencies issue yesterday.
Create a venv with your Python 3.10:
python3.10 -m venv whisperx311
Then install whisperx==3.1.1:
E:\Codes\virtual_envs\whisperx311\Scripts\python -m pip install whisperx==3.1.1
Create the following requirements.txt:
Then:
E:\Codes\virtual_envs\whisperx311\Scripts\python -m pip install -r requirements.txt
At this point you should have no dependency resolver errors and a working whisperx venv (whisperx==3.1.1).
Whatever I do, I get "Torch not compiled with CUDA", even though I've followed the instructions and installed as written:
miniconda3\envs\whisperx\lib\site-packages\torch\cuda\__init__.py", line 310, in _lazy_init
raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
When running pip install whisperx, it installs torch without CUDA enabled. I'm running this inside the conda environment.
I'm not really sure how to get this to work, been trying for ages now.
It also installs torch 2.5.0, but the conda install is 2.0.0 before the "pip install whisperx" step in the description.
Is the setup in the description outdated?
Is there any difference between CPU and CUDA except maybe speed?
I ran whisperx on a movie with int8 enabled, and it was almost correct, but some timestamps were completely wrong, in random places.
I could not get float16 to work; trying CPU and float32 now.
I have a 4060 Ti.
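A quick way to tell which torch build pip actually gave you: wheels served from the PyTorch download index carry a local-version suffix such as +cpu or +cu118 in `torch.__version__` (you can print it with `python -c "import torch; print(torch.__version__)"`). Here is a small sketch classifying that string; the build_flavor helper is an illustration of mine, not a WhisperX or PyTorch API:

```python
def build_flavor(version):
    """Classify a torch version string by its local-version suffix, if any."""
    if "+" not in version:
        return "unknown (no build suffix)"
    suffix = version.split("+", 1)[1]
    if suffix == "cpu":
        return "cpu"
    if suffix.startswith("cu"):
        return "cuda " + suffix[2:]
    return suffix

# Example: a CPU-only wheel like the one the bare `pip install whisperx` pulled in
print(build_flavor("2.5.0+cpu"))    # prints "cpu"
print(build_flavor("2.0.0+cu118"))  # prints "cuda 118"
```

If this reports "cpu" (or `torch.version.cuda` is None), that explains the "Torch not compiled with CUDA enabled" assertion, and a forced reinstall from the matching cuXXX index, as described earlier in this thread, is the way out.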