
[Bug] Official linked XTTS_v2 Google Colab throws: libcublas.so.11 is not found #3779

Open
flocked opened this issue Jun 6, 2024 · 3 comments
Labels: bug (Something isn't working), wontfix (This will not be worked on but feel free to help)

Comments


flocked commented Jun 6, 2024

Describe the bug

When using the Google Colab notebook for XTTS_v2 that is linked in the official tutorial, an error is thrown after running the first two steps, opening the gradio UI via the public URL, uploading audio files, and creating the dataset:

Library libcublas.so.11 is not found or cannot be loaded

To Reproduce

  1. Run the first two steps of the Google Colab
  2. Open the gradio UI via the public URL
  3. Upload audio files
  4. Create the dataset. It will throw the error listed in the log.

Expected behavior

No response

Logs

Loading Whisper Model!
2024-06-06 02:52:48,192 [INFO] Processing audio with duration 15:04.526
Traceback (most recent call last):
File "/content/TTS/TTS/demos/xtts_ft_demo/xtts_demo.py", line 215, in preprocess_dataset
train_meta, eval_meta, audio_total_size = format_audio_list(audio_path, target_language=language, out_path=out_path, gradio_progress=progress)
File "/content/TTS/TTS/demos/xtts_ft_demo/utils/formatter.py", line 75, in format_audio_list
segments = list(segments)
File "/usr/local/lib/python3.10/dist-packages/faster_whisper/transcribe.py", line 426, in generate_segments
encoder_output = self.encode(segment)
File "/usr/local/lib/python3.10/dist-packages/faster_whisper/transcribe.py", line 610, in encode
return self.model.encode(features, to_cpu=to_cpu)
RuntimeError: Library libcublas.so.11 is not found or cannot be loaded
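
The RuntimeError comes from faster-whisper's CTranslate2 backend, which loads the CUDA 11 cuBLAS library at runtime. A minimal sketch (not part of the notebook) to confirm on the Colab VM that the library really is missing:

```python
# Quick check in a Colab cell: can the CUDA 11 cuBLAS library that
# CTranslate2 expects be loaded on this VM?
import ctypes

try:
    ctypes.CDLL("libcublas.so.11")
    print("libcublas.so.11 is present and loadable")
except OSError as err:
    print(f"libcublas.so.11 is missing or cannot be loaded: {err}")
```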

Environment

Google Colab: https://colab.research.google.com/drive/1GiI4_X724M8q2W-zZ-jXo7cWTV7RfaH-?usp=sharing
From official tutorial: https://docs.coqui.ai/en/stable/models/xtts.html

Additional context

No response

flocked added the bug label on Jun 6, 2024

flocked commented Jun 6, 2024

Adding the command !apt install libcublas11 to the notebook solves the problem.
!pip install transformers -U also has to be added so that the right version of transformers is used.
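
A minimal sketch of the extra notebook cell, based on the two commands above (running it before the cell that launches the demo is an assumption):

```python
# Workaround cell for the XTTS fine-tuning Colab, run before starting the demo.
# libcublas11 provides libcublas.so.11 (CUDA 11); upgrading transformers pulls
# in the version the demo expects.
!apt install -y libcublas11
!pip install --upgrade transformers
```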

saifulislam79 commented

This is a CUDA driver issue; please check whether the driver is 12.2 or another version. Make sure the installed CUDA version matches the libcublas.so.11 library that is expected, and that cuDNN is 8.x.x.
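
For reference, the driver, toolkit, and cuDNN versions on the Colab VM can be inspected with a quick sketch like the one below (assuming the PyTorch wheel the notebook installs). Note that libcublas.so.11 belongs to CUDA 11.x, while CUDA 12.x ships libcublas.so.12:

```python
# Sketch of a version check in a Colab cell (not part of the notebook).
!nvidia-smi      # driver version and the highest CUDA version it supports
!nvcc --version  # CUDA toolkit installed on the VM, if nvcc is present

import torch
print("torch CUDA:", torch.version.cuda)          # CUDA version the torch wheel targets
print("cuDNN:", torch.backends.cudnn.version())   # cuDNN bundled with that wheel
```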

stale bot commented Aug 2, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. You might also look at our discussion channels.

stale bot added the wontfix label on Aug 2, 2024