The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/train/train_mem.py", line 1, in <module>
from llava.train.train import train
File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/__init__.py", line 1, in <module>
from .model import LlavaLlamaForCausalLM
File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/model/__init__.py", line 1, in <module>
from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/model/language_model/llava_llama.py", line 23, in <module>
from transformers import AutoConfig, AutoModelForCausalLM, \
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1506, in __getattr__
value = getattr(module, name)
File "/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1505, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1517, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZNK3c106SymIntltEl
[2024-04-30 15:59:00,176] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141143
[2024-04-30 15:59:00,195] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141144
[2024-04-30 15:59:00,195] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141145
[2024-04-30 15:59:00,205] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141146
Best,
Orr
orrzohar changed the title from "Installation Issue --" to "Installation Issue" on May 1, 2024.
Thank you for your interest in our work. The error you are encountering indicates that flash-attention is not installed properly. The following should help.
Make sure that the CUDA version on your machine matches the CUDA version PyTorch was built with, then reinstall flash-attention.
If that does not solve the issue, install flash-attention from source following the instructions below:
git clone https://github.com/HazyResearch/flash-attention.git
cd flash-attention
python setup.py install
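As a quick sanity check before rebuilding, you can compare the system CUDA toolkit version against the CUDA version PyTorch was built with. This is a hedged sketch, not an official diagnostic: it assumes `nvcc` is on your PATH and that `python` resolves to the environment from the traceback (here `llavapp2`); the reinstall commands it prints are one common way to force flash-attn to rebuild against the local toolchain rather than use a prebuilt wheel.

```shell
# Parse the toolkit version out of `nvcc --version`
# (e.g. "Cuda compilation tools, release 12.1, V12.1.105" -> "12.1"):
toolkit_cuda=$(nvcc --version 2>/dev/null | sed -n 's/.*release \([0-9.]*\),.*/\1/p')

# CUDA version PyTorch was built against:
torch_cuda=$(python -c 'import torch; print(torch.version.cuda)' 2>/dev/null)

echo "toolkit CUDA: ${toolkit_cuda:-not found}"
echo "torch CUDA:   ${torch_cuda:-not found}"

if [ -n "$toolkit_cuda" ] && [ "$toolkit_cuda" != "$torch_cuda" ]; then
    # A mismatch like this is a common cause of "undefined symbol" errors
    # in flash_attn_2_cuda: the extension was compiled against a different
    # PyTorch/CUDA ABI. Rebuilding from source usually resolves it:
    echo "Version mismatch -- consider rebuilding flash-attn:"
    echo "  pip uninstall -y flash-attn"
    echo "  pip install flash-attn --no-build-isolation --no-cache-dir"
fi
```

If both versions print and agree, the problem more likely lies in a stale wheel cached by pip, in which case the `--no-cache-dir` reinstall above is still worth trying.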
I followed your installation protocol:
I then ran:
And got:
Best,
Orr