"llava is already used by a transformers config" #111
Comments
We have auto installers and a much-improved app, in case you decide to use or try xformers 0.0.24 and torch 2.2.0.
We don't have auto installers; you sell them. Please don't advertise. I already bought the subscription, and the one-click installer doesn't work. I installed the correct versions of xformers and torch, but now there's no output; the only sign of life is the cursor spinning for a second. Here's the cmd output:

C:\Users\Luke\Supir_Install\SUPIR>python gradio_demo.py

C:\Users\Luke\Supir_Install\SUPIR>python gradio_demo.py --use_tile_vae --no_llava --use_image_slider --loading_half_params

C:\Users\Luke\Supir_Install\SUPIR>

I also tried to access http://127.0.0.1:6688, since ChatGPT said that's the default IP and port it should be on, but no luck. Thanks for the help.
Alright, I redownloaded your build. It worked this time; I probably had a dependency wrong. I had troubleshot it for hours before resorting to following a Reddit guide. Somehow one click and it worked. Huh. Thanks for the help. I still don't like paywalling work on an open-source project.
I'm still facing this issue. Are there any open-source options?
Fixed in DMs.
This problem still exists. How do I solve it?

Wrong issue?
You need to downgrade the Transformers library to 4.31.0. Hugging Face has already added llava to the Transformers library. See haotian-liu/LLaVA#974 (comment)
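A quick way to reason about the pin suggested above: the collision only happens on Transformers releases that ship a built-in "llava" model type. A minimal sketch of that check, assuming the built-in config landed around 4.34 (the exact cutoff version is an assumption here, not something this thread confirms):

```python
# Hedged sketch: guess whether a given transformers version already ships a
# built-in "llava" model type, which would collide with SUPIR's bundled
# LLaVA package. The 4.34 cutoff is an assumption for illustration.
def has_builtin_llava(version: str) -> bool:
    major, minor = (int(x) for x in version.split(".")[:2])
    return (major, minor) >= (4, 34)

print(has_builtin_llava("4.31.0"))  # False: the recommended pin, no collision
print(has_builtin_llava("4.36.2"))  # True: collision expected
```

If the check returns True for your installed version, `pip install transformers==4.31.0` is the downgrade the comment above recommends.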
I am using xformers==0.0.26.post1, torch==2.3.0, torchvision==0.18.0, and torchaudio==2.3.0.
This is the error I'm getting. I've asked ChatGPT, but it has no idea.
Can anybody help?
C:\Users\Luke\Supir_Install\SUPIR>python gradio_demo.py
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
Traceback (most recent call last):
  File "C:\Users\Luke\Supir_Install\SUPIR\gradio_demo.py", line 11, in <module>
    from llava.llava_agent import LLavaAgent
  File "C:\Users\Luke\Supir_Install\SUPIR\llava\__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "C:\Users\Luke\Supir_Install\SUPIR\llava\model\__init__.py", line 1, in <module>
    from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
  File "C:\Users\Luke\Supir_Install\SUPIR\llava\model\language_model\llava_llama.py", line 139, in <module>
    AutoConfig.register("llava", LlavaConfig)
  File "C:\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 981, in register
    CONFIG_MAPPING.register(model_type, config, exist_ok=exist_ok)
  File "C:\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 680, in register
    raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
ValueError: 'llava' is already used by a Transformers config, pick another name.
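For anyone wondering why the traceback ends this way: Transformers keeps a global mapping from model-type strings to config classes, and `register()` refuses duplicate keys unless told otherwise. A toy stand-in for that behavior (the class and names here are illustrative, not the real Transformers internals):

```python
class ConfigRegistry:
    """Toy stand-in for transformers' CONFIG_MAPPING: one config per model type."""

    def __init__(self):
        self._mapping = {}

    def register(self, key, config, exist_ok=False):
        # Mirrors the error in the traceback: a second registration for the
        # same key raises unless exist_ok=True is passed.
        if key in self._mapping and not exist_ok:
            raise ValueError(
                f"'{key}' is already used by a Transformers config, pick another name."
            )
        self._mapping[key] = config


registry = ConfigRegistry()
registry.register("llava", dict)        # built-in registration at import time
try:
    registry.register("llava", dict)    # SUPIR's bundled LLaVA registers again
except ValueError as err:
    print(err)
```

So the crash is simply SUPIR's bundled LLaVA registering "llava" after a newer Transformers already claimed the name at import time, which is why downgrading to a release without the built-in config makes it go away.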