Failed to import transformers #31658
Comments
Hello! Interestingly enough, this looks linked to torch rather than transformers. Do you manage to do import torch? It seems like fbgemm.dll is missing from your torch install.
It seems like others have a similar issue as well: https://discuss.pytorch.org/t/failed-to-import-pytorch-fbgemm-dll-or-one-of-its-dependencies-is-missing/201969
Hello,
I have checked the pytorch folder; the file named fbgemm.dll is present.
What should I do next?
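Since the DLL file appears to be present, a small stdlib-only sketch can double-check which installation is actually being inspected (the helper name `locate_dll` is hypothetical, not part of torch or transformers):

```python
# Hedged diagnostic sketch: locate a DLL inside an installed package's
# lib folder using only the standard library. Returns the path if the
# file exists, otherwise None (also None when the package is missing).
import importlib.util
from pathlib import Path

def locate_dll(package: str, dll_name: str):
    spec = importlib.util.find_spec(package)
    if spec is None or not spec.submodule_search_locations:
        return None  # package not installed (or not a package)
    candidate = Path(spec.submodule_search_locations[0]) / "lib" / dll_name
    return candidate if candidate.is_file() else None

print(locate_dll("torch", "fbgemm.dll"))  # prints the path, or None
```

If this prints a path yet `import torch` still fails with WinError 126, the problem is usually a dependency of fbgemm.dll rather than the file itself.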
…On Thu, Jun 27, 2024 at 6:46 PM Lysandre Debut ***@***.***> wrote:
Hello! Interestingly enough, this looks linked to torch rather than
transformers. Do you manage to do import torch?
It seems like fbgemm.dll is missing from your torch install
Kindly revert back as soon as possible. I tried every possible solution from my end.
Hello @MaitriSavla2003, there is unfortunately nothing to revert from our end: no changes are linked to the issue you're facing. Your issue stems from your environment not being configured correctly at the PyTorch level, and is therefore not linked to transformers.
System Info
transformers version: 4.41.2
torch version: 2.3.1
pip version: 24.1.1
python version: 3.12.3
Error Thrown while trying to import transformers:
OSError Traceback (most recent call last)
Cell In[4], line 1
----> 1 import transformers
File c:\Users\Maitri\Desktop\Text-Summarizer\env\Lib\site-packages\transformers\__init__.py:26
23 from typing import TYPE_CHECKING
25 # Check the dependencies satisfy the minimal versions required.
---> 26 from . import dependency_versions_check
27 from .utils import (
28 OptionalDependencyNotAvailable,
29 _LazyModule,
(...)
48 logging,
49 )
52 logger = logging.get_logger(__name__)  # pylint: disable=invalid-name
File c:\Users\Maitri\Desktop\Text-Summarizer\env\Lib\site-packages\transformers\dependency_versions_check.py:16
1 # Copyright 2020 The HuggingFace Team. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
(...)
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
15 from .dependency_versions_table import deps
---> 16 from .utils.versions import require_version, require_version_core
19 # define which module versions we always want to check at run time
20 # (usually the ones defined in `install_requires` in setup.py)
21 #
22 # order specific notes:
23 # - tqdm must be checked before tokenizers
25 pkgs_to_check_at_runtime = [
26 "python",
27 "tqdm",
(...)
37 "pyyaml",
38 ]
File c:\Users\Maitri\Desktop\Text-Summarizer\env\Lib\site-packages\transformers\utils\__init__.py:33
24 from .constants import IMAGENET_DEFAULT_MEAN, IMAGENET_DEFAULT_STD, IMAGENET_STANDARD_MEAN, IMAGENET_STANDARD_STD
25 from .doc import (
26 add_code_sample_docstrings,
27 add_end_docstrings,
(...)
31 replace_return_docstrings,
32 )
---> 33 from .generic import (
34 ContextManagers,
35 ExplicitEnum,
36 ModelOutput,
37 PaddingStrategy,
38 TensorType,
39 add_model_info_to_auto_map,
40 cached_property,
41 can_return_loss,
42 expand_dims,
43 find_labels,
44 flatten_dict,
45 infer_framework,
46 is_jax_tensor,
47 is_numpy_array,
48 is_tensor,
49 is_tf_symbolic_tensor,
50 is_tf_tensor,
51 is_torch_device,
52 is_torch_dtype,
53 is_torch_tensor,
54 reshape,
55 squeeze,
56 strtobool,
57 tensor_size,
58 to_numpy,
59 to_py_obj,
60 transpose,
61 working_or_temp_dir,
62 )
63 from .hub import (
64 CLOUDFRONT_DISTRIB_PREFIX,
65 HF_MODULES_CACHE,
(...)
91 try_to_load_from_cache,
92 )
93 from .import_utils import (
94 ACCELERATE_MIN_VERSION,
95 ENV_VARS_TRUE_AND_AUTO_VALUES,
(...)
210 torch_only_method,
211 )
File c:\Users\Maitri\Desktop\Text-Summarizer\env\Lib\site-packages\transformers\utils\generic.py:461
457 return tuple(self[k] for k in self.keys())
460 if is_torch_available():
--> 461 import torch.utils._pytree as _torch_pytree
463 def _model_output_flatten(output: ModelOutput) -> Tuple[List[Any], "_torch_pytree.Context"]:
464 return list(output.values()), list(output.keys())
File c:\Users\Maitri\Desktop\Text-Summarizer\env\Lib\site-packages\torch\__init__.py:143
141 err = ctypes.WinError(ctypes.get_last_error())
142 err.strerror += f' Error loading "{dll}" or one of its dependencies.'
--> 143 raise err
145 kernel32.SetErrorMode(prev_error_mode)
148 def _preload_cuda_deps(lib_folder, lib_name):
OSError: [WinError 126] The specified module could not be found. Error loading "c:\Users\Maitri\Desktop\Text-Summarizer\env\Lib\site-packages\torch\lib\fbgemm.dll" or one of its dependencies.
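For context, WinError 126 often means that a *dependency* of the named DLL is missing rather than the DLL itself (typically an OpenMP or MSVC runtime on Windows). The sketch below is a hedged, stdlib-only helper; the `COMMON_DEPS` list is an illustrative assumption, not an authoritative dependency list for fbgemm.dll:

```python
# Hedged sketch: report which assumed runtime dependencies are absent
# from a set of directories. COMMON_DEPS is illustrative, not exhaustive.
import os

COMMON_DEPS = ("vcruntime140.dll", "msvcp140.dll", "libomp140.x86_64.dll")

def missing_deps(search_dirs, deps=COMMON_DEPS):
    """Return the names in deps not found in any directory in search_dirs."""
    found = set()
    for d in search_dirs:
        if os.path.isdir(d):
            found.update(name.lower() for name in os.listdir(d))
    return [dep for dep in deps if dep.lower() not in found]

# Example call on Windows: missing_deps([r"C:\Windows\System32"])
print(missing_deps([]))  # → all of COMMON_DEPS, since nothing was searched
```

Anything this reports missing from both System32 and torch's lib folder would point at the runtime to install, rather than at torch or transformers themselves.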
Who can help?
No response
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
import transformers raises the same OSError and traceback shown above (WinError 126 loading fbgemm.dll).
Expected behavior
I was trying to import AutoTokenizer from the transformers module and hit this error. To verify, I tried importing transformers directly and found that the library itself fails to import. I am working on my project in a virtual environment.
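To narrow such failures down, a hedged stdlib-only sketch like the following (`try_import` is a hypothetical helper) shows whether torch fails on its own, independent of transformers:

```python
# Hedged sketch: attempt each import separately and report the first
# error, so a torch-level DLL failure is not mistaken for a transformers bug.
import importlib

def try_import(name):
    """Attempt an import; return (ok, error string or None)."""
    try:
        importlib.import_module(name)
        return True, None
    except Exception as exc:  # the fbgemm.dll case raises OSError here
        return False, f"{type(exc).__name__}: {exc}"

for mod in ("torch", "transformers"):
    ok, err = try_import(mod)
    print(mod, "OK" if ok else err)
```

If torch alone fails with the same OSError, the fix belongs at the PyTorch/Windows-runtime level, as the maintainers note above.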