Undefined symbol: __nvJitLinkAddData_12_1, version libnvJitLink.so.12 #1700
This error is most likely because PyTorch was compiled for a later version of CUDA than the one installed.

WG agrees this is resolved.

Closing.
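To illustrate why the mismatch manifests as an undefined symbol: nvJitLink entry points are versioned per CUDA 12.x minor release, so a library built against CUDA 12.4 references `__nvJitLinkComplete_12_4`, which a `libnvJitLink.so.12` shipped with CUDA 12.1 does not export. A minimal sketch of that compatibility check (the helper functions here are illustrative, not part of any NVIDIA API):

```python
import re

def nvjitlink_minor_required(symbol: str) -> int:
    """Extract the CUDA 12.x minor version a versioned nvJitLink symbol needs,
    e.g. '__nvJitLinkComplete_12_4' -> 4 (requires libnvJitLink from CUDA >= 12.4)."""
    m = re.search(r"_12_(\d+)$", symbol)
    if m is None:
        raise ValueError(f"not a CUDA 12.x versioned symbol: {symbol}")
    return int(m.group(1))

def is_compatible(symbol: str, installed_minor: int) -> bool:
    """True if a libnvJitLink from CUDA 12.<installed_minor> exports the symbol."""
    return installed_minor >= nvjitlink_minor_required(symbol)

# The traceback's symbol needs CUDA >= 12.4; a CUDA 12.1 nvjitlink is too old:
print(nvjitlink_minor_required("__nvJitLinkComplete_12_4"))   # 4
print(is_compatible("__nvJitLinkComplete_12_4", 1))           # False
```

This is why installing a PyTorch wheel built for a newer CUDA 12.x on top of older `nvidia-*` wheels (or an older system CUDA) fails at `import torch`.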
Hi, I am facing the same issue with the gpt-j benchmark.

```
torch.distributed not initialized, assuming single world_size.
Quantized model exported to /mnt/models/GPTJ-6B/fp8-quantized-ammo/GPTJ-FP8-quantized
Total time used 14.93 s.
make: Leaving directory '/home/usr/CM/repos/local/cache/e6f880f23ece4993/repo/docker'
Traceback (most recent call last):
  File "/home/usr/CM/repos/local/cache/c69993974e204ca1/repo/closed/NVIDIA/code/gptj/tensorrt/onnx_tune.py", line 15, in <module>
    import torch
  File "/home/usr/Benchmark/cm/lib/python3.12/site-packages/torch/__init__.py", line 367, in <module>
    from torch._C import * # noqa: F403
    ^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home/usr/Benchmark/cm/lib/python3.12/site-packages/torch/lib/../../nvidia/cusparse/lib/libcusparse.so.12: undefined symbol: __nvJitLinkComplete_12_4, version libnvJitLink.so.12

CM error: Portable CM script failed (name = get-ml-model-gptj, return code = 256)
```

Occurred while running:

```
cm run script --tags=generate-run-cmds,inference --model=bert-99 --backend=pytorch --mode=performance --device=cuda --quiet --test_query_count=1000 --network=sut
```
Found the same issue reported at pytorch/pytorch#111469.
I have tried to export the env variable:

```
export LD_LIBRARY_PATH=$HOME/.local/lib/python3.12/site-packages/nvidia/nvjitlink:$LD_LIBRARY_PATH
```

but it did not work. Another method mentioned:
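For reference, one detail commonly pointed out with this workaround is that the shared object sits in the `lib` subdirectory of the nvjitlink package, not at the package root, so the export above may simply be pointing one level too high. A sketch with the corrected path (the `SITE_PACKAGES` location is an assumption; adjust it to wherever pip installed the `nvidia-*` wheels on your system):

```shell
# Hypothetical install location; adjust to your environment.
SITE_PACKAGES="$HOME/.local/lib/python3.12/site-packages"

# libnvJitLink.so.12 lives under nvidia/nvjitlink/lib, so point
# LD_LIBRARY_PATH at that 'lib' subdirectory, not the package root.
export LD_LIBRARY_PATH="$SITE_PACKAGES/nvidia/nvjitlink/lib:$LD_LIBRARY_PATH"

# Sanity check: the first search entry should now be the nvjitlink lib dir.
echo "$LD_LIBRARY_PATH" | cut -d: -f1
```

If adjusting the path does not help, the suggestions in the linked PyTorch issue amount to making the CUDA pieces consistent instead: upgrading the nvjitlink wheel (`pip install -U nvidia-nvjitlink-cu12`) or reinstalling a PyTorch wheel built for the CUDA version actually installed. These are workarounds, not an official fix.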