TVM should find the LLVM ld.lld file

Expected behavior

TVM should find the LLVM ld.lld file

Actual behavior

When running mlc_llm chat with JIT compilation enabled, TVM fails to find the LLVM installation, throwing:

RuntimeError: cannot find ld.lld, canditates are: ['ld.lld-17.0', 'ld.lld-17', 'ld.lld', '/opt/rocm/llvm/bin']

Environment

Testing in an MLC docker container with fresh installs of the nightly packages.

Steps to reproduce

Run mlc_llm chat HF://<model>. It downloads the model and compiles it, then crashes when saving the .so file.

Triage

Line 55 of https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/rocm.py forgets to join ld.lld (or whichever binary name it matched in the lines above) onto the /opt/rocm/llvm/bin path, so the last candidate is a bare directory. The check at https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/utils.py#L253 uses os.path.isfile, which returns False for directories, so the lookup returns None and the RuntimeError above is raised.
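To make the triage concrete, here is a minimal, self-contained sketch of the failure and the likely fix. The helper names (candidate_paths, check_candidate) are hypothetical, not the actual code in contrib/rocm.py or contrib/utils.py, and the sketch only does the isfile check rather than a full PATH search, but the pattern matches the analysis above: appending the bare /opt/rocm/llvm/bin directory can never pass an os.path.isfile check, while joining the binary name onto it can.

```python
import os

ROCM_LLVM_BIN = "/opt/rocm/llvm/bin"


def candidate_paths(major=17):
    """Build ld.lld candidates, mirroring the list in the error message."""
    names = [f"ld.lld-{major}.0", f"ld.lld-{major}", "ld.lld"]
    # Buggy behaviour described above: the directory itself is appended ...
    buggy = names + [ROCM_LLVM_BIN]
    # ... while the fix is to join each binary name onto the directory,
    # so the isfile-based lookup has a real file path to match.
    fixed = names + [os.path.join(ROCM_LLVM_BIN, name) for name in names]
    return buggy, fixed


def check_candidate(path):
    """Stand-in for the utils.py check: a bare directory yields None."""
    return path if os.path.isfile(path) else None


if __name__ == "__main__":
    buggy, fixed = candidate_paths()
    print("buggy :", [p for p in buggy if check_candidate(p)])
    print("fixed :", [p for p in fixed if check_candidate(p)])
```

On a machine where ld.lld exists only as /opt/rocm/llvm/bin/ld.lld, the buggy list matches nothing (reproducing the RuntimeError above), while the fixed list resolves to /opt/rocm/llvm/bin/ld.lld.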
should be fixed upstream