
Encountered "ERROR: The NVIDIA Driver is present, but CUDA failed to initialize. GPU functionality will not be available." error after upgrading CUDA Toolkit to 12.5 #7379

Open
jackylu0124 opened this issue Jun 26, 2024 · 0 comments

jackylu0124 commented Jun 26, 2024

Description
Previously I had CUDA Toolkit 12.1 on my Windows machine, and I was able to run the nvcr.io/nvidia/tritonserver:24.04-py3 Docker container with no issues at all. Today I uninstalled CUDA Toolkit 12.1 and installed the latest CUDA Toolkit 12.5, and when I tried to run the nvcr.io/nvidia/tritonserver:24.04-py3 Docker container again, the inference server logged the following:

ERROR: The NVIDIA Driver is present, but CUDA failed to initialize.  GPU functionality will not be available.
   [[ Named symbol not found (error 500) ]]
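
For context, the container needs GPU access enabled in Docker; a minimal sketch of how such a container is typically launched is shown below. The model repository path, port mappings, and flags are illustrative placeholders, not the exact command from this report:

docker run --rm --gpus all \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:24.04-py3 \
  tritonserver --model-repository=/models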

Triton Information
nvcr.io/nvidia/tritonserver:24.04-py3

Are you using the Triton container or did you build it yourself?
I am using the nvcr.io/nvidia/tritonserver:24.04-py3 container.

Expected behavior
I should be able to run the inference server container with GPU support, as I could before upgrading the CUDA Toolkit.

Issue Reproduction
The following is the output of nvcc --version and nvidia-smi run on my host machine:

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2024 NVIDIA Corporation
Built on Wed_Apr_17_19:36:51_Pacific_Daylight_Time_2024
Cuda compilation tools, release 12.5, V12.5.40
Build cuda_12.5.r12.5/compiler.34177558_0
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 555.99                 Driver Version: 555.99         CUDA Version: 12.5     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                  Driver-Model | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 3090      WDDM  |   00000000:02:00.0  On |                  N/A |
| 30%   36C    P8             30W /  350W |    1731MiB /  24576MiB |      1%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+

The following is the output of nvcc --version and nvidia-smi run from inside the Docker container:

root@d34d08a6ce19:/opt/tritonserver# nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2024 NVIDIA Corporation
Built on Thu_Mar_28_02:18:24_PDT_2024
Cuda compilation tools, release 12.4, V12.4.131
Build cuda_12.4.r12.4/compiler.34097967_0
root@d34d08a6ce19:/opt/tritonserver# nvidia-smi
Wed Jun 26 22:30:47 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 555.52.01              Driver Version: 555.99         CUDA Version: 12.5     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 3090        On  |   00000000:02:00.0  On |                  N/A |
| 30%   37C    P8             30W /  350W |    1728MiB /  24576MiB |      1%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
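
To check whether the CUDA runtime inside the container can initialize at all, independently of Triton, a small probe like the one below can be compiled with the container's nvcc. This is a generic diagnostic sketch, not output captured from this setup, and check_cuda.cu is a hypothetical file name:

cat > check_cuda.cu <<'EOF'
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int driverVersion = 0, runtimeVersion = 0, deviceCount = 0;

    // CUDA version supported by the driver stack visible to this process.
    cudaError_t err = cudaDriverGetVersion(&driverVersion);
    printf("cudaDriverGetVersion : %s (CUDA %d)\n", cudaGetErrorString(err), driverVersion);

    // CUDA runtime version this binary was built against.
    err = cudaRuntimeGetVersion(&runtimeVersion);
    printf("cudaRuntimeGetVersion: %s (CUDA %d)\n", cudaGetErrorString(err), runtimeVersion);

    // Forces CUDA initialization; a failure here prints the underlying error string.
    err = cudaGetDeviceCount(&deviceCount);
    printf("cudaGetDeviceCount   : %s (%d device(s))\n", cudaGetErrorString(err), deviceCount);
    return 0;
}
EOF
nvcc check_cuda.cu -o check_cuda && ./check_cuda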

Thanks for your help in advance!
