
ComfyUI runs out of RAM when it should be using GPU #72

Open
edurenye opened this issue Dec 13, 2024 · 3 comments
@edurenye
Firstly, thank you very much for creating these images.

I'm facing this issue while using the "yanwk/comfyui-boot:cu121" Docker image with Docker Compose.
Running 'nvidia-smi' inside the container shows that the GPU is visible, and I used the first workflow ("Flux Dev") from https://comfyanonymous.github.io/ComfyUI_examples/flux/.

At first it seems to work: all models load and some VRAM is used, but GPU usage only reaches about 30% and less than half of the VRAM is ever used. Meanwhile it consumes all of my system RAM and breaks when it runs the SamplerCustomAdvanced node.

I have 34 GB of RAM and two GPUs: the system uses one, and the other, with 24 GB of VRAM, is the one used by the ComfyUI Docker image.

To me, it looks like an issue with this image, since it should be using the VRAM rather than the RAM, but maybe I am doing something wrong.
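For reference, the setup described above (Docker Compose with the second GPU dedicated to the container) can be pinned explicitly with a device reservation. A minimal sketch, where the service name, port mapping, and device index "1" are assumptions, not taken from this thread:

```yaml
services:
  comfyui:
    image: yanwk/comfyui-boot:cu121
    ports:
      - "8188:8188"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["1"]   # assumed index of the non-display GPU
              capabilities: [gpu]
```

Note that this only controls which GPU the container sees; it does not limit how much system RAM ComfyUI uses while loading models.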

@YanWenKun
Owner

I can't exactly reproduce your issue, but I suspect it's related to the UNET workflow: when loading models, it consumes all of my 32 GB of RAM as well, then frees some RAM, starts consuming 12 GB of VRAM, and finally reaches 100% GPU usage.

The checkpoint version of the workflow runs fine for me. So my suggestions are:

  1. Use the checkpoint workflow.

  2. Use the schnell version if you prefer UNET workflow.

Some extra ideas (less likely to help with this issue):

  3. Try cu124-megapak.

  4. Update your GPU driver and the NVIDIA Container Toolkit.

@edurenye
Author

Thank you very much for your answer @YanWenKun !
I'll try these options. Regarding the driver update, I use Ubuntu 24.04 and already have the latest available versions of the drivers. As for "cu124-megapak", I think it is too bulky; it would be nice to have a slim version of cu124.

I've read that fp8 and schnell reduce the quality of the generated images compared to the original "Flux Dev", so I might end up buying more RAM if that is the case.

@edurenye
Author

In the end, the only solution was to increase the RAM; it seems the models are first loaded into RAM and from there transferred to the GPU. It's no longer a problem for me, but it could be a problem for others.
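The RAM pressure described above is consistent with the model size. Flux Dev has roughly 12 billion parameters (a publicly stated figure, not from this thread), so just staging the fp16 weights in system RAM before they move to VRAM takes a large fraction of a 32 GB machine. A back-of-envelope sketch:

```python
# Rough RAM needed to hold Flux Dev weights in fp16 before the
# transfer to VRAM. The ~12B parameter count is an assumption
# based on public model information, not on this thread.
params = 12_000_000_000   # approximate parameter count
bytes_per_param = 2       # fp16 = 2 bytes per weight
gib = params * bytes_per_param / 2**30
print(f"~{gib:.1f} GiB just for the weights")  # ~22.4 GiB
```

On top of the weights, the text encoders, VAE, and working buffers also need memory, which explains why 32 GB of RAM can be exhausted during loading.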
