What are good configs for training UNet3DConditionModel on 8 GB VRAM? (64x64x64 inputs) #1818
-
More specifically, for this project I'm looking to use HuggingFace's UNet3DConditionModel on my home PC with an RTX 3060 Ti for voxel generation of small (64x64x64) models, but I'm struggling to find a setup that fits within my GPU's VRAM. I doubt the model will be very effective given the space limitations, so I'll probably swap to another model later, but I'd like to try out a setup with the UNet3DConditionModel first. Any recommendations that don't require a more powerful computer?
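For scale, here's a rough, untested sketch of the kind of scaled-down setup I have in mind (every size below is a guess on my part, not a known-good config):

```python
import torch
from diffusers import UNet3DConditionModel

# Much smaller than the defaults (block_out_channels defaults to
# (320, 640, 1280, 1280)); all of these sizes are guesses.
unet = UNet3DConditionModel(
    sample_size=64,
    in_channels=1,                      # e.g. a single occupancy channel for voxels
    out_channels=1,
    down_block_types=("CrossAttnDownBlock3D", "CrossAttnDownBlock3D", "DownBlock3D"),
    up_block_types=("UpBlock3D", "CrossAttnUpBlock3D", "CrossAttnUpBlock3D"),
    block_out_channels=(64, 128, 256),
    layers_per_block=1,
    cross_attention_dim=256,
    attention_head_dim=8,
)
unet.enable_gradient_checkpointing()    # trade compute for memory
unet.to("cuda")

# The model expects (batch, channels, frames, height, width); I'd feed the
# third voxel axis as the "frames" dimension.
sample = torch.randn(1, 1, 64, 64, 64, device="cuda")
timestep = torch.tensor([10], device="cuda")
cond = torch.randn(1, 1, 256, device="cuda")   # dummy conditioning input

with torch.autocast("cuda", dtype=torch.float16):
    out = unet(sample, timestep, encoder_hidden_states=cond).sample
```

Would something along these lines be a sensible starting point, or is there a better way to fit this in 8 GB?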
-
I think there is some confusion. PEFT is a library that can help reduce the amount of memory needed when training models. Your question sounds like it aims at running the model; PEFT cannot help you with that. I don't know what packages you use for running your model, whether you use one of the UIs or diffusers. In case of a UI like automatic1111, search their options for VRAM saving settings like …
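If you run with diffusers instead of a UI, the usual VRAM savers live on the pipeline object. A minimal sketch (the model id here is just a placeholder):

```python
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "some/model-id",                 # placeholder, not a real checkpoint
    torch_dtype=torch.float16,       # halves weight/activation memory
)
pipe.enable_attention_slicing()      # compute attention in smaller slices
pipe.enable_model_cpu_offload()      # keep idle submodules in CPU RAM (needs accelerate)
```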
Okay, so I did a count on the number of parameters per layer type on this model:
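A small sketch of how such a per-layer-type count can be produced for any torch model:

```python
from collections import Counter
import torch.nn as nn

def count_params_by_type(model: nn.Module) -> Counter:
    """Total parameter count per leaf-module class name."""
    counts = Counter()
    for module in model.modules():
        # Only count leaf modules so parent containers don't double-count.
        if next(module.children(), None) is None:
            n = sum(p.numel() for p in module.parameters())
            if n:
                counts[type(module).__name__] += n
    return counts

# e.g. count_params_by_type(unet).most_common()
# -> [('Conv2d', ...), ('Linear', ...), ('Conv3d', ...), ...]
```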
So most of the parameters are on `Conv2d` and `Linear`, not on `Conv3d` (which is not supported), so in theory, using LoRA could be helpful.

This is a big problem. PEFT is intended for fine-tuning, i.e. taking a pretrained model and adapting it to your specific problem. You will almost certainly not succeed when training from scratch. I imagine y…
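To illustrate the LoRA point above, a minimal sketch of wrapping the model with PEFT, targeting only the attention projections (`Linear` layers); the `target_modules` pattern and rank are illustrative assumptions, not a tested recipe:

```python
from peft import LoraConfig, get_peft_model

# Assumes `unet` is the UNet3DConditionModel instance; the regex below
# matches the attention projection Linears by name and is only an example.
config = LoraConfig(
    r=8,
    lora_alpha=8,
    target_modules=r".*(to_q|to_k|to_v|to_out\.0)$",
)
peft_unet = get_peft_model(unet, config)
peft_unet.print_trainable_parameters()
```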