Need more information/documentation around what LoRA-specific hyperparameters mean #1174
Unanswered · nayan-dhabarde asked this question in Q&A · 1 reply
-
Indeed, choosing the right hyperparameters when using LoRA is a very important topic. Another hyperparameter you can use is the weights-initialization strategy introduced by #1189. We found out that when using …
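To make the reply concrete, here is a hedged sketch of where these knobs live, assuming the Hugging Face peft library (this is the repo the discussion belongs to); the exact `init_lora_weights` value for the strategy from #1189 is an assumption, so check the current peft docs before copying it:

```python
# Config sketch, assuming Hugging Face peft. The option names `r`,
# `lora_alpha`, and `init_lora_weights` exist in peft's LoraConfig;
# the specific value "gaussian" is an assumption for illustration.
from peft import LoraConfig

config = LoraConfig(
    r=16,                          # rank of the low-rank update matrices
    lora_alpha=27,                 # scaling; update is scaled by lora_alpha / r
    init_lora_weights="gaussian",  # weights-initialization strategy (assumed value)
)
```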
-
I have used DreamBooth for training in the past and that worked really well for me.
It looks like the same hyperparameter values won't work in the case of LoRA.
Additionally, I don't know what these parameters do:
--lora_r 16
--lora_alpha 27
--lora_text_encoder_r 16
--lora_text_encoder_alpha 17
Can someone help me understand how increasing or decreasing these will affect the results?
Also, when training for faces, what are the best learning rate, LR schedule, and number of steps?
I am using 2000 steps with 2000 regularization images and 10 input images.
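For anyone landing here with the same question, a minimal sketch (not the actual peft implementation) of what `--lora_r` and `--lora_alpha` control: LoRA replaces a full weight update with a low-rank one, `ΔW = (alpha / r) * B @ A`, where `A` is `r × d_in` and `B` is `d_out × r`. So `r` sets the capacity of the adapter and `alpha` sets how strongly the update is scaled; the `*_text_encoder_*` flags apply the same two knobs to the text encoder instead of the UNet. The helper names below are hypothetical:

```python
# Toy illustration of the LoRA update rule: delta_W = (alpha / r) * (B @ A).
# Pure Python, no dependencies; `matmul` and `lora_delta` are hypothetical
# helper names, not peft APIs.

def matmul(A, B):
    """Multiply two matrices given as lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def lora_delta(a, b, alpha, r):
    """Return the scaled low-rank update (alpha / r) * (B @ A).

    a: the r x d_in matrix, b: the d_out x r matrix.
    """
    scale = alpha / r
    delta = matmul(b, a)
    return [[scale * x for x in row] for row in delta]

# With r=2 and alpha=4, the raw low-rank product is scaled by 4 / 2 = 2.
a = [[1, 0], [0, 1]]          # r x d_in
b = [[1, 2], [3, 4]]          # d_out x r
print(lora_delta(a, b, alpha=4, r=2))  # [[2.0, 4.0], [6.0, 8.0]]
```

In this toy setup, doubling `alpha` doubles the effective update, while raising `r` adds capacity but shrinks the per-rank scale (which is why people often move `alpha` together with `r`).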