Hey, I recently tried to use PiSSA init during the LoRA finetuning process, but I noticed that PiSSA initialization takes a very long time.
I set all linear layers as my LoRA target modules, with a LoRA rank of 32. I used "pissa_niter_4" as the init_lora_weights argument, but it still takes about 20-30 minutes. Are there any solutions for this? Many thanks!
Hey, thanks for the reply. I used DeepSeek-V1.5 for the finetuning. One thing I observed: if I set 'bias' to 'lora_only' when using PiSSA, the init process is slow, but if I set it to 'none', PiSSA initialization is much faster. BTW, I use a LoRA rank of 32.
Same problem. It was quite slow even when I was finetuning RoBERTa.
I guess it's related to the CPU? This happens on my Intel(R) Xeon(R) Platinum 8268 CPU @ 2.90GHz machine, but it is much faster on the Intel(R) Xeon(R) Platinum 8458P machine.