DOC FIX Comment about init of LoRA Embedding (#1855)
Fixes #1728
BenjaminBossan committed Jun 13, 2024
1 parent 1946135 commit d608f83
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion src/peft/tuners/lora/layer.py
@@ -148,7 +148,8 @@ def reset_lora_parameters(self, adapter_name, init_lora_weights):
                 raise ValueError(f"Unknown initialization {init_lora_weights=}")
             nn.init.zeros_(self.lora_B[adapter_name].weight)
         if adapter_name in self.lora_embedding_A.keys():
-            # initialize a the same way as the default for nn.linear and b to zero
+            # Initialize A to zeros and B the same way as the default for nn.Embedding, see:
+            # https://github.com/microsoft/LoRA/blob/4c0333854cb905966f8cc4e9a74068c1e507c7b7/loralib/layers.py#L59-L60
             nn.init.zeros_(self.lora_embedding_A[adapter_name])
             nn.init.normal_(self.lora_embedding_B[adapter_name])
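
For context, a minimal sketch (not part of the commit) of why the embedding adapter zero-initializes A and normal-initializes B, the reverse of the Linear case where A gets a random init and B is zeroed. The shapes and variable names below are illustrative, not the exact PEFT internals.

import torch
import torch.nn as nn

# Illustrative shapes; hypothetical, not taken from PEFT.
num_embeddings, embedding_dim, r = 10, 8, 4

# LoRA factors for an Embedding layer, following the scheme in the fixed comment:
# A is zero-initialized, B uses the same default init as nn.Embedding (normal).
lora_embedding_A = nn.Parameter(torch.empty(r, num_embeddings))
lora_embedding_B = nn.Parameter(torch.empty(embedding_dim, r))
nn.init.zeros_(lora_embedding_A)
nn.init.normal_(lora_embedding_B)

# The low-rank update to the embedding weight is formed from B @ A. Because A
# starts at zero, the product is zero, so the adapted layer initially behaves
# exactly like the base embedding; training then moves A away from zero.
delta_weight = (lora_embedding_B @ lora_embedding_A).T  # (num_embeddings, embedding_dim)
print(torch.count_nonzero(delta_weight))  # tensor(0)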

