fixed mismatch between mask and batch dimensions (#6)
l-k-11235 authored Jun 6, 2024
1 parent 6d1e1be commit 6a0b111
Showing 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions eole/utils/loss.py
```diff
@@ -251,9 +251,8 @@ def ignore_prompt(self, batch):
             batch: The current batch.
         """
         # Create a mask with zeros at prompt positions and ones at answer positions.
-        mask = batch["src"].squeeze(dim=2) == self.padding_idx
+        mask = batch["src"].squeeze(dim=-1) == self.padding_idx
         mask = torch.cumsum(mask.int(), 1)
-        mask = mask.unsqueeze(-1)
         # Apply the mask on the target side.
         batch["tgt"] *= mask.int()
         # Put the padding token index at the prompt positions.
```
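To make the one-line change concrete, here is a minimal runnable sketch of the mask construction, assuming 2-D `[batch, len]` token tensors and a hypothetical padding index of 1 (neither the shapes nor the ids are taken from the commit):

```python
import torch

# Illustrative stand-ins, not values from the commit.
padding_idx = 1

# src holds the right-padded prompt; tgt holds prompt followed by answer.
src = torch.tensor([[5, 6, 7, 1, 1],
                    [8, 9, 1, 1, 1]])  # shape [batch=2, len=5]
tgt = torch.tensor([[5, 6, 7, 2, 3],
                    [8, 9, 4, 2, 3]])  # same shape as src

# The old squeeze(dim=2) assumes a trailing feature dimension and raises
# IndexError on a 2-D src; squeeze(dim=-1) is a no-op on 2-D input and
# still drops a trailing size-1 dimension from a [batch, len, 1] src.
mask = src.squeeze(dim=-1) == padding_idx

# cumsum leaves 0 at prompt positions and >= 1 from the first source
# padding position onward, i.e. over the answer span.
mask = torch.cumsum(mask.int(), 1)

# With the unsqueeze(-1) removed, the mask stays [batch, len] and lines
# up with a 2-D tgt. Binarized here for a readable printout; the diff
# itself applies the mask as shown in the hunk above.
print(tgt * (mask > 0).int())
# tensor([[0, 0, 0, 2, 3],
#         [0, 0, 4, 2, 3]])
```

The upshot: `squeeze(dim=-1)` tolerates sources with or without a trailing feature dimension, and dropping `unsqueeze(-1)` keeps the mask's dimensions aligned with the target batch, which appears to be the mismatch the commit title refers to.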
