-
You could use the exponential of the MLM loss. You might also need to handle the cases where the BERT tokenizer splits a word into two or more subword tokens; the per-token scores should be re-combined for these words.
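A minimal sketch of that idea with Hugging Face `transformers` (the function names here are my own, not from any existing library or this repo): mask each token in turn, record the model's negative log-likelihood at the masked position, sum the NLLs of `##`-continuation pieces back onto their word, and exponentiate the average NLL to get a pseudo-perplexity.

```python
import math

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer


def token_nlls(sentence, model, tokenizer):
    """Mask each position in turn and record its negative log-likelihood."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    tokens, nlls = [], []
    for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits
        logp = torch.log_softmax(logits[0, i], dim=-1)
        tokens.append(tokenizer.convert_ids_to_tokens(ids[i].item()))
        nlls.append(-logp[ids[i]].item())
    return tokens, nlls


def merge_wordpieces(tokens, nlls):
    """Re-combine '##' continuation pieces so each word carries one NLL.

    Log-probabilities of the pieces add, so their NLLs add too.
    """
    words, scores = [], []
    for tok, nll in zip(tokens, nlls):
        if tok.startswith("##") and words:
            words[-1] += tok[2:]
            scores[-1] += nll
        else:
            words.append(tok)
            scores.append(nll)
    return words, scores


def pseudo_perplexity(sentence, model, tokenizer):
    """Exponential of the average masked-LM NLL over the sentence."""
    _, nlls = token_nlls(sentence, model, tokenizer)
    return math.exp(sum(nlls) / len(nlls))
```

Usage would look like `pseudo_perplexity("hello world", AutoModelForMaskedLM.from_pretrained("bert-base-uncased"), AutoTokenizer.from_pretrained("bert-base-uncased"))`. Note this masks one token per forward pass, so it costs one pass per token; it is a score for comparing sentences, not a true autoregressive perplexity.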
-
I have been trying to calculate the "score" of a sentence using BERT. Is this done here?