Bias Correction #90
Hello, the original TF implementation of BERT was pre-trained with bias correction set to False in the Adam optimizer. Was this also the case for how AraBERT was pre-trained? Thanks.
Answered by WissamAntoun, Apr 1, 2021
I guess yes; we also didn't notice that bias correction was not applied, as discussed in google-research/bert#153.
The thing is, we used the LAMB optimizer from ALBERT, which also doesn't apply bias correction: https://github.com/aub-mind/arabert/blob/master/arabert/lamb_optimizer.py#L98
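
For anyone wondering what this means in practice, here is a minimal NumPy sketch (not code from this repo) contrasting a standard Adam update with the BERT/LAMB-style update that skips bias correction. The function name `adam_step` and the default hyperparameters are illustrative assumptions, not values taken from AraBERT's training config.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-6, bias_correction=True):
    """One Adam update on a single parameter tensor (illustrative sketch).

    With bias_correction=False this mirrors the behavior of BERT's
    AdamWeightDecayOptimizer and the LAMB variant linked above, which
    use the raw moment estimates m and v directly.
    """
    m = beta1 * m + (1.0 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2   # second moment estimate
    if bias_correction:
        # Standard Adam divides out the initialization bias toward zero;
        # this mostly matters during the first few thousand steps (small t).
        m_hat = m / (1.0 - beta1 ** t)
        v_hat = v / (1.0 - beta2 ** t)
    else:
        # BERT/LAMB-style: use the biased estimates as-is; the learning-rate
        # warmup schedule is assumed to compensate for the early-step bias.
        m_hat, v_hat = m, v
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Since both variants converge to the same update as `t` grows, the difference only shows up early in training, which is presumably why it went unnoticed in the original BERT code.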