How to set a layer like LayerNorm to fp32 while fp16 is enabled? #5678
forrestjgq asked this question in Q&A
To get better precision, I sometimes have to force certain layers, such as LayerNorm, to run in fp32 while the other layers run in fp16. In PyTorch training I can do this with torch.cuda.amp.autocast and by casting the layer's input to fp32, but how can I achieve the same thing in DeepSpeed?