I'm looking at the FA3 code that has been committed to TE, and it seems that only FP8 and FP16 are supported, since those are the only two data types I see in the test file. Does it support BF16 mixed-precision training now?
Hi @Desperadoze, we have added support for FA3 in FP16, BF16 and FP8 as of now. Please give 1.12 a try and let us know if there are any problems. FA3 FP16/BF16 has limited features compared to FA2, but for the features it does support, the tests in tests/pytorch/fused_attn/test_fused_attn.py should be able to pick it up and run with FA3. Thanks.
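For reference, here is a minimal sketch (not from this thread) of exercising TE's attention module in BF16, so the backend selection can be checked on a given install. It assumes Transformer Engine >= 1.12 with a compatible flash-attn build on a supported GPU; the exact backend chosen (FA3, FA2, or fused/unfused attention) is decided internally, and the `NVTE_DEBUG`/`NVTE_DEBUG_LEVEL` environment variables mentioned here are an assumption based on TE's debugging docs and may differ by version.

```python
# Sketch: run te.DotProductAttention with BF16 inputs and inspect which backend is used.
import os
import torch
import transformer_engine.pytorch as te

# Assumed debug flags to log attention-backend selection; verify against your TE version's docs.
os.environ.setdefault("NVTE_DEBUG", "1")
os.environ.setdefault("NVTE_DEBUG_LEVEL", "2")

seq_len, batch, heads, head_dim = 512, 2, 16, 64
dtype = torch.bfloat16

# kv_channels is the per-head dimension.
attn = te.DotProductAttention(num_attention_heads=heads, kv_channels=head_dim).cuda()

# Default qkv_format is "sbhd": [seq, batch, heads, head_dim].
q = torch.randn(seq_len, batch, heads, head_dim, dtype=dtype, device="cuda", requires_grad=True)
k = torch.randn_like(q)
v = torch.randn_like(q)

out = attn(q, k, v)
out.sum().backward()  # BF16 forward and backward pass through the selected attention backend
print(out.shape, out.dtype)
```

Running the BF16 parametrizations in tests/pytorch/fused_attn/test_fused_attn.py is the more thorough check, since those tests compare backends against each other for the configurations FA3 supports.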