Use BF16 on HPU by default #361

Merged
1 commit merged into habana_main from private/kzawora/hpu_bf16_default on Oct 7, 2024

Conversation

kzawora-intel commented:

We don't officially support FP16 on HPU, and for the most part we use BF16 wherever we can. This removes the need to specify --dtype bfloat16: when dtype is not provided (i.e. auto) and the model's default data type is float16, we cast it to bfloat16 on HPU.
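A minimal sketch of the described behavior, not the actual patch; `resolve_hpu_dtype` is a hypothetical helper name, and the mapping of explicit dtype strings is an assumption for illustration:

```python
import torch

def resolve_hpu_dtype(config_dtype: str, model_default_dtype: torch.dtype) -> torch.dtype:
    """Hypothetical helper illustrating this PR's dtype resolution on HPU.

    When the user passes --dtype auto and the model checkpoint defaults
    to float16, fall back to bfloat16, since FP16 is not officially
    supported on HPU. An explicitly requested dtype is left untouched.
    """
    if config_dtype == "auto":
        if model_default_dtype == torch.float16:
            # FP16 is not officially supported on HPU; cast to BF16.
            return torch.bfloat16
        return model_default_dtype
    # An explicit --dtype always wins (assumed string-to-dtype mapping).
    return {
        "float16": torch.float16,
        "bfloat16": torch.bfloat16,
        "float32": torch.float32,
    }[config_dtype]

print(resolve_hpu_dtype("auto", torch.float16))     # torch.bfloat16
print(resolve_hpu_dtype("float16", torch.float16))  # torch.float16 (explicit)
```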

michalkuligowski merged commit e00750e into habana_main on Oct 7, 2024
19 checks passed
kzawora-intel deleted the private/kzawora/hpu_bf16_default branch on October 7, 2024 at 12:51
kzawora-intel added the habana label (Issues or PRs submitted by Habana Labs) on Nov 8, 2024
Labels: habana (Issues or PRs submitted by Habana Labs)
3 participants