
Conversation

@kzawora-intel

We don't officially support FP16, and for the most part we use BF16 wherever we can. This removes the need to specify `--dtype bfloat16`: when no dtype is provided (i.e. it is `auto`) and the model's default data type is float16, we cast it to bfloat16 for HPU.
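The fallback described above can be sketched roughly as follows. This is an illustrative sketch, not vLLM's actual implementation; the function name `resolve_dtype` and its signature are hypothetical, and dtypes are shown as strings for brevity:

```python
def resolve_dtype(config_dtype: str, requested: str = "auto") -> str:
    """Hypothetical sketch of the HPU dtype fallback described above.

    config_dtype: the model checkpoint's default data type.
    requested:    the value of --dtype ("auto" when not provided).
    """
    if requested != "auto":
        # An explicit --dtype always wins.
        return requested
    if config_dtype == "float16":
        # FP16 is not officially supported on HPU; prefer BF16.
        return "bfloat16"
    return config_dtype
```

With this in place, a model whose config defaults to float16 runs in bfloat16 on HPU without any extra flag, while an explicit `--dtype float16` is still honored.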

@michalkuligowski michalkuligowski merged commit e00750e into habana_main Oct 7, 2024
19 checks passed
@kzawora-intel kzawora-intel deleted the private/kzawora/hpu_bf16_default branch October 7, 2024 12:51
@kzawora-intel kzawora-intel added the habana Issues or PRs submitted by Habana Labs label Nov 8, 2024
