@kzawora-intel commented on Jan 29, 2025

There's no reason for the current attention head size restrictions; we can theoretically support any size with the current implementations. This patch fixes that.

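For context, the sketch below shows the kind of up-front head-size allowlist check that a patch like this relaxes. The class name, allowlist values, and error message are illustrative assumptions, not the exact code in the habana_main tree.

```python
# Minimal sketch, assuming a hypothetical attention backend class and
# head-size allowlist -- not the exact code changed by this PR.


class HPUAttentionImpl:
    # Hypothetical allowlist of the kind this patch removes.
    SUPPORTED_HEAD_SIZES = [64, 80, 96, 112, 128, 192, 256]

    def __init__(self, head_size: int) -> None:
        # Before: head sizes outside the allowlist were rejected up front,
        # even though the underlying kernels can handle arbitrary sizes:
        #
        #     if head_size not in self.SUPPORTED_HEAD_SIZES:
        #         raise ValueError(
        #             f"Head size {head_size} is not supported. "
        #             f"Supported head sizes: {self.SUPPORTED_HEAD_SIZES}")
        #
        # After: the check is dropped and any head size is accepted.
        self.head_size = head_size


# Usage: a head size an allowlist would previously have rejected
# (e.g. 72) now constructs without error.
impl = HPUAttentionImpl(head_size=72)
```
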
@afierka-intel left a comment

LGTM

@kzawora-intel merged commit 2d152ed into habana_main on Jan 29, 2025
26 checks passed