
Conversation

@zhanglirong1999
Contributor

PR Category

Operator Mechanism

PR Types

Bug fixes

Description

We found that the prelu kernel fails in bf16. In bf16, many inputs such as filter, bias, scale, and alpha do not require quantization, but the prelu kernel's 'alpha' input takes the data type of x as its own data type, which fails when x is bf16.
Following the clip, fc, and layernorm kernels, this kind of input can use fp32 as its data type directly.
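Below is a minimal, illustrative sketch of the idea behind the fix, written against the public oneDNN C++ API rather than the actual Paddle kernel code; the helper name `make_prelu_alpha_md` is hypothetical. Instead of deriving the alpha memory descriptor's data type from x (which becomes bf16 and breaks primitive creation), the alpha descriptor is fixed to fp32, mirroring how clip, fc, and layernorm treat such auxiliary inputs.

```cpp
// Hypothetical sketch; not the actual PaddlePaddle OneDNN kernel code.
#include <oneapi/dnnl/dnnl.hpp>

using dnnl::memory;

// Before the fix (conceptually): the alpha descriptor reused the data type of
// x, so a bf16 `x` made alpha bf16 as well and prelu primitive creation failed.
// After the fix: alpha is always described as fp32, regardless of x's type.
inline memory::desc make_prelu_alpha_md(const memory::dims& alpha_dims) {
  return memory::desc(alpha_dims,
                      memory::data_type::f32,    // force fp32 for alpha
                      memory::format_tag::any);  // let oneDNN choose the layout
}
```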

@paddle-bot

paddle-bot bot commented Aug 6, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI result first. See the Paddle CI Manual for details.

@zhanglirong1999 changed the title from "[OneDNN] Fix Prelu kernel failed in bf16 since data type" to "[OneDNN] Fix Prelu kernel failed in bf16 since data type error" on Aug 6, 2024
@paddle-bot bot added the contributor (External developers) label on Aug 6, 2024
Contributor

@LLee233 left a comment


LGTM

@zhanglirong1999 merged commit 7f828ab into PaddlePaddle:develop on Aug 7, 2024

Labels

contributor External developers Intel
