Support initializing specific grad tensors to zero for selected operators #39963
Conversation
Thanks for your contribution!

Sorry to inform you that 753798e's CIs have passed for more than 7 days. To prevent PR conflicts, you need to re-run all CIs manually.
LGTM
PR types
New features
PR changes
Others
Describe
Take split_op as an example: its grad operation concatenates all of the grad inputs into one tensor. If any of the grad inputs is uninitialized (for instance, because the corresponding forward output was never used), we initialize it with zeros by default before the concat.
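The behavior described above can be sketched in NumPy. This is a minimal illustration, not PaddlePaddle's actual implementation: `concat_grads_with_zero_fill`, its parameters, and the use of `None` to mark an uninitialized grad are all assumptions made for the example.

```python
import numpy as np

def concat_grads_with_zero_fill(grad_outputs, shapes, dtype=np.float32, axis=0):
    """Backward of split: concat the grads of the split outputs.

    Any grad that is None (i.e. uninitialized because the corresponding
    split output was unused) is replaced by a zero tensor of the expected
    shape before concatenation.
    """
    filled = [
        g if g is not None else np.zeros(shape, dtype=dtype)
        for g, shape in zip(grad_outputs, shapes)
    ]
    return np.concatenate(filled, axis=axis)

# x of shape (4, 3) was split into two (2, 3) pieces along axis 0;
# only the first piece received a gradient, so the second slot is None.
g0 = np.ones((2, 3), dtype=np.float32)
dx = concat_grads_with_zero_fill([g0, None], shapes=[(2, 3), (2, 3)])
# dx has shape (4, 3): ones in the first two rows, zeros in the last two.
```

Without the zero fill, the concat in the backward pass would fail (or read garbage memory) whenever one of the split outputs has no gradient flowing back to it.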