[Trainer] update clear_grad #8829
Conversation
Thanks for your contribution!
Force-pushed from 97eb19d to 19311b5.
Codecov Report: All modified and coverable lines are covered by tests ✅

@@            Coverage Diff             @@
##           develop    #8829      +/-   ##
===========================================
+ Coverage    55.44%   55.51%    +0.07%
===========================================
  Files          631      631
  Lines        98542    98545        +3
===========================================
+ Hits         54632    54710       +78
+ Misses      43910    43835       -75

☔ View full report in Codecov by Sentry.
wawltor left a comment:
LGTM
distributed_dataloader (`bool`, *optional*):
    Whether to use distributed dataloader. Default is `False`.
release_grads (`bool`, *optional*):
    Whether to release gradients during training. Default is `False`.
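As context for the reviewed docstring, here is a minimal sketch of how the new `release_grads` flag could be passed through PaddleNLP's `TrainingArguments`; the `output_dir` value is an illustrative placeholder, not taken from this PR:

```python
from paddlenlp.trainer import TrainingArguments

# Hypothetical configuration: enabling release_grads asks the Trainer to free
# gradient storage after each step rather than zero-filling it in place.
args = TrainingArguments(
    output_dir="./checkpoints",  # illustrative path, not from the PR
    release_grads=True,
)
```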
Force-pushed from 19311b5 to 45e8afd.
ZHUI left a comment:
LGTM
PR types
Others

PR changes
Others

Description
Call `self.optimizer.clear_grad(set_to_zero=False)` so that gradient storage is released after each optimizer step instead of being zero-filled in place (see the new `release_grads` training argument above).
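As an illustration only (not code from this PR), the sketch below contrasts the two `clear_grad` modes in a bare Paddle training step; the model, optimizer, and tensor shapes are arbitrary placeholders:

```python
import paddle

# Toy setup: any model/optimizer pair behaves the same way here.
model = paddle.nn.Linear(4, 4)
opt = paddle.optimizer.SGD(learning_rate=0.1, parameters=model.parameters())

loss = model(paddle.randn([8, 4])).mean()
loss.backward()
opt.step()

# Default behavior: gradient buffers are zero-filled in place and stay allocated.
# opt.clear_grad(set_to_zero=True)

# Behavior this PR switches to: release the gradient storage entirely, which
# lowers resident memory; buffers are re-allocated on the next backward pass.
opt.clear_grad(set_to_zero=False)
```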