
Conversation

@ForFishes
Member

PR types

Performance optimization

PR changes

Others

Description

[Cherry-pick] add comm buffer size (#8963)

@paddle-bot

paddle-bot bot commented Aug 28, 2024

Thanks for your contribution!

@codecov

codecov bot commented Aug 28, 2024

Codecov Report

Attention: Patch coverage is 33.33333% with 2 lines in your changes missing coverage. Please review.

Project coverage is 53.95%. Comparing base (34a71c8) to head (5ca021a).
Report is 229 commits behind head on develop.

Files with missing lines              Patch %    Lines
paddlenlp/trainer/training_args.py    33.33%     2 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #9031      +/-   ##
===========================================
+ Coverage    53.81%   53.95%   +0.14%     
===========================================
  Files          652      652              
  Lines       104356   104932     +576     
===========================================
+ Hits         56155    56614     +459     
- Misses       48201    48318     +117     

☔ View full report in Codecov by Sentry.

default=-1,
metadata={"help": ("Sharding fused comm buffer size in communication between sharding ranks. ")},
)
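
The snippet above is the tail of the new dataclass field added to paddlenlp/trainer/training_args.py. As a minimal sketch only, assuming the field is named sharding_comm_buffer_size_MB (the name is inferred from the help text and is not shown in the snippet), the complete definition would look roughly like:

    from dataclasses import dataclass, field

    @dataclass
    class TrainingArguments:
        # Assumed field name, inferred from the help text; -1 presumably means
        # "fall back to the framework's default fused comm buffer size".
        sharding_comm_buffer_size_MB: int = field(
            default=-1,
            metadata={"help": "Sharding fused comm buffer size in communication between sharding ranks."},
        )

With such a field, the buffer size could be set from the launch command (again assuming that flag name), e.g. --sharding_comm_buffer_size_MB 2048, trading a larger fusion buffer for fewer, bigger communication calls between sharding ranks.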


@wawltor wawltor merged commit ae691e2 into PaddlePaddle:develop Aug 29, 2024
@ForFishes ForFishes deleted the add_comm_buffer_size branch August 29, 2024 02:50
Mangodadada pushed a commit to Mangodadada/PaddleNLP that referenced this pull request Sep 10, 2024
