
Conversation


@muellerzr muellerzr commented May 2, 2022

Update no_trainer examples to use the new Accelerate logger

What does this add?

  • Accelerate recently added a new logger to help deal with repeated logs across all processes. If a message should be logged on all processes, the new kwarg main_process_only=False should be passed in.

This also helps resolve an annoyance users had pointed out: repeated logs led to misunderstandings of how the internal API was behaving.
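The idea can be sketched with the standard-library logging module; the class name and wiring below are illustrative stand-ins for the pattern, not Accelerate's actual implementation:

```python
import logging

class MultiProcessLogAdapter(logging.LoggerAdapter):
    """Sketch of the behavior: suppress records on non-main processes
    unless main_process_only=False is passed (hypothetical class, not
    Accelerate's real code)."""

    def __init__(self, logger, is_main_process):
        super().__init__(logger, {})
        self.is_main_process = is_main_process

    def log(self, level, msg, *args, main_process_only=True, **kwargs):
        # Emit on the main process always; on workers only when asked.
        if self.is_main_process or not main_process_only:
            self.logger.log(level, msg, *args, **kwargs)

# Simulate a worker process (rank != 0) and capture what gets through.
base = logging.getLogger("demo")
base.setLevel(logging.INFO)
captured = []

class ListHandler(logging.Handler):
    def emit(self, record):
        captured.append(record.getMessage())

base.addHandler(ListHandler())

logger = MultiProcessLogAdapter(base, is_main_process=False)
logger.log(logging.INFO, "on all processes", main_process_only=False)
logger.log(logging.INFO, "main only")  # suppressed on this worker

print(captured)  # -> ['on all processes']
```

With this pattern a training script can log once from the main process by default, and opt in to per-process logs only where they are genuinely informative.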

What parts of the API does this impact?

User-facing:

The examples now use the new get_logger() function from Accelerate.

@muellerzr muellerzr added the Examples (related to examples in general) and PyTorch (anything PyTorch) labels May 2, 2022
@muellerzr muellerzr requested a review from sgugger May 2, 2022 14:14

HuggingFaceDocBuilderDev commented May 2, 2022

The documentation is not available anymore as the PR was closed or merged.


@sgugger sgugger left a comment


Very nice! You can propagate this change to the other example scripts!

@muellerzr muellerzr requested a review from sgugger May 2, 2022 15:28
@muellerzr
Contributor Author

Re-review requested as a sanity check on the propagated changes, then all good 🤗

@muellerzr muellerzr removed the request for review from sgugger May 2, 2022 15:47
@muellerzr muellerzr merged commit 35d48db into main May 2, 2022
@muellerzr muellerzr deleted the muellerzr-example-logs branch May 2, 2022 15:56
stevhliu pushed a commit to stevhliu/transformers that referenced this pull request May 3, 2022
elusenji pushed a commit to elusenji/transformers that referenced this pull request Jun 12, 2022

3 participants