[Trainer.train] learning rate logging inconsistency: learning rate for the future step is logged #28124

@HanGuo97

Description

System Info

NA

Who can help?

@muellerzr and @pacman100

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

This line of code steps the LR scheduler forward before `_maybe_log_save_evaluate` is called. As a result, the logged learning rate is the one that will be used in the upcoming iteration, not the one that was just applied.

In most use cases the difference between the two is small, but in some cases it has caused confusion.
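The ordering can be illustrated with a minimal pure-Python sketch (a toy scheduler standing in for the real one, not the actual Trainer code): because the scheduler steps before the log call, the logged value is the LR for the upcoming step.

```python
# Toy stand-in for an LR scheduler: decay by gamma on each step.
class ToyStepDecay:
    def __init__(self, lr, gamma=0.1):
        self.lr = lr
        self.gamma = gamma

    def step(self):
        self.lr *= self.gamma

    def get_last_lr(self):
        return self.lr

sched = ToyStepDecay(lr=1.0)
history = []
for step in range(3):
    lr_used = sched.get_last_lr()    # LR actually applied to this update
    sched.step()                     # stepped *before* logging, as in Trainer
    lr_logged = sched.get_last_lr()  # value that ends up in the logs
    history.append((step, lr_used, lr_logged))
# At every step, lr_logged is one scheduler step ahead of lr_used.
```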

Expected behavior

The learning rate for the current iteration is logged.
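A hedged sketch of that expected ordering (illustrative names, not the actual Trainer code): cache the LR before the scheduler steps, and log the cached value.

```python
lrs = [1.0, 0.1, 0.01]         # toy schedule: LR in effect at each step
logged = []
for step in range(3):
    lr_current = lrs[step]     # capture the LR *before* any scheduler step
    # ... optimizer.step() and scheduler.step() would happen here ...
    logged.append(lr_current)  # logged LR now matches the step just run
```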
