
Conversation

@jon-tow (Collaborator) commented Dec 14, 2022

This PR adds the following configuration updates:

  • Adds an OptimizerConfig dedicated to customizing optimizer specifications.

  • Adds a SchedulerConfig for customizing the learning rate scheduler.

    • Previously, users were hard-constrained to torch.optim.lr_scheduler.CosineAnnealingLR; this opens up other schedulers, such as torch.optim.lr_scheduler.OneCycleLR, if the community wants them (a minimal sketch follows this list).
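A minimal sketch of what such dataclass-style configs could look like; the field names and defaults below are illustrative assumptions, not necessarily the exact classes added in this PR:

```python
from dataclasses import dataclass, field
from typing import Any, Dict

# Illustrative sketch only: field names and defaults are assumptions,
# not necessarily the exact classes introduced by this PR.

@dataclass
class OptimizerConfig:
    name: str = "adamw"  # which torch.optim optimizer to construct
    kwargs: Dict[str, Any] = field(default_factory=dict)  # forwarded to the optimizer constructor


@dataclass
class SchedulerConfig:
    name: str = "cosine_annealing"  # e.g. "cosine_annealing" or "one_cycle"
    kwargs: Dict[str, Any] = field(default_factory=dict)  # forwarded to the scheduler constructor
```

With a shape like this, switching from CosineAnnealingLR to OneCycleLR becomes a config change (picking a different scheduler name and kwargs) rather than a code change.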

Notes:

  • This PR formats the config YAML files throughout the repository to be more
    consistent.

wandb reports: PPO/ILQL Sentiments

@reviewer: I found these updates useful; if this feature isn't desired, or the design isn't appropriate, feel free to suggest edits or close.

@jon-tow requested a review from cat-state December 14, 2022 01:51
return dist_config


class OptimizerNames(Enum):
Contributor

Why are we not using registers...?

Collaborator Author

Adding a registry for str types seemed like a premature abstraction given that the source is local to a single file. What exactly are you suggesting?
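For readers outside the thread, a string-valued Enum along the lines being discussed might look roughly like this; the member names are illustrative assumptions, not the PR's exact contents:

```python
from enum import Enum

# Illustrative sketch of the string-valued Enum approach discussed above;
# member names here are assumptions, not the PR's exact contents.
class OptimizerNames(Enum):
    ADAMW = "adamw"
    SGD = "sgd"
```

Keeping the names in a local Enum avoids the indirection of a registry while the set of supported optimizers stays small and confined to one file.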

Contributor

Fair, Dw then. Ready for merging imho.

@LouisCastricato merged commit afdfdd9 into CarperAI:main Dec 14, 2022
@jon-tow deleted the add-optim-config branch December 14, 2022 22:28