
Commit fa62dca

allanjelusenji authored and committed
Raise error and suggestion when using custom optimizer with Fairscale or Deepspeed (huggingface#16786)
* optimizer issues related to saving
* remove the "optimizer saving" option
* reformat using make style
1 parent 0f89c5c commit fa62dca

File tree

1 file changed (+7, -0 lines)


src/transformers/trainer.py

Lines changed: 7 additions & 0 deletions
@@ -397,6 +397,13 @@ def __init__(
                 "Passing a `model_init` is incompatible with providing the `optimizers` argument. "
                 "You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method."
             )
+        if (self.sharded_ddp is not None or args.deepspeed) and (
+            self.optimizer is not None or self.lr_scheduler is not None
+        ):
+            raise RuntimeError(
+                "Passing `optimizers` is not allowed if Fairscale or Deepspeed is enabled."
+                "You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method."
+            )
         default_callbacks = DEFAULT_CALLBACKS + get_reporting_integration_callbacks(self.args.report_to)
         callbacks = default_callbacks if callbacks is None else default_callbacks + callbacks
         self.callback_handler = CallbackHandler(
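
The new error points users who need a custom optimizer or scheduler under Fairscale or Deepspeed toward subclassing `Trainer` and overriding `create_optimizer_and_scheduler`. A minimal sketch of that pattern, assuming a plain AdamW optimizer and a linear-decay schedule (the `MyTrainer` class and the optimizer/scheduler choices here are illustrative, not part of this commit):

from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR
from transformers import Trainer

class MyTrainer(Trainer):
    def create_optimizer_and_scheduler(self, num_training_steps: int):
        # Called by Trainer.train() before the training loop starts.
        self.optimizer = AdamW(self.model.parameters(), lr=self.args.learning_rate)
        # Simple linear decay from the initial learning rate down to zero.
        self.lr_scheduler = LambdaLR(
            self.optimizer,
            lambda step: max(0.0, 1.0 - step / max(1, num_training_steps)),
        )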
