
Commit 3c51075

fix: Update SFTConfig parameter
- Change max_seq_length to max_length in the SFTConfig constructor
- TRL deprecated max_seq_length in Feb 2025 and removed it in v0.20.0
- Reference: huggingface/trl#2895

This resolves the SFT training failure in CI tests.
1 parent: 870a37f
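For reference, a minimal sketch of the renamed keyword against current TRL (>= 0.20). The output path and length value are illustrative, not taken from this repo:

    from trl import SFTConfig

    # TRL >= 0.20 accepts only `max_length`; `max_seq_length` was removed.
    training_args = SFTConfig(
        output_dir="./sft_output",  # illustrative path
        max_length=2048,            # formerly `max_seq_length` in older TRL
    )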

File tree (1 file changed: +1 −1)
llama_stack/providers/inline/post_training/huggingface/recipes/finetune_single_device.py

@@ -469,7 +469,7 @@ def setup_training_args(
         use_cpu=True if device.type == "cpu" and not torch.backends.mps.is_available() else False,
         save_strategy=save_strategy,
         report_to="none",
-        max_seq_length=provider_config.max_seq_length,
+        max_length=provider_config.max_seq_length,
         gradient_accumulation_steps=config.gradient_accumulation_steps,
         gradient_checkpointing=provider_config.gradient_checkpointing,
         learning_rate=lr,
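Not part of this commit: if the provider ever needed to run against TRL releases on both sides of the rename, one hedged approach is to probe SFTConfig's signature and pass whichever keyword it accepts. The helper name below is hypothetical:

    import inspect

    from trl import SFTConfig

    def seq_length_kwargs(value):
        # Pick whichever keyword this TRL version's SFTConfig actually accepts.
        params = inspect.signature(SFTConfig.__init__).parameters
        key = "max_length" if "max_length" in params else "max_seq_length"
        return {key: value}

    training_args = SFTConfig(
        output_dir="./sft_output",   # illustrative
        **seq_length_kwargs(2048),   # resolves to max_length on recent TRL
    )

Pinning TRL to a known version range is the simpler alternative; the shim only matters if the provider must tolerate both old and new releases.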
