Commit 7dbf477

pramodith authored and qgallouedec committed

[GRPO] Make sure special tokens aren't lost when truncating prompt. (#3651)

1 parent 84982ad

File tree

1 file changed: +1 −1 lines changed

trl/trainer/grpo_trainer.py (1 addition, 1 deletion)

@@ -1065,7 +1065,7 @@ def _generate_and_score_completions(
             prompt_ids = prompt_ids[:, -self.max_prompt_length :]
             prompt_mask = prompt_mask[:, -self.max_prompt_length :]
             prompts_text = self.processing_class.batch_decode(
-                prompt_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False
+                prompt_ids, skip_special_tokens=False, clean_up_tokenization_spaces=False
             )

             # Generate completions using either vLLM or regular generation
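The fix is easy to see with a toy example. When the prompt is left-truncated to `max_prompt_length` tokens and then re-decoded to text, decoding with `skip_special_tokens=True` silently drops any special tokens (e.g. chat-template markers) that survived truncation, so the text handed to generation no longer matches the token IDs. The sketch below uses an invented mini-vocabulary and decode function purely for illustration; the real trainer calls the model tokenizer's `batch_decode`:

```python
# Hedged sketch: toy tokenizer showing why skip_special_tokens=False matters
# when re-decoding a truncated prompt. Vocab, IDs, and decode() are invented
# for illustration; GRPOTrainer uses the model's actual processing class.

SPECIAL = {0: "<|im_start|>", 1: "<|im_end|>"}   # hypothetical special tokens
VOCAB = {2: "Hello", 3: "world", 4: "assistant"}  # hypothetical regular tokens

def decode(ids, skip_special_tokens):
    """Mimic tokenizer decoding: optionally drop special tokens."""
    pieces = []
    for i in ids:
        if i in SPECIAL:
            if not skip_special_tokens:
                pieces.append(SPECIAL[i])
        else:
            pieces.append(VOCAB[i])
    return " ".join(pieces)

prompt_ids = [0, 4, 2, 3, 1]  # <|im_start|> assistant Hello world <|im_end|>
max_prompt_length = 4
# Left-truncation keeps the tail, as in prompt_ids[:, -self.max_prompt_length:]
truncated = prompt_ids[-max_prompt_length:]

# Old behavior: the surviving <|im_end|> marker vanishes from the text.
old_text = decode(truncated, skip_special_tokens=True)   # "assistant Hello world"
# Fixed behavior: special tokens survive the IDs -> text round-trip.
new_text = decode(truncated, skip_special_tokens=False)  # "assistant Hello world <|im_end|>"
```

With the old decoding, `prompts_text` and `prompt_ids` diverged whenever truncation left special tokens in the window, which matters because the decoded text is reused downstream for generation.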
