Commit 2b4d7d1

pramodith authored and marcandrelarochelle committed

[GRPO] Make sure special tokens aren't lost when truncating prompt. (huggingface#3651)

1 parent 0a38f19

File tree

1 file changed: +1 −1 lines changed

trl/trainer/grpo_trainer.py

Lines changed: 1 addition & 1 deletion
@@ -1146,7 +1146,7 @@ def _generate_and_score_completions(
         prompt_ids = prompt_ids[:, -self.max_prompt_length :]
         prompt_mask = prompt_mask[:, -self.max_prompt_length :]
         prompts_text = self.processing_class.batch_decode(
-            prompt_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False
+            prompt_ids, skip_special_tokens=False, clean_up_tokenization_spaces=False
         )

         # Generate completions using either vLLM or regular generation
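To see why the one-line change matters, here is a minimal sketch using a toy tokenizer (not the real `transformers` API; the `SPECIAL`/`VOCAB` tables and `batch_decode` helper below are hypothetical stand-ins). When a left-truncated prompt is decoded back to text with `skip_special_tokens=True`, markers such as a BOS token or chat-template tags are dropped from the decoded string, so the text fed onward to generation no longer matches the token ids; decoding with `skip_special_tokens=False` preserves them.

```python
# Toy tokenizer tables (assumptions for illustration only).
SPECIAL = {0: "<bos>", 1: "<eos>"}
VOCAB = {2: "Hello", 3: "world"}


def batch_decode(batch_ids, skip_special_tokens):
    """Decode batches of token ids, optionally dropping special tokens
    (mirrors the behavior of the flag toggled in this commit)."""
    texts = []
    for ids in batch_ids:
        pieces = []
        for tok in ids:
            if tok in SPECIAL:
                if not skip_special_tokens:
                    pieces.append(SPECIAL[tok])
            else:
                pieces.append(VOCAB[tok])
        texts.append(" ".join(pieces))
    return texts


prompt_ids = [[0, 2, 3]]  # <bos> Hello world
max_prompt_length = 3
truncated = [ids[-max_prompt_length:] for ids in prompt_ids]  # left-truncate

# Old behavior: the <bos> marker silently disappears from the prompt text.
print(batch_decode(truncated, skip_special_tokens=True))   # ['Hello world']
# New behavior: special tokens survive the decode round-trip.
print(batch_decode(truncated, skip_special_tokens=False))  # ['<bos> Hello world']
```

The design point is that truncation happens in token-id space, so the only safe way to recover the matching prompt string is a lossless decode; skipping special tokens would make the re-encoded prompt differ from the truncated ids.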
