Commit 466bf48

update docs of length_penalty
1 parent 7152ed2 commit 466bf48

File tree

1 file changed (+3, -3 lines changed)


src/transformers/generation_utils.py

Lines changed: 3 additions & 3 deletions
@@ -950,9 +950,9 @@ def generate(
             eos_token_id (`int`, *optional*):
                 The id of the *end-of-sequence* token.
             length_penalty (`float`, *optional*, defaults to 1.0):
-                Exponential penalty to the length. 1.0 means no penalty. Set to values < 1.0 in order to encourage the
-                model to generate shorter sequences, to a value > 1.0 in order to encourage the model to produce longer
-                sequences.
+                Exponential penalty to the length. 1.0 means the beam score is divided by the sequence length; 0.0
+                means no penalty. Since the score is the log likelihood of the sequence (i.e. negative), values > 0.0
+                encourage the model to generate longer sequences, while values < 0.0 encourage shorter sequences.
             no_repeat_ngram_size (`int`, *optional*, defaults to 0):
                 If set to int > 0, all ngrams of that size can only occur once.
             encoder_no_repeat_ngram_size (`int`, *optional*, defaults to 0):
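
The new wording is easiest to check with numbers. Below is a minimal sketch, not the actual transformers implementation, of the scoring rule the docstring describes (modeled on the `BeamHypotheses` logic in this file, where a hypothesis' summed log-probabilities are divided by its length raised to `length_penalty`); `beam_score` is a hypothetical helper introduced only for illustration.

    def beam_score(sum_logprobs: float, length: int, length_penalty: float) -> float:
        # Hypothetical helper: log-probabilities are negative, so a larger
        # divisor pulls the score toward zero, i.e. ranks the hypothesis higher.
        return sum_logprobs / (length ** length_penalty)

    # Two hypotheses with the same average log-probability per token (-1.0):
    for penalty in (0.0, 1.0, 2.0):
        short_score = beam_score(-4.0, length=4, length_penalty=penalty)
        long_score = beam_score(-8.0, length=8, length_penalty=penalty)
        print(penalty, short_score, long_score)
    # 0.0 -> -4.0  vs -8.0   : shorter sequence wins (no length normalization)
    # 1.0 -> -1.0  vs -1.0   : tie, the score is the average log-probability
    # 2.0 -> -0.25 vs -0.125 : longer sequence wins

In practice this is the value passed to `generate`, e.g. `model.generate(input_ids, num_beams=4, length_penalty=2.0)` to favor longer beam outputs; it only has an effect when beam search is used (`num_beams > 1`).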
