Commit b5fc109
schoennenbeck authored and Mu Huai committed
[Bugfix][Frontend] Fix validation of logprobs in ChatCompletionRequest (vllm-project#14352)
Signed-off-by: Sebastian Schönnenbeck <[email protected]>
Signed-off-by: Mu Huai <[email protected]>
1 parent 4b875d6 commit b5fc109

File tree

1 file changed: +1 −1 lines changed

vllm/entrypoints/openai/protocol.py

Lines changed: 1 addition & 1 deletion
@@ -548,7 +548,7 @@ def check_logprobs(cls, data):
             if top_logprobs < 0:
                 raise ValueError("`top_logprobs` must be a positive value.")
 
-            if not data.get("logprobs"):
+            if top_logprobs > 0 and not data.get("logprobs"):
                 raise ValueError(
                     "when using `top_logprobs`, `logprobs` must be set to true."
                 )
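For context, here is a minimal, self-contained sketch of the validation behavior after this fix. The model below is a simplified stand-in for ChatCompletionRequest, not the real class: its fields, defaults, and Pydantic setup are assumptions, and only the body of check_logprobs mirrors the diff above.

# Minimal sketch, assuming a Pydantic v2 model loosely modeled on
# ChatCompletionRequest in vllm/entrypoints/openai/protocol.py.
from typing import Optional

from pydantic import BaseModel, model_validator


class ChatCompletionRequestSketch(BaseModel):
    logprobs: Optional[bool] = False
    top_logprobs: Optional[int] = None

    @model_validator(mode="before")
    @classmethod
    def check_logprobs(cls, data):
        if (top_logprobs := data.get("top_logprobs")) is not None:
            if top_logprobs < 0:
                raise ValueError("`top_logprobs` must be a positive value.")
            # Fixed condition: `logprobs` is only required when the caller
            # actually requests top_logprobs > 0.
            if top_logprobs > 0 and not data.get("logprobs"):
                raise ValueError(
                    "when using `top_logprobs`, `logprobs` must be set to true."
                )
        return data


# After this change, top_logprobs=0 no longer rejects requests that omit logprobs:
ChatCompletionRequestSketch(top_logprobs=0)                  # accepted
ChatCompletionRequestSketch(top_logprobs=2, logprobs=True)   # accepted

Before the fix, the validator raised whenever `logprobs` was falsy, so a harmless request with top_logprobs=0 was rejected; the added `top_logprobs > 0` guard restores that case.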
