[BFCL] Omit Reasoning Content from Chat History for Function-Calling Models (#1064)
OpenAI models do not return reasoning content within API responses.
However, other models using the OpenAI-compatible interface, such as
DeepSeek, include reasoning details in their responses.
This reasoning content is generally not meant to be included in subsequent chat turns. This PR updates the handler to store any available reasoning content in response_data (primarily for local result logging), while ensuring that it does not propagate into the chat history.
This approach has previously been implemented for prompt-based models.
This PR extends that logic to also support function-calling models.
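Below is a minimal sketch of the idea, not the actual BFCL handler code: the helper name `split_reasoning` and the `response_data` dict are illustrative. It copies the assistant message, moves any `reasoning_content` field into a separate dict used only for local logging, and returns a cleaned message that is safe to append to the chat history sent in the next turn.

```python
from typing import Any, Dict, Tuple


def split_reasoning(assistant_message: Dict[str, Any]) -> Tuple[Dict[str, Any], Dict[str, Any]]:
    """Separate reasoning content from an assistant message.

    Returns (history_message, response_data): the first is appended to the
    chat history for the next turn; the second keeps the reasoning text for
    result logging only.
    """
    history_message = dict(assistant_message)
    response_data: Dict[str, Any] = {}

    # DeepSeek's OpenAI-compatible API returns `reasoning_content` alongside
    # `content`; sending it back in a later turn triggers a 400 error, so it
    # must never reach the chat history.
    reasoning = history_message.pop("reasoning_content", None)
    if reasoning is not None:
        response_data["reasoning_content"] = reasoning

    return history_message, response_data


if __name__ == "__main__":
    raw = {
        "role": "assistant",
        "content": None,
        "reasoning_content": "Chain-of-thought text returned by DeepSeek...",
        "tool_calls": [{"id": "call_0", "type": "function",
                        "function": {"name": "get_weather",
                                     "arguments": "{\"city\": \"Berkeley\"}"}}],
    }
    clean, logged = split_reasoning(raw)
    assert "reasoning_content" not in clean   # safe for the next turn
    assert "reasoning_content" in logged      # preserved for local logs
```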
DeepSeek does not accept reasoning content in the next turn's chat history, in both prompting and function-calling modes:
Error: Error code: 400 - {'error': {'message': 'The reasoning_content is an intermediate result for display purposes only and will not be included in the context for inference. Please remove the reasoning_content from your message to reduce network traffic.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}