docs/my-website/docs/completion/drop_params.md
8 lines changed: 8 additions & 0 deletions
@@ -5,6 +5,14 @@ import TabItem from '@theme/TabItem';
 
 Drop unsupported OpenAI params by your LLM Provider.
 
+
+## Default Behavior
+
+**By default, LiteLLM raises an exception** if you send a parameter to a model that doesn't support it.
+
+For example, if you send `temperature=0.2` to a model that doesn't support the `temperature` parameter, LiteLLM will raise an exception.
+
+**When `drop_params=True` is set**, LiteLLM will drop the unsupported parameter instead of raising an exception. This allows your code to work seamlessly across different providers without having to customize parameters for each one.
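To make the behavior described in the added docs concrete, here is a minimal sketch of the two ways `drop_params` is typically set with LiteLLM — globally or per call. The model name and messages below are illustrative placeholders, not part of this diff:

```python
import litellm
from litellm import completion

# Option 1 — global setting: drop unsupported params on every call.
litellm.drop_params = True

# Option 2 — per-call setting: pass drop_params directly to completion().
# If the target provider/model doesn't support `temperature`, the param
# is silently dropped instead of raising an exception.
response = completion(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[{"role": "user", "content": "Hello!"}],
    temperature=0.2,        # dropped if unsupported, since drop_params=True
    drop_params=True,
)
print(response)
```

Without either setting, the same `temperature=0.2` call against a model that doesn't support the parameter would raise an exception, as the added section states.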