🆕 feat: support routing to only websearch supported deployments #17500
Title
Support routing to only websearch supported deployments
Relevant issues
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement - see details)
- My PR passes all unit tests on `make test-unit`
Type
🆕 New Feature
Changes
Imagine you have a configuration like this, where you load balance `gpt-4.1` between OpenAI and Azure:
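Something along these lines (a minimal sketch; the deployment names and environment-variable references are illustrative):

```yaml
model_list:
  - model_name: gpt-4.1
    litellm_params:
      model: openai/gpt-4.1
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4.1
    litellm_params:
      model: azure/gpt-4.1            # azure/<your-deployment-name>
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
```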
Works great, because your clients get the reliability of two providers. However, imagine one of them uses OpenAI's WebSearch tool, which works with `gpt-4.1` on OpenAI but not with Azure.
If a web search request hits Azure it will fail! But if it hits OpenAI, it works. To make LiteLLM route all web search requests to OpenAI only, this PR allows you to define your LiteLLM config as follows:
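Roughly like this (a sketch; the flag is shown under the Azure deployment's `model_info`, alongside the other per-deployment capability flags):

```yaml
model_list:
  - model_name: gpt-4.1
    litellm_params:
      model: openai/gpt-4.1
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4.1
    litellm_params:
      model: azure/gpt-4.1
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
    model_info:
      # tell the router this deployment cannot serve web search requests
      supports_web_search: False
```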
You add `supports_web_search: False` to the Azure deployment, and LiteLLM will now selectively drop that deployment when routing web search requests.
Testing
Screenshot showing that my unit test passes:
Debug logs showing the filtering happening in real time: