
Conversation

@raghav-stripe (Contributor) commented Dec 4, 2025

Title

Support routing to only websearch supported deployments

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement, see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature

Changes

Imagine you have a configuration like this, where you load balance gpt-4.1 between OpenAI and Azure:

  - model_name: gpt-4.1
    litellm_params:
      model: openai/gpt-4.1
  - model_name: gpt-4.1
    litellm_params:
      model: azure/gpt-4.1
      api_base: "https://x.openai.azure.com/"
      api_version: 2025-03-01-preview

Works great, because your clients get the reliability of two providers. However, imagine one of them uses OpenAI's WebSearch tool, which works with gpt-4.1 on OpenAI, but not with Azure*:

curl --verbose -X POST localhost:4000/v1/responses \
  -H 'Authorization: Bearer sk-x' \
  -H 'Content-Type: application/json' \
  -d '{"model": "gpt-4.1", "input": "What was a positive news story from today?",
    "tools": [{"type":"web_search"}]
}'
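For Python clients, the same request can be built with the standard library alone. A minimal sketch (the localhost:4000 endpoint and sk-x key are the placeholders from the curl above; the request is only constructed here, not sent):

```python
import json
import urllib.request

# Same body as the curl example; the web_search entry in tools is what
# distinguishes this from an ordinary chat/responses request.
body = {
    "model": "gpt-4.1",
    "input": "What was a positive news story from today?",
    "tools": [{"type": "web_search"}],
}

req = urllib.request.Request(
    "http://localhost:4000/v1/responses",
    data=json.dumps(body).encode(),
    headers={
        "Authorization": "Bearer sk-x",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send it; omitted so this runs offline.
print(req.get_method(), req.full_url)
```

If this request lands on the Azure deployment, the upstream call fails; the routing change below makes the proxy avoid that deployment for such requests.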

If this request hits Azure it will fail! But if it hits OpenAI, it works. To make LiteLLM route all web search requests to OpenAI only, this PR allows you to define your LiteLLM config as follows:

  - model_name: gpt-4.1
    litellm_params:
      model: openai/gpt-4.1
  - model_name: gpt-4.1
    litellm_params:
      model: azure/gpt-4.1
      api_base: "https://x.openai.azure.com/"
      api_version: 2025-03-01-preview
    model_info:
      supports_web_search: False <---- KEY CHANGE!

Adding supports_web_search: False tells LiteLLM to drop this deployment from the candidate list when routing web search requests.
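The routing change boils down to a filter pass over the healthy deployments. A minimal sketch of that logic (function and variable names here are illustrative, not the actual router.py internals):

```python
def filter_for_web_search(deployments, tools):
    """Drop deployments that opt out of web search, but only when the
    incoming request actually uses the web_search tool."""
    wants_web_search = any(t.get("type") == "web_search" for t in (tools or []))
    if not wants_web_search:
        return deployments
    return [
        d for d in deployments
        # A missing key defaults to True so existing configs keep working.
        if d.get("model_info", {}).get("supports_web_search", True)
    ]

deployments = [
    {"model_name": "gpt-4.1", "litellm_params": {"model": "openai/gpt-4.1"},
     "model_info": {}},
    {"model_name": "gpt-4.1", "litellm_params": {"model": "azure/gpt-4.1"},
     "model_info": {"supports_web_search": False}},
]

# Web-search request: only the OpenAI deployment survives the filter.
eligible = filter_for_web_search(deployments, [{"type": "web_search"}])
print([d["litellm_params"]["model"] for d in eligible])  # ['openai/gpt-4.1']
```

Requests without the web_search tool pass through unfiltered, so both deployments keep load-balancing ordinary traffic.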

  • * Note: Azure has recently added WebSearch support for gpt-4.1, but we don't have that feature enabled on our side and only support it via OpenAI.

Testing

Screenshot showing my unit test passing locally:

[screenshot: unit test passing]

Debug logs showing the filtering happening in real time:

16:58:30 - LiteLLM Router:DEBUG: router.py:7392 - initial list of deployments: [{'model_name': 'gpt-4.1', 'litellm_params': {'use_in_pass_through': False, 'use_litellm_proxy': False, 'merge_reasoning_content_in_choices': False, 'model': 'openai/gpt-4.1'}, 'model_info': {'id': '...', 'db_model': False}}, {'model_name': 'gpt-4.1', 'litellm_params': {'api_base': 'https://x.openai.azure.com/', 'api_version': '2025-03-01-preview', 'use_in_pass_through': False, 'use_litellm_proxy': False, 'merge_reasoning_content_in_choices': False, 'model': 'azure/gpt-4.1'}, 'model_info': {'id': '....', 'db_model': False, 'supports_web_search': False}}]

16:58:30 - LiteLLM Router:DEBUG: router.py:7465 - healthy_deployments after team filter: [{'model_name': 'gpt-4.1', 'litellm_params': {'use_in_pass_through': False, 'use_litellm_proxy': False, 'merge_reasoning_content_in_choices': False, 'model': 'openai/gpt-4.1'}, 'model_info': {'id': '...', 'db_model': False}}, {'model_name': 'gpt-4.1', 'litellm_params': {'api_base': 'https://x.openai.azure.com/', 'api_version': '2025-03-01-preview', 'use_in_pass_through': False, 'use_litellm_proxy': False, 'merge_reasoning_content_in_choices': False, 'model': 'azure/gpt-4.1'}, 'model_info': {'id': '...', 'db_model': False, 'supports_web_search': False}}]

16:58:30 - LiteLLM Router:DEBUG: router.py:7472 - healthy_deployments after web search filter: [{'model_name': 'gpt-4.1', 'litellm_params': {'use_in_pass_through': False, 'use_litellm_proxy': False, 'merge_reasoning_content_in_choices': False, 'model': 'openai/gpt-4.1'}, 'model_info': {'id': '...', 'db_model': False}}]


vercel bot commented Dec 4, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Status | Preview | Comments | Updated (UTC)
litellm | Ready | Preview | Comment | Dec 4, 2025 10:18pm

if "supports_web_search" in model_info:
    return model_info["supports_web_search"]

return True
Contributor Author

This return True is intentional: by default, we should assume a deployment supports WebSearch, so we don't break existing LiteLLM deployments.
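In other words, the helper behaves as below. This restates the reviewed hunk with a surrounding function; the function name and signature are assumptions for illustration:

```python
def deployment_supports_web_search(model_info: dict) -> bool:
    # An absent key means "assume support", so deployments that never
    # set supports_web_search are untouched by the new filter.
    if "supports_web_search" in model_info:
        return model_info["supports_web_search"]
    return True

assert deployment_supports_web_search({}) is True
assert deployment_supports_web_search({"supports_web_search": False}) is False
assert deployment_supports_web_search({"supports_web_search": True}) is True
```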

@ishaan-jaff (Contributor) left a comment

Please add this to the docs here: https://docs.litellm.ai/docs/completion/web_search

You can add an "Advanced" section there with a "Web Search Routing" subsection.

Explain the problem this solves and show how to use it.

Optionally, add a mermaid diagram to explain this filtering.

@ishaan-jaff (Contributor) left a comment
lgtm

@ishaan-jaff ishaan-jaff merged commit 72eb4c3 into BerriAI:main Dec 4, 2025
5 of 7 checks passed