What happened?
Here is a minimal example showing the behavior:
Run the proxy with Docker:
litellm_config.yaml
model_list:
  - model_name: foo
    litellm_params:
      model: gpt-4.1
      api_key: ...
general_settings:
  master_key: sk-1234
docker run \
-v $(pwd)/litellm_config.yaml:/app/config.yaml \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.75.8-stable \
--config /app/config.yaml
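(Optional) Before reproducing, confirm the proxy is reachable. This assumes the proxy exposes its unauthenticated /health/liveliness endpoint; any 200 response means it is up:
curl 'http://0.0.0.0:4000/health/liveliness'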
Check which models are available:
curl 'http://0.0.0.0:4000/v1/models' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' | jq .
{
  "data": [
    {
      "id": "foo",
      "object": "model",
      "created": 1677610602,
      "owned_by": "openai"
    }
  ],
  "object": "list"
}
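To make the mismatch explicit, the exposed model ids can be extracted with jq; only foo is listed, and gpt-4.1 does not appear:
curl -s 'http://0.0.0.0:4000/v1/models' \
-H 'Authorization: Bearer sk-1234' | jq -r '.data[].id'
# prints: foo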
This request is expected to succeed:
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' \
-d '{
"model": "foo",
"messages": [{"role": "user", "content": "Hi."}]
}'
The following request should not succeed, because no model named gpt-4.1 is exposed in the config; gpt-4.1 appears only as the underlying model inside litellm_params.
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' \
-d '{
"model": "gpt-4.1",
"messages": [{"role": "user", "content": "Hi."}]
}'
However, the request succeeds, instead of failing with an error such as Invalid model name passed in.
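For comparison (assuming default proxy behavior), a request using a name that appears nowhere in the config, such as the made-up name bar, is rejected with the usual Invalid model name passed in error. This suggests the check only misses names that occur inside litellm_params:
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' \
-d '{
"model": "bar",
"messages": [{"role": "user", "content": "Hi."}]
}'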
Is this a bug or a feature?
Relevant log output
Are you a ML Ops Team?
Yes
What LiteLLM version are you on ?
v1.75.8