What happened?
When Instructor patches a LiteLLM client using Mistral Structured Outputs, the request fails with a `MistralException`: `Unsupported response_format type - type='json_schema'`.
```python
import instructor
import litellm
from pydantic import BaseModel


class Person(BaseModel):
    name: str
    age: int
    city: str


client = instructor.from_litellm(litellm.completion, mode=instructor.Mode.MISTRAL_STRUCTURED_OUTPUTS)
response = client.create(
    model="mistral/mistral-large-latest",
    response_model=Person,
    messages=[{"role": "user", "content": "Extract a person from: John is 30 years old and lives in New York."}],
)
print(response)
```

Here is the relevant exception trace:
```
> python example.py
...
Traceback (most recent call last):
  File "/Users/bilelomrani/Documents/Dynamo.nosync/syndata-pii/.venv/lib/python3.12/site-packages/litellm/main.py", line 1318, in completion
    optional_params = get_optional_params(
                      ^^^^^^^^^^^^^^^^^^^^
  File "/Users/bilelomrani/Documents/Dynamo.nosync/syndata-pii/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3389, in get_optional_params
    non_default_params = pre_process_non_default_params(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bilelomrani/Documents/Dynamo.nosync/syndata-pii/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3194, in pre_process_non_default_params
    non_default_params["response_format"] = type_to_response_format_param(
                                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bilelomrani/Documents/Dynamo.nosync/syndata-pii/.venv/lib/python3.12/site-packages/litellm/llms/base_llm/base_utils.py", line 193, in type_to_response_format_param
    raise TypeError(f"Unsupported response_format type - {response_format}")
TypeError: Unsupported response_format type - type='json_schema' json_schema=JSONSchema(name='Person', schema_definition={'properties': {'name': {'title': 'Name', 'type': 'string'}, 'age': {'title': 'Age', 'type': 'integer'}, 'city': {'title': 'City', 'type': 'string'}}, 'required': ['name', 'age', 'city'], 'title': 'Person', 'type': 'object', 'additionalProperties': False}, description=Unset(), strict=True)
```
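For what it's worth, here is a minimal sketch of what I believe is happening (untested, and the `ResponseFormat`/`JSONSchema` construction below is my assumption about what Instructor builds in `MISTRAL_STRUCTURED_OUTPUTS` mode): Instructor hands LiteLLM the Mistral SDK's `ResponseFormat` model instance, and `type_to_response_format_param` seems to only accept plain dicts or Pydantic model classes, so the instance falls through to the `TypeError`.

```python
# Hypothetical minimal reproduction without Instructor (assumption: this is
# roughly the response_format object Instructor builds in
# MISTRAL_STRUCTURED_OUTPUTS mode).
import litellm
from mistralai.models import JSONSchema, ResponseFormat

response_format = ResponseFormat(
    type="json_schema",
    json_schema=JSONSchema(
        name="Person",
        schema_definition={
            "type": "object",
            "properties": {"name": {"type": "string"}},
            "required": ["name"],
            "additionalProperties": False,
        },
        strict=True,
    ),
)

# Presumably raises the same "TypeError: Unsupported response_format type - ..."
# because the value is a Pydantic model *instance*, not a dict or a BaseModel
# subclass.
litellm.completion(
    model="mistral/mistral-large-latest",
    messages=[{"role": "user", "content": "Extract a person: John, 30, New York."}],
    response_format=response_format,
)
```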
By contrast, when the same Instructor patching approach is used directly on the official Mistral SDK, it works correctly:
```python
import os

import instructor
from mistralai import Mistral
from pydantic import BaseModel


class Person(BaseModel):
    name: str
    age: int
    city: str


client = instructor.from_mistral(
    Mistral(api_key=os.environ["MISTRAL_API_KEY"]), mode=instructor.Mode.MISTRAL_STRUCTURED_OUTPUTS
)
response = client.create(
    model="mistral-large-latest",
    response_model=Person,
    messages=[{"role": "user", "content": "Extract a person from: John is 30 years old and lives in New York."}],
)
print(response)
```

```
> python example2.py
name='John' age=30 city='New York'
```
It appears that LiteLLM currently doesn't support passing a `response_format` parameter of type `json_schema` to Mistral endpoints, which Instructor relies on for structured outputs. When Instructor wraps `litellm.completion`, it builds the same Mistral JSON schema response format, but LiteLLM's Mistral adapter doesn't recognize or serialize it correctly.
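In the meantime, switching the Instructor mode to tool calling seems to sidestep the `json_schema` response_format path entirely. This is only a sketch of a possible workaround; I have not confirmed it against Mistral via LiteLLM:

```python
# Possible workaround (untested): use tool/function calling instead of the
# json_schema response_format, so LiteLLM never needs to translate the Mistral
# ResponseFormat object.
import instructor
import litellm
from pydantic import BaseModel


class Person(BaseModel):
    name: str
    age: int
    city: str


client = instructor.from_litellm(litellm.completion, mode=instructor.Mode.TOOLS)
response = client.create(
    model="mistral/mistral-large-latest",
    response_model=Person,
    messages=[{"role": "user", "content": "Extract a person from: John is 30 years old and lives in New York."}],
)
print(response)
```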
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.79.0
Twitter / LinkedIn details
No response