[Bug]: Error while using allowed_openai_params with openrouter #14273

@bgeneto

Description

What happened?

If you configure an OpenRouter model with allowed_openai_params: ["user"] so you can track spend directly in the OpenRouter dashboard, the request fails with a BadRequestError:

  - model_name: gpt-5
    litellm_params:
      model: openrouter/openai/gpt-5-chat
      api_key: os.environ/OPENROUTER_API_KEY
      allowed_openai_params: ["user"]

I believe this worked in a previous version.

Relevant log output

litellm.BadRequestError: OpenrouterException - {"error":{"message":"Expected string, received null","code":400}}
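The 400 body ("Expected string, received null") suggests the whitelisted `user` field is being forwarded to OpenRouter as JSON `null` when no value is supplied on the incoming request. A minimal sketch of the kind of None-filtering that would avoid this (hypothetical helper, not LiteLLM's actual implementation):

```python
def drop_null_params(params: dict) -> dict:
    """Drop params whose value is None so they are omitted from the
    upstream request body instead of being serialized as JSON null."""
    return {k: v for k, v in params.items() if v is not None}

# "user" is whitelisted via allowed_openai_params but never populated,
# so it reaches the provider call as None:
payload = {
    "model": "openai/gpt-5-chat",
    "messages": [{"role": "user", "content": "hi"}],
    "user": None,
}
cleaned = drop_null_params(payload)
assert "user" not in cleaned  # omitted rather than sent as null
```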

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.75.8

Twitter / LinkedIn details

No response
