
[Bug]: Wrong endpoint selected if modelspec options are used #5213

@xhejtman


What happened?

If I configure model spec such as:

modelSpecs:
  enforce: false
  prioritize: false
  list:
    - name: llama3.3:latest
      label: "Llama 3.3"
      default: true
      iconURL: https://cdn1.cloud.e-infra.cz/logos/meta-llama.png
      showIconInMenu: true
      preset:
        endpoint: "custom"
        model: "llama3.3:latest"
        maxContextTokens: 128000
        max_tokens: 16384

and do not set the ENDPOINTS variable (which appears to be optional), but do define endpoints in librechat.yaml:

endpoints:
  agents:
    recursionLimit: 50
    disableBuilder: false
    capabilities: ["file_search", "actions", "tools"]
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://ollama:11434/v1/"
      models:
        default: [
          "llama3.3:latest",
          "qwq",
          "qwen2.5-coder"
        ]
        fetch: false # fetching the list of models is not supported
      titleConvo: true
      titleModel: "current_model"

the UI "pre-select" Agent endpoint and it cannot be changed to the Ollama endpoint.

It can be worked around by setting ENDPOINTS to "custom,agents".

"Cannot be changed" means you can click on Ollama, but the selection returns to Agents.
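
For reference, the workaround amounts to adding something like the following to the .env file (a minimal sketch; the comma-separated ENDPOINTS format is the standard LibreChat convention):

# .env - explicitly list the endpoints so the custom (Ollama) endpoint is selectable
ENDPOINTS=custom,agents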

Steps to Reproduce

Configure LibreChat as above.

What browsers are you seeing the problem on?

Chrome

Relevant log output

No logs.

Screenshots

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct
