Supporting multiple OpenAI-compatible providers in one Api object #62

@shaltielshmid

Description
I want to add multiple OpenAI-compatible endpoint providers to the same object, but I couldn't figure out how to do it with LlmTornado.

As far as I can tell it won't work, since only one EndpointProvider is supported per LlmProvider:

https://github.com/lofcz/LlmTornado/blob/master/src/LlmTornado/TornadoApi.cs#L152-L153

Specifying the ChatModel would also need to be handled: how do you disambiguate which EndpointProvider each model goes to when they all share the same LlmProvider?

It might just be a non-starter and I should use multiple TornadoApi objects (roughly the sketch below), but I've run into the need more than once to support both OpenAI and other servers exposing an OpenAI-compatible endpoint.
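
For reference, here is a minimal sketch of the multiple-object workaround I mean. I'm assuming the constructor overloads and chat calls shown in the README (a provider/key pair for OpenAI, a Uri overload for a custom OpenAI-compatible server); the local URL, API key, and model name are placeholders, and the exact way of passing a custom model name may differ.

```csharp
using System;
using LlmTornado;
using LlmTornado.Chat;
using LlmTornado.Chat.Models;
using LlmTornado.Code;

// One TornadoApi per backend, since a single instance only keeps one
// EndpointProvider per LlmProvider.
TornadoApi openAi = new TornadoApi(LLmProviders.OpenAi, "sk-placeholder");

// Assumed Uri overload for a self-hosted OpenAI-compatible server
// (placeholder URL, e.g. a vLLM/Ollama/llama.cpp server).
TornadoApi localApi = new TornadoApi(new Uri("http://localhost:8000"));

// The caller has to pick the right object per model, which is exactly the
// disambiguation I'd like a single Api object to handle.
Conversation cloudChat = openAi.Chat.CreateConversation(ChatModel.OpenAi.Gpt4.O);
Conversation localChat = localApi.Chat.CreateConversation(new ChatModel("my-local-model")); // placeholder model name

cloudChat.AppendUserInput("Hello from OpenAI!");
localChat.AppendUserInput("Hello from the local OpenAI-compatible server!");

Console.WriteLine(await cloudChat.GetResponse());
Console.WriteLine(await localChat.GetResponse());
```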

What do you think?
