I couldn't figure out how to do this with LlmTornado: I want to add multiple OpenAI-compatible endpoint providers to the same TornadoApi object.
It seems to me that this won't work, since only one EndpointProvider is supported per LlmProvider:
https://github.com/lofcz/LlmTornado/blob/master/src/LlmTornado/TornadoApi.cs#L152-L153
Specifying the ChatModel would also need to be handled: how do you disambiguate which EndpointProvider each model goes to, when they all share the same LlmProvider?
It might just be a non-starter and I should use multiple TornadoApi objects (rough sketch of that workaround below), but I've run into the need to target both OpenAI and other servers exposing an OpenAI-compatible endpoint more than once.
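For reference, the multiple-objects workaround I have in mind would look roughly like this. This is only a sketch: the constructor overloads, namespaces, and model identifiers I'm using (an API-key + provider constructor, a Uri-based constructor for OpenAI-compatible servers, `ChatModel.OpenAi.Gpt4.O`, `LLmProviders.OpenAi`) are assumptions about the API shape, not something I've verified against the current source:

```csharp
// Rough sketch of the "multiple TornadoApi objects" workaround.
// Assumption: constructors roughly of this shape exist (provider + key,
// and a Uri-based one for OpenAI-compatible servers); exact signatures unverified.
using System;
using LlmTornado;
using LlmTornado.Chat;
using LlmTornado.Chat.Models;
using LlmTornado.Code;

// Hosted OpenAI
TornadoApi hostedOpenAi = new TornadoApi(LLmProviders.OpenAi, "sk-...");

// A local server exposing an OpenAI-compatible endpoint (e.g. Ollama / vLLM)
TornadoApi selfHosted = new TornadoApi(new Uri("http://localhost:11434/v1"), "not-needed");

Conversation a = hostedOpenAi.Chat.CreateConversation(ChatModel.OpenAi.Gpt4.O);
Conversation b = selfHosted.Chat.CreateConversation(new ChatModel("llama3.1"));
```

This works, but it pushes the "which endpoint does this model go to" decision entirely into my own code, which is exactly the disambiguation problem described above.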
What do you think?