Enhancing LiteLLM Support in LobeChat #7236
aur3l14no started this conversation in General | 讨论
Replies: 1 comment
-
Really good idea!
-
Hi team! 👋
Following up on @arvinxx's previous comment about prioritizing LiteLLM as a first-class citizen in LobeChat, I'd like to propose a wishlist of features to strengthen this integration.
Feature Wishlist
- Fetch the available model list from LiteLLM's /model/info endpoint, so LobeChat stays in sync with whatever the proxy exposes (a minimal sketch of what this could look like follows this list).
- Consider adding LiteLLM as a separate provider rather than relying solely on OpenAI compatibility patches (reference).
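For the /model/info item above, here is a minimal TypeScript sketch of what fetching the model list from a LiteLLM proxy could look like. The base URL, API key handling, and the exact response shape (`data`, `model_name`, `model_info`) are assumptions about the proxy's output, not existing LobeChat code:

```ts
// Sketch only: assumes a LiteLLM proxy reachable at `baseUrl` whose GET /model/info
// responds with { data: [{ model_name, model_info? }] }. Field names are assumptions.
interface LiteLLMModelEntry {
  model_name: string;
  model_info?: {
    max_tokens?: number;
    mode?: string; // e.g. "chat" or "embedding"
  };
}

async function fetchLiteLLMModels(
  baseUrl: string,
  apiKey: string,
): Promise<LiteLLMModelEntry[]> {
  const res = await fetch(`${baseUrl}/model/info`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) {
    throw new Error(`GET /model/info failed: ${res.status} ${res.statusText}`);
  }
  const body = (await res.json()) as { data: LiteLLMModelEntry[] };
  return body.data;
}

// Example: populate a model picker from the proxy instead of a hardcoded list.
// fetchLiteLLMModels("http://localhost:4000", process.env.LITELLM_MASTER_KEY ?? "")
//   .then((models) => console.log(models.map((m) => m.model_name)));
```

Something along these lines would let the model picker reflect the proxy's configuration automatically rather than relying on a hardcoded list.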
Thanks a lot for building this amazing app. I'm excited to see where this integration could go!