1 parent e468cc6 commit 7856680
docs/install/configuration/litellm.md
@@ -6,6 +6,7 @@ weight: -7
# Using LibreChat with LiteLLM Proxy
Use **[LiteLLM Proxy](https://docs.litellm.ai/docs/simple_proxy)** for:
+
* Calling 100+ LLMs (Huggingface/Bedrock/TogetherAI/etc.) in the OpenAI ChatCompletions & Completions format
* Load balancing - between multiple models and deployments of the same model; LiteLLM Proxy can handle 1k+ requests/second during load tests
* Authentication & spend tracking via virtual keys
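
Since the doc change above describes a proxy that speaks the OpenAI ChatCompletions format, a minimal sketch of calling such a proxy directly may help when verifying the setup. The base URL `http://localhost:4000`, the virtual key `sk-litellm-test-key`, and the model alias `gpt-3.5-turbo` are placeholder assumptions, not values from this commit:

```python
from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM Proxy instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:4000",   # assumed proxy address; adjust to your deployment
    api_key="sk-litellm-test-key",      # hypothetical virtual key issued by the proxy
)

# The proxy routes this ChatCompletions call to whichever backend is mapped
# to the requested model name (and load-balances across deployments if configured).
response = client.chat.completions.create(
    model="gpt-3.5-turbo",              # placeholder model alias defined in the proxy config
    messages=[{"role": "user", "content": "Hello from a LiteLLM Proxy test."}],
)
print(response.choices[0].message.content)
```

LibreChat itself would be configured with the same base URL and virtual key, so a quick script like this is a reasonable way to confirm the proxy answers before wiring it into the app.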