
Conversation

@ishaan-jaff (Contributor) commented Nov 30, 2023

Pull Request Template

Summary

Adds docs on how to use the LiteLLM Proxy (doc: https://docs.litellm.ai/docs/simple_proxy) with LibreChat. Use the LiteLLM Proxy for:

  • Calling 100+ LLMs (Hugging Face, Bedrock, TogetherAI, etc.) in the OpenAI ChatCompletions & Completions format
  • Load balancing across multiple models and multiple deployments of the same model; the LiteLLM proxy can handle 1k+ requests/second during load tests
  • Authentication & Spend Tracking
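The setup the docs describe boils down to two steps: run a LiteLLM proxy in front of your providers, then point LibreChat's OpenAI endpoint at it. A minimal sketch follows; the model names, port, and env values are illustrative assumptions, so consult the linked LiteLLM and LibreChat docs for the authoritative configuration:

```shell
# Sketch of the LibreChat + LiteLLM Proxy setup (illustrative values).

# 1) Write a LiteLLM proxy config: two backends behind one model alias,
#    so requests for "gpt-3.5-turbo" are load balanced across them.
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: huggingface/HuggingFaceH4/zephyr-7b-beta
      api_key: os.environ/HF_TOKEN
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: bedrock/anthropic.claude-v2
EOF

# 2) Start the proxy (serves an OpenAI-compatible API), then route
#    LibreChat's OpenAI endpoint through it via LibreChat's .env:
#      litellm --config litellm_config.yaml --port 8000
#      OPENAI_REVERSE_PROXY=http://localhost:8000
```

Because the proxy speaks the OpenAI API shape, LibreChat treats every configured backend as if it were an OpenAI model.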


Change Type

  • Documentation update


Checklist

  • My code adheres to this project's style guidelines
  • I have performed a self-review of my own code
  • I have commented in any complex areas of my code
  • I have made pertinent documentation changes
  • My changes do not introduce new warnings
  • I have written tests demonstrating that my changes are effective or that my feature works
  • Local unit tests pass with my changes
  • Any changes dependent on mine have been merged and published in downstream modules

@ishaan-jaff (Contributor, Author)

@danny-avila we've had a couple of users trying to use LibreChat + LiteLLM; this PR makes it easier to use them together.

We've added this on the main page for LiteLLM Proxy too:
https://docs.litellm.ai/docs/simple_proxy#using-with-openai-compatible-projects

[Screenshot: LiteLLM Proxy docs, Nov 30, 2023, 10:39 AM]

Can I get a review when possible?

@danny-avila danny-avila changed the title [docs] Add LiteLLM Proxy - Load balance 100+ LLMs & Spend Tracking 📚 docs: Add LiteLLM Proxy - Load balance 100+ LLMs & Spend Tracking ⚖️🤖📈 Nov 30, 2023
@danny-avila (Owner)

> Can I get a review when possible?

I added a bit on why to use it, thanks so much!

@danny-avila danny-avila merged commit 2bcfb04 into danny-avila:main Nov 30, 2023
cnkang pushed a commit to cnkang/LibreChat that referenced this pull request Feb 6, 2024
…️🤖📈 (danny-avila#1249)

* (docs) add instructions on using litellm

* Update litellm.md

---------

Co-authored-by: Danny Avila <[email protected]>
BertKiv pushed a commit to BertKiv/LibreChat that referenced this pull request Dec 10, 2024
…️🤖📈 (danny-avila#1249)

* (docs) add instructions on using litellm

* Update litellm.md

---------

Co-authored-by: Danny Avila <[email protected]>