fix(deps): update dependency huggingface-hub to ~=0.33.0 #1505
Conversation
Signed-off-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Reviewer's Guide
This PR updates the development dependency huggingface_hub from ~=0.32.4 to ~=0.33.0 in the project configuration, enabling the latest inference provider integrations, documentation updates, and bug fixes provided by the upstream release.
Sequence Diagram: Using Featherless.AI via huggingface-hub
sequenceDiagram
actor User as Developer
participant IC as InferenceClient
participant FAS as Featherless.AI Service
User->>+IC: InferenceClient(provider="featherless-ai")
User->>+IC: chat.completions.create(model="deepseek-ai/DeepSeek-R1-0528", messages=[{"role": "user", "content": "..."}])
IC->>+FAS: API Request (model, messages)
FAS-->>-IC: Completion Response
IC-->>-User: Completion Object
Sequence Diagram: Using Groq via huggingface-hub
sequenceDiagram
actor User as Developer
participant IC as InferenceClient
participant GS as Groq Service
User->>+IC: InferenceClient(provider="groq")
User->>+IC: chat.completions.create(model="meta-llama/Llama-4-Scout-17B-16E-Instruct", messages=[{"role": "user", "content": "..."}])
IC->>+GS: API Request (model, messages)
GS-->>-IC: Completion Response
IC-->>-User: Completion Object
Class Diagram: huggingface_hub.InferenceClient and Associated Data Objects
classDiagram
class InferenceClient {
+__init__(provider: str)
+chat_completions_create(model: str, messages: list) : Completion
}
class Completion {
+choices: List~Choice~
}
class Choice {
+message: Message
}
class Message {
+role: str
+content: any
}
InferenceClient "1" ..> "1" Completion : creates
Completion "1" o-- "*" Choice : has
Choice "1" o-- "1" Message : has
We may want to consider removing this dependency; it's more hassle than it's worth, since we have our own Hugging Face client.
I agree we should remove it.
This PR contains the following updates: huggingface-hub ~=0.32.4 -> ~=0.33.0
Release Notes
huggingface/huggingface_hub (huggingface-hub)
v0.33.0: Welcoming Featherless.AI and Groq as Inference Providers!
⚡ New provider: Featherless.AI
Featherless AI is a serverless AI inference provider with unique model loading and GPU orchestration abilities that make an exceptionally large catalog of models available to users. Providers often offer either low-cost access to a limited set of models, or an unlimited range of models where users manage servers and the associated costs of operation. Featherless provides the best of both worlds, offering unmatched model range and variety with serverless pricing. Find the full list of supported models on the models page.
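A minimal sketch of calling Featherless.AI through the updated client, mirroring the sequence diagram above. It assumes a Hugging Face token is available (e.g. via the HF_TOKEN environment variable or a prior huggingface-cli login); the prompt text is purely illustrative.

```python
from huggingface_hub import InferenceClient

# Route requests through the Featherless.AI provider.
# Assumes HF_TOKEN is set (or a token was cached via `huggingface-cli login`).
client = InferenceClient(provider="featherless-ai")

completion = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1-0528",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)

# Completion -> choices[0] -> message -> content, as modeled in the class diagram above.
print(completion.choices[0].message.content)
```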
⚡ New provider: Groq
At the heart of Groq's technology is the Language Processing Unit (LPU™), a new type of end-to-end processing unit system that provides the fastest inference for computationally intensive applications with a sequential component, such as Large Language Models (LLMs). LPUs are designed to overcome the limitations of GPUs for inference, offering significantly lower latency and higher throughput. This makes them ideal for real-time AI applications.
Groq offers fast AI inference for openly available models. It provides an API that lets developers easily integrate these models into their applications on an on-demand, pay-as-you-go basis, with access to a wide range of openly available LLMs.
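The same pattern applies to the Groq provider, following the second sequence diagram above; again a hedged sketch that assumes the same token setup and uses an illustrative prompt.

```python
from huggingface_hub import InferenceClient

# Route requests through the Groq provider (same HF token assumptions as above).
client = InferenceClient(provider="groq")

completion = client.chat.completions.create(
    model="meta-llama/Llama-4-Scout-17B-16E-Instruct",
    messages=[{"role": "user", "content": "Explain why low latency matters for real-time AI applications."}],
)

print(completion.choices[0].message.content)
```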
🤖 MCP and Tiny-agents
It is now possible to run tiny-agents against a local inference server, e.g. llama.cpp. 100% local agents are right around the corner!
This release also fixes some DX issues in the tiny-agents CLI, including tiny-agents CLI exit issues by @Wauplin in #3125.
📚 Documentation
New translation from the Hindi-speaking community, for the community!
🛠️ Small fixes and maintenance
😌 QoL improvements
🐛 Bug and typo fixes
🏗️ internal
Significant community contributions
The following contributors have made significant changes to the library over the last release:
Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Never, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.
Summary by Sourcery
Chores: Update the huggingface-hub development dependency from ~=0.32.4 to ~=0.33.0.