fix(deps): update dependency huggingface-hub to ~=0.34.0 #1747

Merged
merged 1 commit into main from renovate/huggingface-hub-0.x on Jul 25, 2025

Conversation

@renovate renovate bot commented Jul 25, 2025

This PR contains the following updates:

Package: huggingface-hub
Change: ~=0.33.0 -> ~=0.34.0

Release Notes

huggingface/huggingface_hub (huggingface-hub)

v0.34.0: Announcing Jobs: a new way to run compute on Hugging Face!

🔥🔥🔥 Announcing Jobs: a new way to run compute on Hugging Face!

We're thrilled to introduce a powerful new command-line interface for running and managing compute jobs on Hugging Face infrastructure! With the new hf jobs command, you can now seamlessly launch, monitor, and manage jobs using a familiar Docker-like experience. Run any command in Docker images (from Docker Hub, Hugging Face Spaces, or your own custom images) on a variety of hardware including CPUs, GPUs, and TPUs - all with simple, intuitive commands.

Key features:

  • 🐳 Docker-like CLI: Familiar commands (run, ps, logs, inspect, cancel) to run and manage jobs
  • 🔥 Any Hardware: Instantly access CPUs, T4/A10G/A100 GPUs, and TPUs with a simple flag
  • 📦 Run Anything: Use Docker images, HF Spaces, or custom containers
  • 📊 Live Monitoring: Stream logs in real-time, just like running locally
  • 💰 Pay-as-you-go: Only pay for the seconds you use
  • 🧬 UV Runner: Run Python scripts with inline dependencies using uv (experimental)

All features are available both from Python (run_job, list_jobs, etc.) and the CLI (hf jobs).

Example usage:

# Run a Python script on the cloud
hf jobs run python:3.12 python -c "print('Hello from the cloud!')"

# Use a GPU
hf jobs run --flavor=t4-small --namespace=huggingface ubuntu nvidia-smi

# List your jobs
hf jobs ps

# Stream logs from a job
hf jobs logs <job-id>

# Inspect job details
hf jobs inspect <job-id>

# Cancel a running job
hf jobs cancel <job-id>

# Run a UV script (experimental)
hf jobs uv run my_script.py --flavor=a10g-small --with=trl

You can also pass environment variables and secrets, select hardware flavors, run jobs in organizations, and use the experimental uv runner for Python scripts with inline dependencies.
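
As a rough Python equivalent of the CLI examples above: run_job and list_jobs are named in the release notes, but the argument names used here (image, command, flavor) and the job attributes are assumptions based on the CLI flags, so this is only a minimal sketch; see the Jobs guide for the authoritative signatures.

# Minimal sketch of the Python Jobs API (argument names assumed to mirror the CLI flags)
from huggingface_hub import run_job, list_jobs

# Launch the same "hello from the cloud" job as the first CLI example
job = run_job(
    image="python:3.12",
    command=["python", "-c", "print('Hello from the cloud!')"],
    flavor="cpu-basic",  # assumed flavor name; GPU flavors such as "t4-small" also exist
)

# List your jobs, like `hf jobs ps`
for j in list_jobs():
    print(j)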

Check out the Jobs guide for more examples and details.

🚀 The CLI is now hf! (formerly huggingface-cli)

We're glad to announce a long-awaited quality-of-life improvement: the Hugging Face CLI has been officially renamed from huggingface-cli to hf! The legacy huggingface-cli remains available without any breaking change, but it is officially deprecated. We took the opportunity to update the syntax to a more modern command format: hf <resource> <action> [options] (e.g. hf auth login, hf repo create, hf jobs run).

Run hf --help to learn more about the CLI options.

$ hf --help
usage: hf <command> [<args>]

positional arguments:
  {auth,cache,download,jobs,repo,repo-files,upload,upload-large-folder,env,version,lfs-enable-largefiles,lfs-multipart-upload}
                        hf command helpers
    auth                Manage authentication (login, logout, etc.).
    cache               Manage local cache directory.
    download            Download files from the Hub
    jobs                Run and manage Jobs on the Hub.
    repo                Manage repos on the Hub.
    repo-files          Manage files in a repo on the Hub.
    upload              Upload a file or a folder to the Hub. Recommended for single-commit uploads.
    upload-large-folder
                        Upload a large folder to the Hub. Recommended for resumable uploads.
    env                 Print information about the environment.
    version             Print information about the hf version.

options:
  -h, --help            show this help message and exit
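
For instance, based on the subcommands listed above, a few common invocations map roughly as follows (an illustration, not an exhaustive list; the legacy commands keep working but are deprecated):

# Legacy (deprecated)                  New syntax
huggingface-cli login              ->  hf auth login
huggingface-cli download gpt2      ->  hf download gpt2
huggingface-cli upload my-model .  ->  hf upload my-model .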

⚡ Inference

🖼️ Image-to-image

Added support for the image-to-image task in the InferenceClient for the Replicate and fal.ai providers, allowing quick image generation using FLUX.1-Kontext-dev:

from huggingface_hub import InferenceClient

client = InferenceClient(provider="fal-ai")
client = InferenceClient(provider="replicate")

with open("cat.png", "rb") as image_file:
   input_image = image_file.read()

### output is a PIL.Image object
image = client.image_to_image(
    input_image,
    prompt="Turn the cat into a tiger.",
    model="black-forest-labs/FLUX.1-Kontext-dev",
)

In addition to this, it is now possible to directly pass a PIL.Image as input to the InferenceClient.
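
For example, a sketch of the same call with a PIL image as input, assuming image_to_image accepts the PIL object in place of the raw bytes as the note above suggests:

from PIL import Image
from huggingface_hub import InferenceClient

client = InferenceClient(provider="fal-ai")

# Pass the PIL.Image directly instead of raw bytes
input_image = Image.open("cat.png")
image = client.image_to_image(
    input_image,
    prompt="Turn the cat into a tiger.",
    model="black-forest-labs/FLUX.1-Kontext-dev",
)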

🤖 Tiny-Agents

tiny-agents got a nice update to deal with environment variables and secrets. We've also changed its input format to follow VS Code's MCP config format more closely. Here is an up-to-date config to run the GitHub MCP Server with a token:

{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "inputs": [
    {
      "type": "promptString",
      "id": "github-personal-access-token",
      "description": "Github Personal Access Token (read-only)",
      "password": true
    }
  ],
  "servers": [
    {
     "type": "stdio",
     "command": "docker",
     "args": [
       "run",
       "-i",
       "--rm",
       "-e",
       "GITHUB_PERSONAL_ACCESS_TOKEN",
       "-e",
       "GITHUB_TOOLSETS=repos,issues,pull_requests",
       "ghcr.io/github/github-mcp-server"
     ],
     "env": {
       "GITHUB_PERSONAL_ACCESS_TOKEN": "${input:github-personal-access-token}"
     }
    }
  ]
}
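
Assuming the config above is saved as agent.json inside a local agent directory (the path ./my-agent here is just a placeholder; the exact layout is described in the tiny-agents docs), the agent can then be launched with the CLI, which should prompt for the declared GitHub token input at startup:

tiny-agents run ./my-agent
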
🐛 Bug fixes

InferenceClient and tiny-agents got a few quality-of-life improvements and bug fixes.

📤 Xet

Integration of Xet is now stable and production-ready. The majority of file transfers on new repos are now handled via this protocol. A few improvements have been shipped to improve the developer experience during uploads.

Documentation has also been written to better explain the protocol and its options.

🛠️ Small fixes and maintenance

🐛 Bug and typo fixes
🏗️ Internal

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Never, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

Signed-off-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
@rhatdan rhatdan left a comment


LGTM

@rhatdan rhatdan merged commit 65ceb73 into main Jul 25, 2025
19 of 61 checks passed
@renovate renovate bot deleted the renovate/huggingface-hub-0.x branch July 25, 2025 10:21