Unpin to support transformers==4.52.3 #1479

Merged

9 commits merged into main from kylesayrs/unpin-transformers on May 29, 2025

Conversation

kylesayrs
Collaborator

@kylesayrs kylesayrs commented May 27, 2025

Purpose

  • Support the latest transformers release

Prerequisites

  • #1481
  • #1411

Fixes

  • #1457

Changes

  • Unpin transformers version
  • Add torchvision, librosa, and soundfile to dev dependencies (needed to test models)
  • Fix default ignore list for tracing debugger
  • Add back llama4 model tests
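The effect of the unpin in the first bullet can be sanity-checked with the `packaging` library (the same machinery pip uses to evaluate version specifiers); the `<4.52.0` pin shown here is an assumption about the previous constraint, not the exact one from setup.py:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

release = Version("4.52.3")

pinned = SpecifierSet("<4.52.0")  # hypothetical previous pin
unpinned = SpecifierSet("")       # no version constraint after the unpin

# The pinned specifier rejects the new release; the unpinned one accepts it.
print(release in pinned)    # False
print(release in unpinned)  # True
```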

@kylesayrs kylesayrs added the ready When a PR is ready for review label May 27, 2025

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: this label is required to complete the testing suite; please add it only once the PR is code complete and local testing has been performed.

@kylesayrs kylesayrs changed the title Unpin Transformers Add testing deps, unpin transformers May 28, 2025
@kylesayrs kylesayrs force-pushed the kylesayrs/unpin-transformers branch from 83cd7aa to 4285c9f Compare May 28, 2025 02:27
@kylesayrs kylesayrs marked this pull request as ready for review May 28, 2025 02:27
Signed-off-by: Kyle Sayers <[email protected]>
@kylesayrs kylesayrs changed the base branch from main to kylesayrs/autowrapper May 28, 2025 03:12
@kylesayrs kylesayrs changed the base branch from kylesayrs/autowrapper to main May 28, 2025 03:12
@rahul-tuli
Collaborator

Could you stack this PR on top of #1411? It'll be easier to review.

@kylesayrs
Collaborator Author

@rahul-tuli This was pointed to main so CI would run

@kylesayrs kylesayrs removed the ready When a PR is ready for review label May 29, 2025
kylesayrs added 2 commits May 29, 2025 13:34
Signed-off-by: Kyle Sayers <[email protected]>
Signed-off-by: Kyle Sayers <[email protected]>
@kylesayrs kylesayrs changed the title Add testing deps, unpin transformers Unpin to support transformers==4.52.3 May 29, 2025
@kylesayrs kylesayrs added the ready When a PR is ready for review label May 29, 2025
markurtz
markurtz previously approved these changes May 29, 2025
Collaborator

@brian-dellabetta brian-dellabetta left a comment

one nit

@brian-dellabetta
Collaborator

This closes #1457

Signed-off-by: Kyle Sayers <[email protected]>
@kylesayrs
Collaborator Author

@brian-dellabetta Sounds good, I've reverted the breakout

Collaborator

@brian-dellabetta brian-dellabetta left a comment

looks great, thanks!

Collaborator

@shanjiaz shanjiaz left a comment

Awesome! Looks good to me

@kylesayrs kylesayrs enabled auto-merge (squash) May 29, 2025 21:08
@kylesayrs kylesayrs merged commit 4888f34 into main May 29, 2025
11 checks passed
@kylesayrs kylesayrs deleted the kylesayrs/unpin-transformers branch May 29, 2025 21:50
kylesayrs added a commit that referenced this pull request Jun 3, 2025
## Purpose ##
* Add support for mistral3
* Related: #1343

## Prerequisites ##
* #1479

## Changes ##
* Added mistral3 example
* This model does not automatically change the dtype of pixel_values to
match the dtype of the model, so I had to do so manually in the collator
and sample generation
* This model has a [very verbose chat template by
default](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503/blob/main/chat_template.json),
which may be less conducive to calibration, so I added a custom
shortened version
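
The manual `pixel_values` cast described above could look roughly like the following collator sketch; the single-sample pattern and the `model_dtype` default are assumptions, so match the dtype to your loaded model:

```python
import torch

def data_collator(batch, model_dtype=torch.bfloat16):
    """Collate one calibration sample, casting pixel_values to the model's
    dtype, since mistral3 reportedly does not perform this cast itself.
    `model_dtype` here is an assumption; match it to your loaded model."""
    assert len(batch) == 1  # one sample at a time, as in the vision examples
    sample = {key: torch.tensor(value) for key, value in batch[0].items()}
    # Floats default to float32; cast image features to the model dtype.
    sample["pixel_values"] = sample["pixel_values"].to(model_dtype)
    return sample

# Illustrative fake sample, not real model inputs.
collated = data_collator([{"input_ids": [[1, 2, 3]], "pixel_values": [[0.5, 0.25]]}])
print(collated["pixel_values"].dtype)  # torch.bfloat16
```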

## Testing ##
* Ran example to completion:
[nm-testing/Mistral-Small-3.1-24B-Instruct-2503-W4A16-G128](https://huggingface.co/nm-testing/Mistral-Small-3.1-24B-Instruct-2503-W4A16-G128)
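
A shortened chat template of the kind mentioned in the changes above might, as a rough sketch, reduce each turn to a bare role/content line; the template text and message below are illustrative, not the actual template shipped in the PR:

```python
from jinja2 import Template

# Hypothetical minimal template: plain role/content turns, dropping the
# long preamble of the model's default chat template.
SHORT_TEMPLATE = (
    "{% for message in messages %}"
    "[{{ message['role'] | upper }}] {{ message['content'] }}\n"
    "{% endfor %}"
)

rendered = Template(SHORT_TEMPLATE).render(
    messages=[{"role": "user", "content": "Describe this image."}]
)
print(rendered)  # [USER] Describe this image.
```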

---------

Signed-off-by: Kyle Sayers <[email protected]>
aireilly pushed a commit to aireilly/llm-compressor that referenced this pull request Jul 30, 2025
aireilly pushed a commit to aireilly/llm-compressor that referenced this pull request Jul 30, 2025