Fix Weekly Test Failure #8


Merged

merged 2 commits into main on Jun 27, 2024
Conversation

@Satrat (Contributor) commented Jun 25, 2024

Original failure was from sparseml: https://github.com/neuralmagic/sparseml/actions/runs/9636050064/job/26573555739

The issue was that when doing sparsity -> quantization, the quantization modifier needs to be included before SparseGPT in the recipe. After restructuring the recipe, the test passes locally for the 7b model. This PR also updates the device argument to "auto" for all models.
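
For reference, here is a minimal sketch of the ordering this fix is about. The recipe string, modifier arguments, and model id below are illustrative assumptions, not the actual test recipe touched by this PR, and the exact YAML keys depend on the sparseml / llm-compressor version in use. The point is only that the quantization modifier is declared before SparseGPT when doing sparsity -> quantization, and that the model is loaded with device_map="auto".

```python
# Illustrative sketch only -- not the recipe changed in this PR.
# Assumes a sparseml-style YAML recipe where modifier order matters:
# the quantization modifier must come before SparseGPT.
from transformers import AutoModelForCausalLM

# Hypothetical recipe; the modifier arguments are placeholders.
recipe = """
test_stage:
  modifiers:
    QuantizationModifier:   # quantization config registered first
      ignore: ["lm_head"]
    SparseGPTModifier:      # then one-shot pruning/compression
      sparsity: 0.5
"""

# device_map="auto" lets accelerate spread layers across the available
# GPUs/CPU, matching the "auto" device argument change described above.
model = AutoModelForCausalLM.from_pretrained(
    "hypothetical/7b-model",  # placeholder model id
    device_map="auto",
)
```

The restructuring described above amounts to moving the quantization modifier ahead of SparseGPT so it is applied first, which is what the failing weekly run in the linked sparseml job was missing.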

@Satrat Satrat requested a review from bfineran June 25, 2024 19:39
@Satrat Satrat requested a review from dsikka June 25, 2024 19:57
@dsikka (Collaborator) left a comment


Could we make sure auto works with the intended runner before we make this change?

@Satrat (Contributor, Author) commented Jun 25, 2024

> Could we make sure auto works with the intended runner before we make this change?

Sure, how would I test that? Right now I'm just setting export CADENCE="weekly" and then running pytest on a Kubernetes pod.
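
For anyone reproducing this, a hypothetical sketch of a cadence-gated test is below. Only the CADENCE environment variable comes from the conversation above; the marker name, test name, and skip logic are assumptions and may not match how the repo actually gates its weekly tests.

```python
# Hypothetical sketch of a cadence-gated test; the actual gating in the
# repo may differ. Only the CADENCE environment variable is taken from
# the conversation above.
import os
import pytest

requires_weekly = pytest.mark.skipif(
    os.environ.get("CADENCE", "commit") != "weekly",
    reason="only runs when CADENCE=weekly",
)

@requires_weekly
def test_7b_sparsity_then_quantization():
    # placeholder body for the weekly one-shot compression test
    ...
```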

@Satrat Satrat merged commit 9384edf into main Jun 27, 2024
7 of 12 checks passed
@Satrat Satrat deleted the sa/fix_weekly_test branch June 27, 2024 14:16
markmc pushed a commit to markmc/llm-compressor that referenced this pull request Nov 13, 2024
gradients shouldn't be computed for Q/DQ in QAT