
Conversation

**@qgallouedec** (Member) commented on Apr 8, 2025

The PR huggingface/transformers#37173 removes the `GenerationMixin` inheritance by default in `PreTrainedModel`. Consequently, `AutoModel` now returns objects that do not have a `prepare_inputs_for_generation` method.

This line:

```python
self.prepare_inputs_for_generation = pretrained_model.prepare_inputs_for_generation
```

causes a premature error in the test `test_modeling_value_head.py::Seq2SeqValueHeadModelTester::test_raise_error_not_causallm`:

```python
def test_raise_error_not_causallm(self):
    # Test with a model without a LM head
    model_id = "trl-internal-testing/tiny-T5ForConditionalGeneration"
    # This should raise a ValueError
    with self.assertRaises(ValueError):
        pretrained_model = AutoModel.from_pretrained(model_id)
        _ = self.trl_model_class.from_pretrained(pretrained_model)
```

The wrapped call is bound to fail anyway, since failing is exactly what the test checks; the problem is that it now fails before the expected `ValueError` can be raised.
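The failure mode can be sketched with plain Python classes (the class names below are illustrative stand-ins, not the actual transformers/TRL classes):

```python
class ModelWithoutGenerationMixin:
    """Stand-in for what AutoModel now returns: no GenerationMixin methods."""


class ValueHeadWrapper:
    """Illustrative stand-in for a value-head model wrapper."""

    def __init__(self, pretrained_model):
        # This attribute access raises AttributeError immediately,
        # before any later ValueError check in the wrapper can run.
        self.prepare_inputs_for_generation = pretrained_model.prepare_inputs_for_generation


try:
    ValueHeadWrapper(ModelWithoutGenerationMixin())
except AttributeError as exc:
    print(type(exc).__name__)  # AttributeError, not the ValueError the test expects
```

Because `assertRaises(ValueError)` only catches `ValueError`, the earlier `AttributeError` makes the test fail for the wrong reason.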

I propose to remove this test: it exercises a rarely used part of the codebase, with a model that is likewise rarely used.
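For reference, a defensive lookup would avoid the premature `AttributeError` (this is not what the PR does, which removes the test instead; the snippet below is a hypothetical alternative using only a stand-in class):

```python
class ModelWithoutGenerationMixin:
    """Stand-in for an AutoModel that no longer inherits GenerationMixin."""


# getattr with a default returns None instead of raising AttributeError,
# so any later ValueError check in a constructor could still run.
prepare_fn = getattr(ModelWithoutGenerationMixin(), "prepare_inputs_for_generation", None)
print(prepare_fn)  # None
```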

**@HuggingFaceDocBuilderDev** commented

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

**@shirinyamani** (Member) left a comment

I see the error is not related to this PR; it apparently comes from the release.

@qgallouedec merged commit 227df82 into `main` on Apr 9, 2025 (10 checks passed).
@qgallouedec deleted the `fix-ci` branch on April 9, 2025 at 17:39.
yxliu-TAMU pushed a commit to mincheolseong/ECEN743-GRPO-Project-Proposal that referenced this pull request Apr 20, 2025