
[fix] type error of quantile #3667


Merged: 2 commits merged into huggingface:main on Jul 4, 2025

Conversation

gitabtion (Contributor) commented on Jun 30, 2025

What does this PR do?

entropies is a bfloat16 tensor when training with bf16, but the input tensor to quantile() must have either float or double dtype.

entropy_threshold = torch.quantile(entropies.flatten(), self.token_entropy_percentile_threshold)
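The diff itself isn't shown in this conversation; below is a minimal sketch of the kind of fix described, assuming the entropies are simply cast to float32 before the quantile call (the tensor shape and threshold value here are made up for illustration):

```python
import torch

# Simulate entropies produced under bf16 training (hypothetical shape/values).
entropies = torch.rand(4, 128, dtype=torch.bfloat16)
token_entropy_percentile_threshold = 0.8

# On torch<=2.6, torch.quantile() rejects bfloat16 input because it only
# accepts float or double tensors; casting to float32 first avoids the error.
entropy_threshold = torch.quantile(
    entropies.flatten().float(), token_entropy_percentile_threshold
)
```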

Fixes #3666 (quantile() input type error when using bf16)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

kashif (Collaborator) commented on Jun 30, 2025

Thanks @gitabtion, are you using torch<=2.6?

gitabtion (Contributor, Author) commented

@kashif Yes, "torch==2.6.0+cu124".

kashif merged commit e8abe03 into huggingface:main on Jul 4, 2025
marcandrelarochelle pushed a commit to marcandrelarochelle/trl that referenced this pull request on Jul 29, 2025