
Align mlx::core::min op nan propagation with NumPy #2346


Merged: 1 commit into ml-explore:main on Jul 10, 2025

Conversation

jhavukainen (Contributor)

Proposed changes

Follow-up to #2339, extending the same treatment to the min op so that NaNs are propagated.

Adds the same NaN returns for non-integral types and the specialized handling for complex types. Extends the reduce NaN-propagation tests to also cover the min case, and adds benchmarks for min.
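
For illustration only, here is a minimal sketch (not code from this PR) of the behavior this change targets, assuming the Python bindings expose the reduction as `mx.min`:

```python
# Minimal sketch of the intended NaN propagation (illustrative, not PR code).
import math

import mlx.core as mx
import numpy as np

data = [1.0, float("nan"), 3.0]

# NumPy propagates NaN through min reductions.
assert math.isnan(np.min(np.array(data)))

# With this change, mlx.core.min should match NumPy instead of ignoring the NaN.
assert math.isnan(mx.min(mx.array(data)).item())
```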

Checklist

Put an x in the boxes that apply.

  • [x] I have read the CONTRIBUTING document
  • [x] I have run `pre-commit run --all-files` to format my code / installed pre-commit prior to committing changes
  • [x] I have added tests that prove my fix is effective or that my feature works
  • [x] I have updated the necessary documentation (if needed)

@awni (Member) left a comment:

Awesome, thanks!

@awni (Member) commented on Jul 10, 2025:

The test failure isn't from this diff. We have a flaky quantization test that NaN propagation exposes. But we'll need to fix that in a separate PR.

@awni merged commit 8c7bc30 into ml-explore:main on Jul 10, 2025. 5 of 6 checks passed.