
Conversation

folkertdev
Collaborator

@folkertdev commented Apr 17, 2025

This makes three changes to the uncompress fuzzer:

  • rename uncompress2 to uncompress (the old uncompress fuzzer was removed)
  • when gathering code coverage, keep invalid inputs in the corpus, so that the coverage report shows which error paths we cover
  • feed the input in chunks of 64 bytes, to exercise more of the streaming logic (see the sketch below)
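
A rough sketch of what the chunked feeding looks like. This is not the actual fuzz target: it uses `libfuzzer_sys::fuzz_target!`, and `flate2::Decompress` stands in for the zlib-rs inflate API that the real target drives.

```rust
use flate2::{Decompress, FlushDecompress, Status};
use libfuzzer_sys::fuzz_target;

fuzz_target!(|data: &[u8]| {
    // Small enough to hit interesting cases, but large enough to hit the fast path.
    let chunk_size = 64;

    // Stand-in streaming decompressor; the real fuzz target drives zlib-rs directly.
    let mut decoder = Decompress::new(true); // `true` = expect a zlib header
    let mut output = vec![0u8; 1 << 16];

    for chunk in data.chunks(chunk_size) {
        let mut offset = 0;
        // Offer the chunk until it is fully consumed, so the inflate state machine
        // has to suspend and resume on partial input.
        while offset < chunk.len() {
            let before = decoder.total_in();
            match decoder.decompress(&chunk[offset..], &mut output, FlushDecompress::None) {
                // Malformed input is expected from a fuzzer; returning here still
                // records the error-path coverage of this run.
                Err(_) => return,
                Ok(Status::StreamEnd) => return,
                Ok(_) => {
                    let used = (decoder.total_in() - before) as usize;
                    if used == 0 {
                        // No forward progress (e.g. the output buffer is full); stop.
                        return;
                    }
                    offset += used;
                }
            }
        }
    }
});
```

The point is the outer `chunks(chunk_size)` loop: each 64-byte slice forces the state machine to stop and resume mid-stream instead of decompressing the whole input in one call.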

With those changes we get much better coverage of inflate.rs:

https://app.codecov.io/github/trifectatechfoundation/zlib-rs/commit/291a345601f0e817b1fb2a385e3df400e576662b/blob/zlib-rs/src/inflate.rs?flags%5B0%5D=fuzz-decompress&dropdown=coverage

cc @inahga


codecov bot commented Apr 17, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Flag                            Coverage Δ
fuzz-compress                   41.78% <ø> (ø)
fuzz-decompress                 30.86% <ø> (+0.68%) ⬆️
test-aarch64-apple-darwin       89.57% <ø> (+0.10%) ⬆️
test-x86_64-apple-darwin        87.58% <ø> (-0.09%) ⬇️
test-x86_64-unknown-linux-gnu   90.40% <ø> (+0.33%) ⬆️

Flags with carried forward coverage won't be shown.

see 5 files with indirect coverage changes


@folkertdev force-pushed the fuzz-coverage-error-paths branch from bc07162 to 291a345 on April 17, 2025 15:05
@folkertdev requested a review from bjorn3 on April 17, 2025 15:18
@folkertdev force-pushed the fuzz-coverage-error-paths branch from 291a345 to 893cc15 on April 21, 2025 19:31
Review thread on these lines of the fuzz target:

    data.len(),
)

// Small enough to hit interesting cases, but large enough to hit the fast path
let chunk_size = 64;
Collaborator

Would varying the chunk size make sense? Especially non-power-of-two chunk sizes seem like they could be useful for finding edge-cases.

Collaborator Author

It might, but I don't see how we could do that in a deterministic way and continue to use the compression corpus.

Collaborator

You could take the chunk size from the fuzz input, right? For example, interpreting the fuzz input as a series of records: 1 byte of chunk size followed by that many bytes of chunk data.
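
For illustration only, a minimal sketch of that encoding; `chunks_from_input` is a hypothetical helper (not part of this PR), and the handling of zero or oversized length bytes is just one possible convention:

```rust
/// Split the raw fuzz input into chunks using a leading length byte per record:
/// `[1-byte length][that many bytes of chunk data]`, repeated.
fn chunks_from_input(mut data: &[u8]) -> Vec<&[u8]> {
    let mut chunks = Vec::new();
    while let Some((&len, rest)) = data.split_first() {
        if rest.is_empty() {
            break;
        }
        // Clamp so an oversized length byte still yields a valid slice, and bump a
        // zero length to 1 so the loop always makes progress.
        let len = (len as usize).clamp(1, rest.len());
        let (chunk, tail) = rest.split_at(len);
        chunks.push(chunk);
        data = tail;
    }
    chunks
}
```

The fuzz target would then feed `chunks_from_input(data)` to the decompressor one chunk at a time instead of fixed 64-byte slices, keeping each run deterministic for a given input while letting the mutator explore chunk boundaries.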

Collaborator Author

Would that not mess with how effective the corpus is?

Collaborator

You could reformat the existing corpus into this format, right? In any case I guess this is better left as a future improvement.
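
A hypothetical one-off migration (not part of this PR) could length-prefix the existing corpus entries in fixed 64-byte records, matching the encoding sketched above, so that old inputs keep their current chunking:

```rust
/// Re-encode an existing corpus entry into the length-prefixed format sketched
/// above, using fixed 64-byte records so old inputs keep today's chunking.
fn reencode_corpus_entry(raw: &[u8]) -> Vec<u8> {
    let mut encoded = Vec::with_capacity(raw.len() + raw.len() / 64 + 1);
    for chunk in raw.chunks(64) {
        encoded.push(chunk.len() as u8); // chunks are 1..=64 bytes, always fits
        encoded.extend_from_slice(chunk);
    }
    encoded
}
```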

@folkertdev merged commit 38b1d09 into main on Apr 22, 2025
24 checks passed
@folkertdev deleted the fuzz-coverage-error-paths branch on April 22, 2025 10:15