
Confusion Matrix Missing False Negatives #8729

@jbutle55

Description

Search before asking

  • I have searched the YOLOv5 issues and found no similar bug report.

YOLOv5 Component

Validation

Bug

In val.py, during the "Evaluate" stage, each image's contribution to the confusion matrix is computed using:

                if plots:
                    confusion_matrix.process_batch(predn, labelsn)
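
For context, ConfusionMatrix in utils/metrics.py keeps an (nc + 1) x (nc + 1) matrix where the extra index stands for background: a ground-truth box with no matching detection should be counted against the background row as a false negative. A minimal standalone sketch of that accounting (the helper name and shapes below are illustrative, not the exact utils/metrics.py internals):

    import numpy as np

    nc = 3  # number of classes (illustrative)
    # rows = predicted class, cols = actual class; index nc = background
    matrix = np.zeros((nc + 1, nc + 1), dtype=int)

    def record_unmatched_gt(gt_classes):
        # hypothetical helper: each unmatched ground-truth box is a false
        # negative, i.e. background was predicted for its actual class
        for gc in gt_classes:
            matrix[nc, gc] += 1

    record_unmatched_gt([0, 2, 2])  # three missed ground-truth boxes
    print(matrix[nc])               # per-class FN counts: [1 0 2 0]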

However, if there are zero predictions for the image in question, this code is skipped entirely because of the earlier check:

            if len(pred) == 0:
                if nl:
                    stats.append((torch.zeros(0, niou, dtype=torch.bool), torch.Tensor(), torch.Tensor(), tcls))
                continue

If this continue statement is reached, the image is never processed for the confusion matrix. But when that image does have ground-truth objects, every one of them is a missed detection (since len(pred) was 0), so these false negatives are never recorded in the confusion matrix.
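
One possible fix (a sketch only, not tested): update the confusion matrix before the continue. Note that labelsn is only built later in the "Evaluate" block, so the raw labels tensor is passed here; ConfusionMatrix.process_batch would also need to accept detections=None and tally every label class against the background row, since there is nothing to match against:

            if len(pred) == 0:
                if nl:
                    stats.append((torch.zeros(0, niou, dtype=torch.bool), torch.Tensor(), torch.Tensor(), tcls))
                    if plots:
                        # no predictions at all: every ground-truth box is a
                        # missed detection, so record each label class as a
                        # background prediction (false negative)
                        confusion_matrix.process_batch(None, labels)
                continue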

Environment

No response

Minimal Reproducible Example

No response

Additional

No response

Are you willing to submit a PR?

  • Yes I'd like to help by submitting a PR!
