Description
Search before asking
- I have searched the YOLOv5 issues and found no similar bug report.
YOLOv5 Component
AutoShape, DetectMultiBackend
Bug
In PR #9363, the following line was added to the constructor of the AutoShape wrapper:
Line 603 in 23701ea:

```python
m.export = True  # do not output loss values
```
i.e. if the model that AutoShape wraps is a PyTorch model, the `export` class attribute of the Detect layer is set to `True`.

Why was this added, and is it correct?

This changes the behavior of the Detect layer depending on whether the AutoShape wrapper is used. The forward pass of the Detect layer returns the tuple `(output, input)` when `export=False` and only `output` when `export=True`. So depending on whether AutoShape or DetectMultiBackend is used, the Detect layer (and thus the model) outputs either one or two values. In my opinion it should not: the API of the Detect layer should not depend on a wrapper that the Detect layer is not even aware of. See the example below.
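To make the issue concrete, here is a minimal, self-contained sketch (not the actual YOLOv5 `Detect` code; `DetectSketch` and its box-decoding stand-in are illustrative) of how a single class attribute can change a layer's return type for every caller:

```python
import torch
import torch.nn as nn


class DetectSketch(nn.Module):
    """Toy stand-in for a Detect-style head whose return type
    depends on a class-level `export` flag."""

    export = False  # class attribute; AutoShape-style wrappers flip this to True

    def forward(self, x):
        # Stand-in for box decoding: concatenate the per-scale feature maps.
        z = torch.cat([xi.flatten(1) for xi in x], 1)
        # The API changes with the flag: tensor only, or (tensor, inputs).
        return z if self.export else (z, x)


layer = DetectSketch()
inputs = [torch.zeros(2, 4), torch.zeros(2, 4)]
print(type(layer(inputs)))  # tuple when export=False

DetectSketch.export = True  # flipping the *class* attribute affects all instances
print(type(layer(inputs)))  # Tensor when export=True
```

Because `export` is a class attribute, setting it through one wrapper affects every instance that has not shadowed it, which is why the wrapper choice leaks into the layer's API.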
Environment
- YOLO: latest master
- OS: Ubuntu 20.04
- Python: 3.8.10
Minimal Reproducible Example
```python
import torch

model_autoshape = torch.hub.load(
    "ultralytics/yolov5",
    model="yolov5s",
    pretrained=True,
    force_reload=True,
    autoshape=True,
)
model_dmb = torch.hub.load(
    "ultralytics/yolov5",
    model="yolov5s",
    pretrained=True,
    force_reload=True,
    autoshape=False,
)

images = torch.zeros((2, 3, 640, 640)).to(model_dmb.device)
output_autoshape = model_autoshape(images)
output_dmb = model_dmb(images)

print("AutoShape: ", type(output_autoshape), output_autoshape.shape)
print("DetectMultiBackend: ", type(output_dmb), [type(o) for o in output_dmb])
```
This outputs:

```
AutoShape:  <class 'torch.Tensor'> torch.Size([2, 25200, 85])
DetectMultiBackend:  <class 'list'> [<class 'torch.Tensor'>, <class 'list'>]
```
i.e. AutoShape returns only the Detect layer's output (the bounding box proposals), while DetectMultiBackend returns a tuple consisting of the Detect layer's output and its input.
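Until the underlying inconsistency is resolved, callers that must handle both paths can normalize the return value. This is a caller-side workaround sketch under the assumptions above (`unwrap_detect_output` is a hypothetical helper, not part of YOLOv5):

```python
import torch


def unwrap_detect_output(output):
    """Return just the detection tensor, whether the model returned
    the tensor alone (export=True path) or a (tensor, inputs) pair
    (export=False path). Hypothetical helper for illustration only."""
    if isinstance(output, (tuple, list)) and len(output) > 0 and torch.is_tensor(output[0]):
        return output[0]
    return output
```

With this helper, `unwrap_detect_output(model(images))` yields the `(batch, proposals, 85)` tensor regardless of which wrapper produced the output.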
Additional
No response
Are you willing to submit a PR?
- Yes, I'd like to help by submitting a PR!