Description
Search before asking
- I have searched the YOLOv5 issues and found no similar bug report.
YOLOv5 Component
Validation, Detection, Export
Bug
Hi,
1) I have a problem when running detect.py with a TFLite model.
I exported my model correctly with the command:
python export.py --weights yolov5s.pt --include tflite
But both benchmarks.py and detect.py fail at inference (and validation) with yolov5s-fp16.tflite:
python detect.py --weights yolov5n-fp16.tflite --source /usr/src/yolov5/myfile.jpg
I tried both GPU and CPU, but it seems to be a tensor shape problem. This is my error:
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Model summary: 1 layers, 0 parameters, 0 gradients
totale shape torch.Size([1, 100])
Traceback (most recent call last):
File "detect.py", line 282, in
main(opt)
File "detect.py", line 277, in main
run(**vars(opt))
*File "/opt/conda/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(args, kwargs)
File "detect.py", line 147, in run
pred = non_max_suppression(pred, conf_thres, iou_thres, classes, agnostic_nms, max_det=max_det)
File "/usr/src/yolov5/utils/general.py", line 825, in non_max_suppression
nc = prediction.shape[2] - nm - 5 # number of classes
IndexError: tuple index out of range
The shape of the tensor called pred is [1, 100] with the TFLite model, but it is [1, 17640, 85] with the .pt model using the same options for both.
I tried to set nc = 10 (the number of classes in VisDrone) and it then runs without errors, but detection fails.
If I export my model using:
python export.py --weights yolov5n.pt --dynamic --include tflite
then when running detect.py the printed shape is torch.Size([1, 400, 3, 85]) and I get the same error.
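To check whether the problem is in the exported model itself or in how detect.py handles its output, the raw output shape can be read straight from the TFLite interpreter. This is only a minimal sketch (the file name yolov5n-fp16.tflite and the dummy input are assumptions based on the commands above):

```python
# Minimal sketch: query the exported TFLite model's input/output tensors and
# run one dummy inference to see the concrete output shape, independent of detect.py.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="yolov5n-fp16.tflite")  # assumed path
interpreter.allocate_tensors()

in_det = interpreter.get_input_details()[0]
out_det = interpreter.get_output_details()[0]
print("input:", in_det["shape"], in_det["dtype"])
print("output:", out_det["shape"], out_det["dtype"])

# One dummy inference using the model's own input shape and dtype
dummy = np.zeros(in_det["shape"], dtype=in_det["dtype"])
interpreter.set_tensor(in_det["index"], dummy)
interpreter.invoke()
pred = interpreter.get_tensor(out_det["index"])
print("pred shape:", pred.shape)  # a 640x640 COCO export is expected to give (1, 25200, 85)
```

If this already prints a [1, 100] output here, the issue would be in the export rather than in detect.py.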
2) One last question: can I use the GPU to run inference on a TFLite model with TensorFlow? Why can it delegate only to the CPU?
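For context, this is the kind of call I would expect for attaching a GPU delegate via the plain TF Lite Python API. It is only a sketch under the assumption that a GPU delegate shared library has been built separately (the library name below is an assumption; the stock pip tensorflow wheel only gives me the XNNPACK CPU path):

```python
# Minimal sketch: try to attach a TFLite GPU delegate, falling back to the
# default CPU (XNNPACK) interpreter if the delegate library is not available.
import tensorflow as tf

MODEL = "yolov5n-fp16.tflite"  # assumed path
try:
    delegate = tf.lite.experimental.load_delegate("libtensorflowlite_gpu_delegate.so")  # assumed, self-built library
    interpreter = tf.lite.Interpreter(model_path=MODEL, experimental_delegates=[delegate])
except (ValueError, OSError):
    interpreter = tf.lite.Interpreter(model_path=MODEL)

interpreter.allocate_tensors()
```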
Thanks all for your help!
Environment
- OS Ubuntu 20.04
- Docker version 20.10.12
- tensorflow 2.10.0
- torch 1.12.0+cu116