Commit 950a85d

TensorRT PyTorch Hub inference fix (#7560)
Solution proposed in #7128 for the TensorRT PyTorch Hub CUDA illegal memory errors.

1 parent: c16671f

File tree

1 file changed: +1 −1


models/common.py

Lines changed: 1 addition & 1 deletion
@@ -531,7 +531,7 @@ def forward(self, imgs, size=640, augment=False, profile=False):
         #   multiple: = [Image.open('image1.jpg'), Image.open('image2.jpg'), ...]  # list of images
 
         t = [time_sync()]
-        p = next(self.model.parameters()) if self.pt else torch.zeros(1)  # for device and type
+        p = next(self.model.parameters()) if self.pt else torch.zeros(1, device=self.model.device)  # for device, type
         autocast = self.amp and (p.device.type != 'cpu')  # Automatic Mixed Precision (AMP) inference
         if isinstance(imgs, torch.Tensor):  # torch
             with amp.autocast(autocast):
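
Why the one-line change matters: when the backend is not native PyTorch (`self.pt` is False, e.g. a TensorRT engine), the old code created the dummy tensor `p` on the CPU, and later in `forward()` tensor inputs are moved to `p.device`. That left inputs on the CPU while the TensorRT engine expected CUDA memory, which can surface as a CUDA illegal memory access. Creating the dummy tensor on `self.model.device` keeps inputs on the engine's device. Below is a minimal usage sketch; the engine filename, export command, and dummy input are assumptions for illustration, not part of the commit.

```python
import torch

# Assumed: a TensorRT engine exported beforehand, e.g. with
#   python export.py --weights yolov5s.pt --include engine --device 0
# Load it through PyTorch Hub; the AutoShape wrapper around DetectMultiBackend
# is what runs the forward() shown in the diff above.
model = torch.hub.load('ultralytics/yolov5', 'custom', path='yolov5s.engine')

# A tensor input takes the `isinstance(imgs, torch.Tensor)` branch above.
# With the fix, p = torch.zeros(1, device=self.model.device) is a CUDA tensor,
# so the input stays on the GPU that the TensorRT engine expects.
imgs = torch.zeros(1, 3, 640, 640, device='cuda')  # BCHW, values in 0-1
results = model(imgs)
```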
