Description
🐞Describing the bug
torch.mm with int32 operands fails at runtime in Core ML: predict() cannot build the model execution plan and aborts with error code -14 (see the stack trace below).
Stack Trace
/opt/miniconda3/envs/op-et/lib/python3.10/site-packages/coremltools/models/model.py:560: RuntimeWarning: You will not be able to run predict() on this Core ML model. Underlying exception message was: {
NSLocalizedDescription = "Failed to build the model execution plan using a model architecture file '/private/var/folders/lw/phxpy6k10ll388xs18hyq1cr0000gn/T/tmpcb8uw0e1.mlmodelc/model.mil' with error code: -14.";
}
_warnings.warn(
Traceback (most recent call last):
File "/Users/scroy/Desktop/executorch/test.py", line 159, in <module>
out = mlmodel.predict(predict_inputs)
File "/opt/miniconda3/envs/op-et/lib/python3.10/site-packages/coremltools/models/model.py", line 804, in predict
raise self._framework_error
File "/opt/miniconda3/envs/op-et/lib/python3.10/site-packages/coremltools/models/model.py", line 549, in _get_proxy_and_spec
_MLModelProxy(
RuntimeError: {
NSLocalizedDescription = "Failed to build the model execution plan using a model architecture file '/private/var/folders/lw/phxpy6k10ll388xs18hyq1cr0000gn/T/tmpcb8uw0e1.mlmodelc/model.mil' with error code: -14.";
To Reproduce
import torch

class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # int32 weight matrix
        self.weight = torch.randint(0, 100, (8, 8)).to(torch.int32)

    def forward(self, x):
        return torch.mm(x, self.weight)

model = Model()
inputs = (
    torch.randn(8, 8).to(torch.int32),
)
eager_outputs = model(*inputs)

ep = torch.export.export(model, inputs)
print(ep)

import coremltools as ct
import numpy as np

ep = ep.run_decompositions({})
mlmodel = ct.convert(ep)

coreml_inputs = mlmodel.get_spec().description.input
coreml_outputs = mlmodel.get_spec().description.output
predict_inputs = {str(ct_in.name): pt_in.detach().cpu().numpy().astype(np.int32) for ct_in, pt_in in zip(coreml_inputs, inputs)}

out = mlmodel.predict(predict_inputs)  # fails here with error code -14
print("Eager", eager_outputs)
print("CoreML", out)
System environment (please complete the following information):
- coremltools version: 8.3
- OS (e.g. MacOS version or Linux type): macOS 15