Replies: 8 comments
-
I should add that the following runs fine, indicating that onnx considers the model valid:

```python
import onnx

# Load the ONNX model
onnx_model = onnx.load("acai.onnx")
# Check that the model is well formed
onnx.checker.check_model(onnx_model)
print("ONNX model is valid.")
```
-
Ok, so looking at the onnx graph, I see that the inputs of the model are:
Could it be that …
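(For reference, a minimal sketch of how to list a model's input names and shapes with the onnx Python API, assuming the same acai.onnx file as above:)

```python
import onnx

model = onnx.load("acai.onnx")
# Each graph input is a ValueInfoProto; a dimension is either concrete
# (dim_value) or symbolic (dim_param)
for inp in model.graph.input:
    dims = [d.dim_value if d.HasField("dim_value") else d.dim_param
            for d in inp.type.tensor_type.shape.dim]
    print(inp.name, dims)
```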
-
Also, running it with onnxruntime:

```python
import numpy as np
import onnxruntime as ort

# Create an ONNX Runtime session
ort_session = ort.InferenceSession("acai.onnx")

# Prepare random input data for the model's two inputs
input_images = np.random.random((1, 63, 63, 3)).astype(np.float32)
input_metadata = np.random.random((1, 25)).astype(np.float32)

# Run the model
ort_inputs = {
    ort_session.get_inputs()[0].name: input_metadata,  # the first input is the metadata
    ort_session.get_inputs()[1].name: input_images,    # the second input is the images
}
ort_output = ort_session.run(
    output_names=[ort_session.get_outputs()[0].name],
    input_feed=ort_inputs,
)
print(f"ONNX Runtime Output: {ort_output}")
```
-
Hello, can you give me access to the model somehow?
-
Hey @kali! Sure thing, I linked the h5 above, but let me send the onnx version. Here it is:
PS: Sorry, I had to add another extension at the end for it to upload; you might have to rename it.
-
BTW, I should mention that since I opened that issue, I got it to work with pykeio's ort crate, which validates that the issue doesn't come from the model being invalid or anything like that.
-
Moving this to a discussion as it is a recurring question.
-
I've been trying to convert this model to onnx using the code from examples/keras-tract-tf2 (with a few edits to match the inputs of my model). Converting to onnx doesn't yield any errors, and neither does loading the result with `tract_model = tract.onnx().model_for_path("acai.onnx")`, but `tract_model.into_optimized()` (in both Rust and Python) yields the following error:

The model is multimodal: one branch is a CNN, the other a very basic MLP. After flattening the CNN output, the results from both branches are concatenated and followed by a couple of simple layers leading to a single output neuron (to get a score from 0 to 1).

I'm not entirely sure why concatenating two 1D tensors is failing here. The model is valid in TensorFlow and runs just fine, so either the conversion to onnx is incorrect, or tract itself is having an issue loading and optimizing it.
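For readers without the h5 file, here is a minimal Keras sketch of the architecture described above. The layer widths are made up; only the two input shapes ((63, 63, 3) images and (25,) metadata) and the input order come from the snippets earlier in the thread:

```python
from tensorflow import keras
from tensorflow.keras import layers

# MLP branch over the metadata (first input, per the ort snippet above)
meta_in = keras.Input(shape=(25,), name="metadata")
y = layers.Dense(16, activation="relu")(meta_in)

# CNN branch over the images (second input)
img_in = keras.Input(shape=(63, 63, 3), name="images")
x = layers.Conv2D(16, 3, activation="relu")(img_in)
x = layers.MaxPooling2D()(x)
x = layers.Flatten()(x)

# Concatenate the two flat feature vectors, then a couple of simple
# layers down to a single sigmoid neuron (score from 0 to 1)
z = layers.Concatenate()([y, x])
z = layers.Dense(16, activation="relu")(z)
out = layers.Dense(1, activation="sigmoid")(z)

model = keras.Model(inputs=[meta_in, img_in], outputs=out)
model.summary()
```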
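And since this is the recurring part of the question: below is the tract Python loading path quoted above, extended with explicit input facts before optimizing, since a symbolic dimension (e.g. the batch size) that the optimizer can't resolve is a common cause of `into_optimized()` failures. This is a sketch based on my reading of the tract Python bindings (`set_input_fact`, `into_runnable`, `run`), not a confirmed fix for this exact error:

```python
import numpy as np
import tract

# Load the ONNX model as an inference model
model = tract.onnx().model_for_path("acai.onnx")

# Pin concrete shapes and dtypes on both inputs before optimizing,
# matching the input order used in the onnxruntime snippet above
model.set_input_fact(0, "1,25,f32")       # metadata
model.set_input_fact(1, "1,63,63,3,f32")  # images

# Optimize and make runnable
runnable = model.into_optimized().into_runnable()

# Run with random inputs, like the onnxruntime example
outputs = runnable.run([
    np.random.random((1, 25)).astype(np.float32),
    np.random.random((1, 63, 63, 3)).astype(np.float32),
])
print(outputs[0].to_numpy())
```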