How to do Inference profiling of the model in C++ (Android)? #22742
Unanswered
martinkorelic asked this question in API Q&A
Replies: 1 comment
-
Setting SessionOptions: it was enabled via a genai config setting, which results in this call: Does profiling work with other models? e.g. something from https://onnx.ai/models/
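For onnxruntime-genai, the session options that reach the underlying InferenceSession are driven by the model's genai_config.json. A minimal sketch of what such a fragment could look like is below; the "enable_profiling" key and the Android path are assumptions to verify against your onnxruntime-genai version, and `<your.package>` is a placeholder:

```json
{
  "model": {
    "decoder": {
      "session_options": {
        "enable_profiling": "/data/data/<your.package>/files/ort_profile"
      }
    }
  }
}
```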
-
I have implemented a model in C++ along with inference generation. However, when I enable profiling on the InferenceSession, run a single inference pass, and then end profiling, I get no output JSON at the defined profile path. The profile path string returned by SessionEndProfiling is "\0". I have also tried EndProfilingAllocated, as well as exiting the process or just releasing the InferenceSession, but no file was created or saved.
It should be noted that I am doing this on Android, using EnableProfiling with the InferenceSession, which is created successfully (the creation call returns a null pointer, i.e. no error status).
There is a similar issue to this one; however, it was not answered.
If anyone has successfully profiled with onnxruntime in C++ on mobile, feel free to point me in the right direction.
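For reference, here is a minimal sketch of the profiling flow using the ONNX Runtime C++ API, under the assumption that the profile prefix must point to a directory the Android app can write to (e.g. the path from Context.getFilesDir(), passed down via JNI); the model path and `<your.package>` are placeholders:

```cpp
#include <onnxruntime_cxx_api.h>
#include <cstdio>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "profiling-demo");

  Ort::SessionOptions opts;
  // The argument is a file *prefix*: ORT appends a timestamp and ".json".
  // A prefix in a non-writable location is a common reason no file appears.
  opts.EnableProfiling("/data/data/<your.package>/files/ort_profile");

  Ort::Session session(env, "model.onnx", opts);

  // ... run at least one inference pass with session.Run(...) here ...

  // EndProfilingAllocated returns the path of the JSON file that was written;
  // the AllocatedStringPtr frees the string when it goes out of scope.
  Ort::AllocatorWithDefaultOptions allocator;
  Ort::AllocatedStringPtr path = session.EndProfilingAllocated(allocator);
  std::printf("profile written to: %s\n", path.get());
  return 0;
}
```

A profile path that comes back empty (as described above) usually means the file was never opened for writing, so checking the prefix directory's permissions first is a reasonable debugging step.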