ONNX-Runtime-Inference
ONNX Runtime Inference C++ Example
Can I use an .ort model with your Python demo? Can this demo load an .ort model, and how can I use this demo with .ort? https://github.com/Lucifer192192/testort/blob/main/apple.zip
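For reference, ONNX Runtime loads ORT-format (.ort) models through the same session API used for .onnx files, with the format inferred from the file extension; the same holds for Python's onnxruntime.InferenceSession. A minimal C++ sketch, where model.ort is a placeholder path (on Windows the path argument must be a wide string):

```cpp
// Minimal sketch: loading an ORT-format (.ort) model with the same
// Ort::Session API used for .onnx files. "model.ort" is a placeholder,
// not a file from this repository.
#include <onnxruntime_cxx_api.h>

int main()
{
    Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "ort-format-demo"};
    Ort::SessionOptions session_options;
    // ONNX Runtime detects the ORT format from the .ort extension;
    // the rest of the inference code is unchanged.
    Ort::Session session{env, "model.ort", session_options};
    return 0;
}
```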
When I run inference.cpp, I get this error. Please guide me on how to solve it.
I tried to build the Docker image using [onnxruntime-cuda.Dockerfile](https://github.com/leimao/ONNX-Runtime-Inference/blob/main/docker/onnxruntime-cuda.Dockerfile), but, perhaps for network reasons, I cannot build it successfully, either automatically or manually. So can you share the built images through Google...
Heyo, I have a trained model converted to ONNX from PyTorch for image segmentation. I can run inference with Python ONNX and it works fine. I can see the...
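A minimal C++ sketch of the equivalent inference call, assuming a single NCHW float input; the file name, tensor names ("input"/"output"), and shape are placeholders to adapt to your segmentation model:

```cpp
// Minimal sketch: running a segmentation model with ONNX Runtime C++.
// "segmentation.onnx", the tensor names, and the shape are assumptions.
#include <onnxruntime_cxx_api.h>
#include <vector>

int main()
{
    Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "segmentation-demo"};
    Ort::Session session{env, "segmentation.onnx", Ort::SessionOptions{}};

    // Preprocessed image data; fill with your actual normalized pixels.
    std::vector<int64_t> shape{1, 3, 224, 224};
    std::vector<float> pixels(1 * 3 * 224 * 224, 0.0f);

    Ort::MemoryInfo mem_info =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input = Ort::Value::CreateTensor<float>(
        mem_info, pixels.data(), pixels.size(), shape.data(), shape.size());

    const char* input_names[] = {"input"};
    const char* output_names[] = {"output"};
    auto outputs = session.Run(Ort::RunOptions{nullptr}, input_names,
                               &input, 1, output_names, 1);

    // The output tensor holds the per-pixel class scores / mask.
    float* mask = outputs[0].GetTensorMutableData<float>();
    (void)mask;
    return 0;
}
```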
Hello, my question is: what if your input is not an image but, for example, data from more than one sensor? How do you perform inference when you already have your ONNX...
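Non-image inputs work the same way: you wrap whatever buffer holds your sensor readings in an Ort::Value with the shape the model expects. A minimal sketch, assuming a model exported with one float input of shape [1, N] and placeholder tensor names "input"/"output":

```cpp
// Minimal sketch: feeding raw sensor readings (not an image) to an
// ONNX model. "sensor_model.onnx", the tensor names, and the [1, N]
// shape are assumptions to adapt to your exported model.
#include <onnxruntime_cxx_api.h>
#include <vector>

int main()
{
    Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "sensor-demo"};
    Ort::Session session{env, "sensor_model.onnx", Ort::SessionOptions{}};

    // Non-image input: just a flat buffer of sensor readings.
    std::vector<float> readings{0.42f, 1.30f, -0.07f, 5.11f};
    std::vector<int64_t> shape{1, static_cast<int64_t>(readings.size())};

    Ort::MemoryInfo mem_info =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input = Ort::Value::CreateTensor<float>(
        mem_info, readings.data(), readings.size(),
        shape.data(), shape.size());

    const char* input_names[] = {"input"};
    const char* output_names[] = {"output"};
    auto outputs = session.Run(Ort::RunOptions{nullptr}, input_names,
                               &input, 1, output_names, 1);
    float* result = outputs[0].GetTensorMutableData<float>();
    (void)result;
    return 0;
}
```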
Hi, thanks for your code; it works perfectly. I have used your code to run on Windows in Visual Studio. I was wondering whether it is possible to separate the code...