
ONNX Runtime Inference C++ Example

7 ONNX-Runtime-Inference issues

Can I use an .ort model with your Python demo? Can this demo load an .ort model, and if so, how do I use the demo with .ort? https://github.com/Lucifer192192/testort/blob/main/apple.zip
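For reference, ONNX Runtime can usually load an ORT-format (.ort) model through the same session API used for .onnx files, provided the build was compiled with ORT-format support; the Python InferenceSession accepts an .ort path in the same way. A minimal C++ sketch under that assumption (the file name "model.ort" is a placeholder, not the model from the linked archive):

```cpp
#include <onnxruntime_cxx_api.h>
#include <iostream>

int main()
{
    Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "ort-format-demo"};
    Ort::SessionOptions session_options;

    // The same Ort::Session constructor accepts an ORT-format model path,
    // as long as the ONNX Runtime build includes ORT-format support.
    // Note: on Windows the path argument is a wide-character string.
    Ort::Session session{env, "model.ort", session_options};

    // From here on, input/output handling is identical to the .onnx case.
    std::cout << "Number of model inputs: " << session.GetInputCount()
              << std::endl;
    return 0;
}
```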

I tried to build the Docker image using [onnxruntime-cuda.Dockerfile](https://github.com/leimao/ONNX-Runtime-Inference/blob/main/docker/onnxruntime-cuda.Dockerfile), but, perhaps for network reasons, I cannot build it successfully either automatically or manually. Could you share the built images through Google...

Heyo, I have a trained model converted from PyTorch to ONNX for image segmentation. I can run inference with ONNX in Python and it works fine. I can see the...
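A minimal C++ sketch of the same flow that already works in Python, assuming a single float input in NCHW layout and a single logits output; the model path, tensor names, and 512x512 size are placeholders that must match the exported graph:

```cpp
#include <onnxruntime_cxx_api.h>
#include <cstdint>
#include <vector>

int main()
{
    Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "segmentation-demo"};
    Ort::SessionOptions session_options;
    Ort::Session session{env, "segmentation.onnx", session_options};

    // Placeholder input: 1 x 3 x 512 x 512 float image, preprocessed the same
    // way as in the working Python pipeline (resize, normalize, NCHW).
    const int64_t height = 512, width = 512;
    std::vector<int64_t> input_shape{1, 3, height, width};
    std::vector<float> input_data(3 * height * width, 0.0f);

    Ort::MemoryInfo memory_info =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input_tensor = Ort::Value::CreateTensor<float>(
        memory_info, input_data.data(), input_data.size(),
        input_shape.data(), input_shape.size());

    // Names are placeholders; check the exported graph for the real ones.
    const char* input_names[]{"input"};
    const char* output_names[]{"output"};
    auto outputs = session.Run(Ort::RunOptions{nullptr}, input_names,
                               &input_tensor, 1, output_names, 1);

    // Typical segmentation output: 1 x num_classes x H x W logits, with H and W
    // assumed equal to the input size here. Per-pixel argmax gives the mask.
    auto out_shape = outputs[0].GetTensorTypeAndShapeInfo().GetShape();
    const int64_t num_classes = out_shape[1];
    const float* logits = outputs[0].GetTensorData<float>();
    std::vector<uint8_t> mask(height * width);
    for (int64_t p = 0; p < height * width; ++p)
    {
        int64_t best = 0;
        for (int64_t c = 1; c < num_classes; ++c)
        {
            if (logits[c * height * width + p] > logits[best * height * width + p])
            {
                best = c;
            }
        }
        mask[p] = static_cast<uint8_t>(best);
    }
    return 0;
}
```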

Hello, my question is: what if your input is not an image but, for example, data from more than one sensor? How do you perform inference when you already have your ONNX...
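Nothing in the ONNX Runtime API is specific to images; the input tensor only has to match the shape the model was exported with. A short sketch, assuming a hypothetical model that takes one sample with six sensor channels (the names, shape, and model path are placeholders):

```cpp
#include <onnxruntime_cxx_api.h>
#include <vector>

int main()
{
    Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "sensor-demo"};
    Ort::SessionOptions session_options;
    Ort::Session session{env, "sensor_model.onnx", session_options};

    // Example: one sample with 6 sensor readings; replace with whatever shape
    // the model expects, e.g. {batch, channels} or {batch, seq_len, channels}.
    std::vector<float> readings{0.1f, 0.2f, 0.3f, 0.4f, 0.5f, 0.6f};
    std::vector<int64_t> shape{1, 6};

    Ort::MemoryInfo memory_info =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input_tensor = Ort::Value::CreateTensor<float>(
        memory_info, readings.data(), readings.size(),
        shape.data(), shape.size());

    // Placeholder names; use the ones from the exported graph.
    const char* input_names[]{"input"};
    const char* output_names[]{"output"};
    auto outputs = session.Run(Ort::RunOptions{nullptr}, input_names,
                               &input_tensor, 1, output_names, 1);
    return 0;
}
```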

Hi, thanks for your code, it works perfectly. I have used your code to run on Windows with Visual Studio. I was wondering whether it is possible to separate the code...
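On separating the code: one common approach, sketched here as an assumption rather than how this repository is organized, is to wrap the environment and session in a small class so that model loading happens once and inference is a separate call:

```cpp
#include <onnxruntime_cxx_api.h>
#include <vector>

// Hypothetical wrapper: construct once (loads the model), call Run() many times.
// ORTCHAR_T is wchar_t on Windows and char elsewhere, so the same code builds
// in Visual Studio and on Linux.
class OnnxModel
{
public:
    explicit OnnxModel(const ORTCHAR_T* model_path)
        : env_{ORT_LOGGING_LEVEL_WARNING, "onnx-model"},
          session_{env_, model_path, Ort::SessionOptions{}}
    {
    }

    // Single-input, single-output convenience wrapper around Ort::Session::Run.
    std::vector<Ort::Value> Run(const char* input_name, Ort::Value& input,
                                const char* output_name)
    {
        return session_.Run(Ort::RunOptions{nullptr}, &input_name, &input, 1,
                            &output_name, 1);
    }

private:
    Ort::Env env_;
    Ort::Session session_;
};
```

The header/source split then falls out naturally: the class declaration goes in a header, the session setup and Run wrapper in a .cpp file, and the application code only deals with tensors.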