Feature: ONNX Export
Do you plan to build an ONNX export? I like deploying models with ONNX to reduce dependencies and use the model in C++.
Hi Marvin, some others have asked for it (issue-1, issue-2). But I'm new to deployment-related things, and the easy tutorials I tried failed. So this was suspended...
I'll give it another try when I have enough free time.
Watching; ONNX in C++ would be amazing!
Hi, you can find the ONNX files for all the weights files in the GitHub release, or in the ONNX folder inside the stuff folder on my GDrive. For more info, please check the ONNX conversion part of the model zoo section. You can find the inference code in the notebooks there.
Tell me if you encounter any further problems when using them :)
Thanks! Any tips on running with C++ instead of Python? Also, do I need to adjust any workflow to use it on video files?
Sorry, I'm not familiar with inference in C++. As for video, it's easy to run the prediction on each frame and then combine the frames back into a video.
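The per-frame approach above can be sketched as follows. This is a minimal, runnable illustration of the pattern only: `predict` and the toy frames are hypothetical stand-ins; in a real pipeline you would decode frames with something like `cv2.VideoCapture`, run them through an `onnxruntime.InferenceSession`, and re-encode the outputs with `cv2.VideoWriter`.

```python
# Sketch of per-frame inference: take decoded video frames, run the model
# on each frame independently, then collect the outputs in order so they
# can be re-encoded into a video.

def predict(frame):
    # Placeholder "model": doubles every pixel value.
    # With a real ONNX model this would be something like
    # session.run(None, {"input": frame}) on an onnxruntime session.
    return [[pixel * 2 for pixel in row] for row in frame]

def process_video(frames):
    # Run the model frame by frame, preserving the original ordering,
    # so the results can be stitched back into a video afterwards.
    return [predict(frame) for frame in frames]

# Two tiny 2x2 "frames" standing in for decoded video frames.
frames = [
    [[1, 2], [3, 4]],
    [[5, 6], [7, 8]],
]
outputs = process_video(frames)
```

The key point is that each frame is processed independently, so the same single-image inference code from the notebooks applies unchanged; only the surrounding decode/encode loop is new.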
Amazing! Thanks for the incredibly good support! ONNX helps a lot with deploying the model in various contexts without requiring the Python data structures...
@antithing I will publish the code here once I have integrated the model into my private C++ project. Will update you.
Great work, @schirrmacher! You are welcome. I'm glad to do something if it can help.
@schirrmacher did you manage to run inference in C++?