
Feature: ONNX Export

Open schirrmacher opened this issue 1 year ago • 8 comments

Do you plan to build an ONNX export? I like deploying models with ONNX to reduce dependencies and to use the model in C++.

schirrmacher avatar Aug 04 '24 09:08 schirrmacher

Hi Marvin, some others have asked for this as well (issue-1, issue-2). But I'm new to deployment-related things; I tried some simple tutorials on it, which failed, so this was put on hold...

I'll give it another try when I have enough free time.

ZhengPeng7 avatar Aug 04 '24 14:08 ZhengPeng7

Watching; ONNX in C++ would be amazing!

antithing avatar Aug 17 '24 15:08 antithing

Hi, you can find the ONNX files for all the weights in the GitHub release, or in the ONNX folder inside the stuff folder on my GDrive. For more info, please check the ONNX conversion part in the model zoo section; the inference code is in the notebooks there.

Tell me if you encounter any further problems when using them :)
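For anyone picking up the released ONNX files, here is a minimal pre/post-processing sketch in Python. It assumes the input convention used in the repo's inference notebooks (a 1x3x1024x1024 float tensor normalized with ImageNet statistics, sigmoid on the output logits); verify these against the model zoo section before relying on them. The actual `onnxruntime` call and the model filename are shown only as comments and are assumptions.

```python
import numpy as np

# Assumed input convention (check the repo's notebooks): ImageNet
# normalization, NCHW layout. The released models take 1024x1024 inputs.
MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(rgb: np.ndarray) -> np.ndarray:
    """HxWx3 uint8 RGB image -> 1x3xHxW normalized float32 tensor."""
    x = rgb.astype(np.float32) / 255.0
    x = (x - MEAN) / STD
    return x.transpose(2, 0, 1)[None]  # add batch dim -> NCHW

def postprocess(logits: np.ndarray) -> np.ndarray:
    """1x1xHxW logits -> HxW uint8 alpha matte via sigmoid."""
    prob = 1.0 / (1.0 + np.exp(-logits[0, 0]))
    return (prob * 255.0).astype(np.uint8)

# With onnxruntime installed, inference would look roughly like
# (filename is hypothetical):
#   import onnxruntime as ort
#   sess = ort.InferenceSession("BiRefNet.onnx")
#   inp = {sess.get_inputs()[0].name: preprocess(img)}
#   mask = postprocess(sess.run(None, inp)[0])
```

The same pre/post-processing carries over to C++ almost unchanged, since ONNX Runtime's C++ API also consumes a flat NCHW float buffer.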

ZhengPeng7 avatar Aug 18 '24 18:08 ZhengPeng7

Thanks! Any tips on running it with C++ instead of Python? Also, do I need to adjust the workflow to use it on video files?

antithing avatar Aug 18 '24 18:08 antithing

Sorry, I'm not familiar with inference in C++. As for video, you can simply run the prediction on each frame and combine the results back into a video.
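The frame-by-frame approach described above can be sketched like this. `predict_mask` is a placeholder standing in for the model's forward pass (ONNX or PyTorch); a real pipeline would read and write frames with OpenCV or ffmpeg, which is only indicated in comments here.

```python
import numpy as np

def predict_mask(frame: np.ndarray) -> np.ndarray:
    """Placeholder per-frame prediction: HxWx3 uint8 -> HxW uint8 mask.
    In practice this would be the BiRefNet forward pass."""
    return (frame.mean(axis=2) > 127).astype(np.uint8) * 255

def matte_video(frames):
    """Run the model on every frame independently and return the masks."""
    return [predict_mask(f) for f in frames]

# Real I/O (assumption, using OpenCV):
#   cap = cv2.VideoCapture("input.mp4")
#   writer = cv2.VideoWriter("masks.mp4", fourcc, fps, (w, h), False)
#   while True: read a frame, call predict_mask, write the result
```

Note that per-frame prediction has no temporal smoothing, so fast-moving edges may flicker slightly between frames.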

ZhengPeng7 avatar Aug 19 '24 03:08 ZhengPeng7

Amazing! Thanks for the incredibly good support! ONNX helps a lot with deploying the model in various contexts without requiring the Python data structures...

schirrmacher avatar Aug 20 '24 08:08 schirrmacher

@antithing I will publish the code here; I'm integrating the model into my private C++ project. Will update you.

schirrmacher avatar Aug 20 '24 08:08 schirrmacher

Great work, @schirrmacher! You're welcome; I'm glad to help where I can.

ZhengPeng7 avatar Aug 20 '24 10:08 ZhengPeng7

@schirrmacher did you manage to run inference in C++?

antithing avatar Oct 16 '24 13:10 antithing