TensorRT inference?
I converted the .pt model to ONNX and then to TensorRT, but the inference outputs are two tensors with shapes (1, 32, 160, 160) and (1, 32, 8400). I couldn't figure out how to post-process them to get the masks. Please help.
I solved it. You can refer to this repo: [GitHub](https://github.com/ChuRuaNh0/FastSam_Awsome_TensorRT). If it helps you, please give it a star, thanks.
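For anyone who lands here later: a minimal NumPy sketch of the usual YOLOv8-seg-style mask decoding, assuming the (1, 32, 160, 160) tensor holds the mask prototypes and each detection kept after box decoding + NMS has a 32-dim coefficient vector. The function and variable names below are illustrative, not taken from the linked repo:

```python
# Hypothetical sketch of YOLOv8-seg style mask post-processing with NumPy.
# `coeffs` is the (N, 32) matrix of mask coefficients for the kept detections,
# `protos` is the (32, 160, 160) prototype tensor, `boxes` are xyxy boxes on the
# network input resolution (e.g. 640x640).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def masks_from_protos(coeffs, protos, boxes, img_size=(640, 640), thr=0.5):
    c, mh, mw = protos.shape
    # Per-detection linear combination of prototypes, then sigmoid -> (N, 160, 160)
    masks = sigmoid(coeffs @ protos.reshape(c, -1)).reshape(-1, mh, mw)

    # Crop each low-res mask to its (downscaled) box to suppress background
    scale_x, scale_y = mw / img_size[1], mh / img_size[0]
    for i, (x1, y1, x2, y2) in enumerate(boxes):
        x1, x2 = int(x1 * scale_x), int(np.ceil(x2 * scale_x))
        y1, y2 = int(y1 * scale_y), int(np.ceil(y2 * scale_y))
        crop = np.zeros((mh, mw), dtype=bool)
        crop[y1:y2, x1:x2] = True
        masks[i] *= crop

    # Nearest-neighbour upsample to the input resolution and binarize
    out = np.zeros((len(masks), *img_size), dtype=bool)
    ys = (np.arange(img_size[0]) * mh / img_size[0]).astype(int)
    xs = (np.arange(img_size[1]) * mw / img_size[1]).astype(int)
    for i, m in enumerate(masks):
        out[i] = m[ys][:, xs] > thr
    return out
```

The binarized masks still live in the letterboxed input resolution, so they need the same un-padding/rescaling as the boxes before being drawn on the original image.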
@ChuRuaNh0 thank you, you saved me a ton of time.
@ChuRuaNh0 thank you! For better performance, adding an `INMSLayer` at the end of the network is a better choice!
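For reference, a rough, untested sketch of what that could look like with the TensorRT Python API, assuming TensorRT >= 8.5 (where `INetworkDefinition.add_nms` and `INMSLayer` are available) and that `boxes_tensor` / `scores_tensor` have already been split out of the head output with slice/shuffle layers. The helper name, thresholds, and attribute usage here are illustrative:

```python
# Rough sketch: append TensorRT's built-in NMS layer to an existing network.
import numpy as np
import tensorrt as trt

def append_nms(network: trt.INetworkDefinition,
               boxes_tensor: trt.ITensor,    # (B, N, 4) boxes in xyxy
               scores_tensor: trt.ITensor,   # (B, N, C) class scores
               max_boxes: int = 100,
               iou_thr: float = 0.7,
               score_thr: float = 0.25):
    # Scalar (0-D) constants feeding the NMS layer's optional inputs
    max_out = network.add_constant((), np.array(max_boxes, dtype=np.int32)).get_output(0)
    iou = network.add_constant((), np.array(iou_thr, dtype=np.float32)).get_output(0)
    score = network.add_constant((), np.array(score_thr, dtype=np.float32)).get_output(0)

    nms = network.add_nms(boxes_tensor, scores_tensor, max_out)
    nms.set_input(3, iou)     # optional IoU-threshold input
    nms.set_input(4, score)   # optional score-threshold input
    nms.bounding_box_format = trt.BoundingBoxFormat.CORNER_PAIRS  # boxes are xyxy

    # Outputs: selected indices (num_selected, 3) and the number of selected boxes
    network.mark_output(nms.get_output(0))
    network.mark_output(nms.get_output(1))
    return nms
```

Doing NMS inside the engine keeps the box filtering on the GPU, so only the selected indices (and their 32-dim mask coefficients) need to be post-processed on the host.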
Thanks for your contribution! We will add a link to FastSAM_Awesome_TensorRT in the README.md so that more people can benefit from it.
@AAAAAAAyq I appreciate it. It's great to hear that.
Great work! I'd like to know the inference speed and which device you used for the ONNX and TRT runs. Looking forward to your reply.