Jordi Malé
Hi! I did try reducing the input size of the image, but the GPU memory consumed during inference is almost the same. I tried with 480x640 as well as 672x896. I'm...
Hi! I tried multiple configurations and memory usage is still very high, above 12 GB, despite making the images smaller and changing the padding. Furthermore, I'm not planning to use the code...
Have you tried quantization on any of the trained models? Or conversion to TensorRT/ONNX Runtime? If not, do you have any suggestions on how to proceed?
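For anyone following this thread, here is a minimal sketch of the kind of quantization I mean: PyTorch post-training dynamic quantization, which stores `nn.Linear` weights as int8 and can cut their memory footprint roughly 4x. The model below is a hypothetical stand-in, not the actual trained network from this repo; whether this helps here depends on how much of the 12 GB is weights versus activations.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the trained model; replace with the real network.
model = nn.Sequential(
    nn.Linear(896, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
).eval()

# Post-training dynamic quantization: Linear weights become int8,
# activations stay float and are quantized on the fly per batch.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 896)
with torch.inference_mode():
    out = quantized(x)
print(out.shape)  # torch.Size([1, 10])
```

Note that dynamic quantization runs on CPU; for GPU memory specifically, half-precision inference (`model.half()` or `torch.autocast`) is usually the first thing to try before a TensorRT/ONNX Runtime conversion.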