tutorials
Change ONNX model default dtype
Dear sir, I tried to convert my model TensorFlow -> ONNX -> TensorRT, but TensorRT reported that my input has type uint8, which it does not support. Is there a way to change the model input from uint8 to int32 in the ONNX model? Regards.
It depends on the scenario. If this is a quantized model, you should obtain the fp32 model it was quantized from and use that fp32 model for the conversion. If the uint8 tensor feeds a data-type-agnostic node such as Transpose or Identity, you can simply insert a Cast node.