
change onnx model default dtype

[Open] sezarxray opened this issue 5 years ago • 1 comment

Dear sir, I tried to convert my model TensorFlow → ONNX → TensorRT, but TensorRT reported that my input has type uint8, which is not supported. Is there a way to change the model input from uint8 to int32 in the ONNX model? Regards.

sezarxray avatar Apr 23 '20 05:04 sezarxray

It depends on the scenario. If this is a quantized model, then you should get the fp32 model from which the quantized model was derived and use the fp32 model for conversion. If the uint8 tensor is an input to a data-type-agnostic node such as Transpose or Identity, then you can simply use a Cast node.

askhade avatar Aug 04 '20 04:08 askhade