
The address being pinged is: baidu.com

I had the same problem too. The save operation does seem to complete: I verified this by reopening the file without saving again. However, it is still annoying because...

I had the same problem. The inference time for a batch size of 32 is about 32x that for a batch size of 1, yet the same model running through TensorFlow-TensorRT...

In TensorRT, the timing is in the log output. I take "GPU Compute Time" as the inference time.

```
[07/11/2024-09:44:35] [I] === Performance summary ===
[07/11/2024-09:44:35] [I] Throughput: 15.4965 qps
[07/11/2024-09:44:35]...
```
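To pull that number out of a trtexec log programmatically, something like the sketch below works. The regex and the embedded sample log are assumptions based on the summary lines quoted above; the exact wording may differ between trtexec versions.

```python
import re

# Stand-in log text mimicking the quoted trtexec performance summary;
# not real benchmark output.
sample_log = """\
[07/11/2024-09:44:35] [I] === Performance summary ===
[07/11/2024-09:44:35] [I] Throughput: 15.4965 qps
"""

def parse_throughput(log_text):
    """Return the throughput in qps parsed from a trtexec log, or None."""
    m = re.search(r"Throughput:\s*([\d.]+)\s*qps", log_text)
    return float(m.group(1)) if m else None

print(parse_throughput(sample_log))  # 15.4965
```

The same pattern, with a different regex, can extract the "GPU Compute Time" line for comparing runs at different batch sizes.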

I reproduced the problem with an open model from [here](https://github.com/onnx/models/blob/main/validated/vision/classification/resnet/model/resnet50-v2-7.onnx). Here is the result: the inference time scales by a factor of about 1.7 when the batch size is doubled. Is that normal? I believe...
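On whether a ~1.7x scaling is normal: a batch-latency multiplier below 2x for a doubled batch size means the per-sample cost is dropping, which is the expected benefit of batching. A minimal sketch with made-up numbers (the 10 ms baseline is an assumption for illustration, not a measured value):

```python
def per_sample_latency(batch_latency_ms, batch_size):
    """Latency per sample, in ms, for one batched inference call."""
    return batch_latency_ms / batch_size

t8 = 10.0        # assumed latency for batch size 8 (ms)
t16 = t8 * 1.7   # observed ~1.7x batch latency when batch size doubles

print(per_sample_latency(t8, 8))    # 1.25 ms/sample
print(per_sample_latency(t16, 16))  # 1.0625 ms/sample
```

Linear scaling (2x latency for 2x batch) would mean batching gives no throughput benefit at all; sublinear factors like 1.7 are what one hopes to see.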

@lix19937 Can you tell me something to try? Or comment on the results, please.

I have tried these commands.

```
trtexec --onnx=./resnet50-v2-7.onnx --saveEngine=./tmp.trt --minShapes=data:8x3x224x224 --optShapes=data:8x3x224x224 --maxShapes=data:8x3x224x224 > log8.txt
trtexec --onnx=./resnet50-v2-7.onnx --saveEngine=./tmp.trt --minShapes=data:16x3x224x224 --optShapes=data:16x3x224x224 --maxShapes=data:16x3x224x224 > log16.txt
trtexec --onnx=./resnet50-v2-7.onnx --saveEngine=./tmp.trt --minShapes=data:4x3x224x224 --optShapes=data:8x3x224x224 --maxShapes=data:16x3x224x224 > log8.txt...
```

The onnx file can be obtained from https://github.com/onnx/models/blob/main/validated/vision/classification/resnet/model/resnet50-v2-7.onnx

Please check the output of your command carefully. It is clear and easy to read.