Zhe Zhou

Results: 12 comments by Zhe Zhou

PReLU keeps the original values when inputs are positive, and multiplies them by a scale factor (the factors were trained beforehand) when inputs are negative, so you can use ReLU(1*x)...
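For reference, the decomposition works because PReLU(x) = ReLU(x) - a * ReLU(-x): ReLU keeps the positive part, and ReLU(-x) isolates the negative part, which is then rescaled and added back. A minimal NumPy sketch (the names are illustrative, not from the original thread) checking the identity:

```python
import numpy as np

def prelu(x, a):
    # Reference PReLU: identity for x > 0, slope a for x <= 0.
    return np.where(x > 0, x, a * x)

def prelu_decomposed(x, a):
    relu = lambda v: np.maximum(v, 0.0)
    # ReLU keeps the positive part; ReLU(-x) isolates the negative part,
    # which is scaled by -a and combined via an elementwise add.
    return relu(x) - a * relu(-x)

x = np.random.randn(4, 8).astype(np.float32)
a = 0.25  # per-channel slopes in a real model; a scalar here for brevity
assert np.allclose(prelu(x, a), prelu_decomposed(x, a))
```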

> Hi @PKUZHOU,
> Can you provide the script for transforming PReLU into the combination of ReLU, Scaling, and Elemwise-Add?
> I think it will be very helpful to generalize...

Sorry, I have no idea about this error in Int8 mode. However, from my point of view, adopting Int8 quantization may not bring much overall acceleration, since the network inference...

Retraining is not needed, but you do need to write a script to convert the weights.
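A rough sketch of the kind of conversion script meant here, using pycaffe; the prototxt/layer/file names are placeholders, and the rewritten prototxt is assumed to already implement each PReLU as ReLU plus a scaled negative branch with an Eltwise SUM:

```python
import caffe

# Source net with PReLU layers, and a target net whose prototxt
# replaces each PReLU with ReLU + Scale + Eltwise layers.
src = caffe.Net('mtcnn_prelu.prototxt', 'mtcnn.caffemodel', caffe.TEST)
dst = caffe.Net('mtcnn_relu_scale.prototxt', caffe.TEST)

# Copy all layers that exist in both nets unchanged.
for name in dst.params:
    if name in src.params:
        for i, blob in enumerate(src.params[name]):
            dst.params[name][i].data[...] = blob.data

# Move each PReLU slope vector into the matching Scale layer
# (one learned slope per channel).
for prelu_name, scale_name in [('prelu1', 'scale1'), ('prelu2', 'scale2')]:
    dst.params[scale_name][0].data[...] = src.params[prelu_name][0].data

dst.save('mtcnn_relu_scale.caffemodel')
```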

It was trained with PReLU; in fact, the model prototxt and weights are borrowed from the official MTCNN repo. If you replace it with ReLU+Scale, you should also modify the...

Q1: TensorRT doesn't support dynamic input sizes, so the image size should be configured beforehand. Q2: There is something wrong with the destructor.
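For illustration, a minimal sketch with the TensorRT Python API (the tensor name and dimensions are assumptions, not values from the original thread) showing how the input size gets fixed at network-definition time in older TensorRT versions:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network()

# The spatial size is baked in here: an engine built from this network
# only accepts 3x384x288 inputs, so choose the size you will run with.
input_tensor = network.add_input('data', trt.float32, (3, 384, 288))
```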

Have you found a solution? I am also facing this problem...

Using metasim, I have found that the TracerVBridge is broken. The TracerV channel always has out_ready = 0, so the channel gets stuck at cycle 2: In the...

Hi @davidbiancolin, I have tested both 1.15.1 and the main branch; they have the same problem. By the way, I have also tested the micro2022_tutorial branch, but it also got...