Deep-Learning-Accelerator-SW

Accuracy of sigmoid layer's output drops a lot

Open · Railcalibur opened this issue 1 year ago · 1 comment

Platform: Jetson AGX Orin 64GB
OS: 5.1.2
DLA: 3.12.1

Sigmoid is used as the output layer of the model, and the input and output shape of the sigmoid is (8, 3, 88, 160). I found that the accuracy of the fp16 DLA model drops a lot when sigmoid is used as the output layer. However, the outputs are consistent with the torch outputs if the sigmoid is removed, with a cosine similarity close to 1.
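For reference, a minimal sketch of how the two outputs can be compared (the .npy file names are placeholders for the dumped DLA fp16 and torch fp32 outputs, not my exact script):

```python
import numpy as np
import torch
import torch.nn.functional as F

# Placeholder file names: outputs dumped from the DLA run and the torch run.
dla_out = torch.from_numpy(np.load("dla_fp16_output.npy")).float().reshape(1, -1)
ref_out = torch.from_numpy(np.load("torch_fp32_output.npy")).float().reshape(1, -1)

# Cosine similarity over the flattened (8, 3, 88, 160) tensors;
# a value close to 1.0 means the two runs agree.
cos = F.cosine_similarity(dla_out, ref_out, dim=1).item()
max_abs_err = (dla_out - ref_out).abs().max().item()
print(f"cosine similarity: {cos:.6f}  max abs error: {max_abs_err:.6f}")
```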

I want to know: what are the limitations on the use of sigmoid layers? Why does this loss of precision occur?
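As a diagnostic sketch only (an assumption on my side, not a confirmed explanation), one can check how much error fp16 rounding at the sigmoid boundary alone introduces for a tensor of this shape; random logits stand in for the real pre-sigmoid activations here:

```python
import torch

# Random stand-in for the real pre-sigmoid activations, same shape as in the issue.
logits = torch.randn(8, 3, 88, 160)

# Reference: sigmoid computed entirely in fp32.
ref = torch.sigmoid(logits)

# Simulate fp16 precision at the sigmoid boundary: round the input and the
# output to fp16, but keep the sigmoid computation itself in fp32.
approx = torch.sigmoid(logits.half().float()).half().float()

print("max abs error from fp16 rounding:", (ref - approx).abs().max().item())
print("cosine similarity:", torch.nn.functional.cosine_similarity(
    ref.reshape(1, -1), approx.reshape(1, -1), dim=1).item())
```

If the error here is small, fp16 rounding alone would not explain a large accuracy drop, which would point at how the DLA itself evaluates the sigmoid.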

Railcalibur · Jun 03 '24 14:06

@Railcalibur Is this issue still occurring with the latest JetPack 6.0? Also, does it occur with batch size 1, i.e. with shape (1, 3, 88, 160)? Thanks!
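If it helps, here is a rough sketch of how a batch-1 variant could be exported for that test (the Sequential model below is only a stand-in for the real network):

```python
import torch
import torch.nn as nn

# Stand-in for the real network; replace with the actual model whose
# output layer is the sigmoid in question.
model = nn.Sequential(nn.Conv2d(3, 3, kernel_size=3, padding=1), nn.Sigmoid()).eval()

# Export with batch size 1 so the fp16 DLA engine can be rebuilt and the
# (1, 3, 88, 160) case compared against the (8, 3, 88, 160) one.
dummy = torch.randn(1, 3, 88, 160)
torch.onnx.export(model, dummy, "model_bs1.onnx", opset_version=13,
                  input_names=["input"], output_names=["output"])
```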

nvoliver · Jun 10 '24 21:06