Something wrong with code in attention.py
Excuse me, sir.
I read your file 'attention.py' and found something wrong. Line 62 reads 'return x + (x * Mf)'.
According to Eq. (2) in the paper BAM: Bottleneck Attention Module, it should be 'return x * Mf',
because the file already defines 'Mf = 1 + self.sigmoid(Mc * Ms)'.
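To see why the fix matters, note that 'x * Mf' with 'Mf = 1 + sigmoid(...)' already expands to the paper's residual form F' = F + F * sigmoid(M(F)), so adding x again double-counts the input. A minimal sketch with hypothetical scalar stand-ins (the values below are illustrative, not from the repo):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy scalar stand-ins for one feature value and its combined attention logit.
x, att = 2.0, 0.3            # hypothetical values, not taken from attention.py
Mf = 1.0 + sigmoid(att)      # as 'Mf' is defined in the file

fixed = x * Mf                   # proposed 'return x * Mf'
paper = x + x * sigmoid(att)     # BAM Eq. (2): F' = F + F * sigmoid(M(F))
buggy = x + x * Mf               # current 'return x + (x * Mf)'

assert abs(fixed - paper) < 1e-12   # the fix matches the paper exactly
assert buggy - paper > 0.5          # the current line adds x a second time
```

So 'return x * Mf' and the paper's Eq. (2) are algebraically the same quantity; the extra 'x +' in line 62 is the only discrepancy.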
Thanks,
I use 'return x * Mf' in my own runs, but the code in the repository has not been updated yet.
I will push the corrected code as soon as possible.
Thank you, sir.
Hello sir, I want to add this CBAM block to the YOLOv4 object detection model. Could you please give me some idea of how to proceed? It would help me a lot. Thank you in advance!
@Rvv1296 Please check the darknet repository for this issue
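For what it's worth, the usual integration pattern is to insert CBAM right after a backbone or neck convolution block, so the refined features feed the next stage. A minimal sketch of that wiring; 'conv_block' and 'cbam_block' are placeholder names, not YOLOv4 or darknet APIs:

```python
# Hypothetical wiring: conv_block and cbam_block are placeholders standing in
# for a YOLO convolution stage and a CBAM module, not real darknet functions.
def with_cbam(conv_block, cbam_block):
    """Wrap a conv stage so its output is refined by CBAM before propagating."""
    def block(x):
        x = conv_block(x)       # ordinary feature extraction
        return cbam_block(x)    # CBAM refines the features in place
    return block

# Toy usage with stand-in callables (scalars instead of tensors):
block = with_cbam(lambda x: x + 1, lambda x: 2 * x)
assert block(3) == 8
```

In a real network the same idea applies per stage: run the convolution, pass its tensor through the CBAM module, and hand the result to the next layer.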
Why is it Mf = 1 + self.sigmoid(Mc * Ms) instead of Mf = 1 + self.sigmoid(Mc + Ms), and where can I find an explanation of this?
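For reference while waiting on the author: in the BAM paper, Eq. (2) combines the two branches by element-wise addition after broadcasting, M(F) = sigmoid(Mc(F) + Ms(F)), not by multiplication. A sketch with hypothetical toy logits (plain Python lists in place of (C,H,W) tensors) showing that the two rules give different maps:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy logits: Mc is a per-channel map (C,), Ms a per-position map (H*W,).
# In the BAM paper the branches are broadcast to a common shape and combined
# by element-wise ADDITION before the sigmoid: M(F) = sigmoid(Mc + Ms).
Mc = [0.5, -1.0]        # hypothetical channel logits, C = 2
Ms = [0.2, 0.0, -0.3]   # hypothetical spatial logits, H*W = 3

M_add = [[sigmoid(c + s) for s in Ms] for c in Mc]   # paper's combination
M_mul = [[sigmoid(c * s) for s in Ms] for c in Mc]   # this repo's combination

# The rules clearly differ, e.g. at c = -1.0, s = 0.0:
assert abs(M_add[1][1] - sigmoid(-1.0)) < 1e-12   # addition keeps the channel logit
assert abs(M_mul[1][1] - 0.5) < 1e-12             # multiplication zeroes it: sigmoid(0)
```

The derivation of this combination is in Section 3 of the BAM paper (the ablation there compares sum, product, and max for merging the branches).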