
Feature Dynamic Aggregation Module

Open hbsycym opened this issue 2 years ago • 3 comments

Hello, I have a question: where is the code for the Feature Dynamic Aggregation Module? I have seen this code in LoGoHead_kitti.py:

pooled_features = pooled_features + localgrid_densityfeat_fuse.permute(0, 2, 1)
attention_output = pooled_features + attention_output

If this code computes the three-feature sum F_p^B + F_l^B + F_g^B, then I did not see the self-attention module before this line:

shared_features = self.shared_fc_layer(pooled_features.view(batch_size_rcnn, -1, 1))

hbsycym avatar Dec 12 '23 09:12 hbsycym
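For context, the two additions quoted in the question follow a common residual-fusion pattern: add the local grid features to the pooled features, run self-attention to obtain a global feature, and add that back residually. Below is a minimal sketch of that pattern; the tensor shapes and the use of nn.MultiheadAttention are assumptions for illustration, not the repository's actual code.

```python
import torch
import torch.nn as nn

# Example sizes (assumptions): B RoIs, N grid points per RoI, C channels.
B, N, C = 2, 216, 32
pooled_features = torch.randn(B, N, C)             # F_p: pooled point features
localgrid_densityfeat_fuse = torch.randn(B, C, N)  # F_l: local grid features (channel-first)

# Stand-in for the attention head referenced in the thread.
attention = nn.MultiheadAttention(embed_dim=C, num_heads=4, batch_first=True)

# First residual add: fuse local grid features into the pooled features.
pooled_features = pooled_features + localgrid_densityfeat_fuse.permute(0, 2, 1)

# Self-attention produces a globally aggregated feature (F_g),
# which is added back residually to the fused features.
attention_output, _ = attention(pooled_features, pooled_features, pooled_features)
attention_output = pooled_features + attention_output

print(attention_output.shape)  # torch.Size([2, 216, 32])
```

The two `+` operations are the dynamic aggregation the question points at; the self-attention call between them is what produces the global term.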

Please refer to line 557 in LoGoHead_kitti.py and line 134 in attention_utils.py.

sankin97 avatar Dec 12 '23 10:12 sankin97

Please refer to line 557 in LoGoHead_kitti.py and line 134 in attention_utils.py.

Thanks! One more question: the input_sp_tensor passed to spconv.SparseConvTensor in the 3D backbone has spatial shape [41, 1600, 1408]. Does that mean the number of voxels along the Z, Y, and X axes respectively?

hbsycym avatar Dec 14 '23 08:12 hbsycym
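Yes, spconv orders spatial_shape as [Z, Y, X]. The sketch below shows how a shape like [41, 1600, 1408] typically arises; the point-cloud range and voxel size are assumptions taken from the common OpenPCDet KITTI configuration, and the extra +1 on Z is the padding many KITTI configs apply, not something confirmed by this thread.

```python
# Assumed values from the standard OpenPCDet KITTI config (not from LoGoNet directly):
point_cloud_range = [0.0, -40.0, -3.0, 70.4, 40.0, 1.0]  # [x_min, y_min, z_min, x_max, y_max, z_max] in meters
voxel_size = [0.05, 0.05, 0.1]                           # [dx, dy, dz] in meters

# Number of voxels along each axis = extent / voxel size.
grid_x = round((point_cloud_range[3] - point_cloud_range[0]) / voxel_size[0])  # 1408
grid_y = round((point_cloud_range[4] - point_cloud_range[1]) / voxel_size[1])  # 1600
grid_z = round((point_cloud_range[5] - point_cloud_range[2]) / voxel_size[2])  # 40

# spconv expects [Z, Y, X]; KITTI configs commonly pad Z by one voxel.
spatial_shape = [grid_z + 1, grid_y, grid_x]
print(spatial_shape)  # [41, 1600, 1408]
```

So [41, 1600, 1408] is indeed the voxel count along Z, Y, and X, in that order.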

Hello, I have a question: where is the code for the Feature Dynamic Aggregation Module? I have seen this code in LoGoHead_kitti.py:

pooled_features = pooled_features + localgrid_densityfeat_fuse.permute(0, 2, 1)
attention_output = pooled_features + attention_output

If this code computes the three-feature sum F_p^B + F_l^B + F_g^B, then I did not see the self-attention module before this line:

shared_features = self.shared_fc_layer(pooled_features.view(batch_size_rcnn, -1, 1))

Hello, I do not understand the author's answer. I cannot find code corresponding to the LoF/GoF/FDA modules in the repository. Does the line attention_output = self.attention_head(pooled_features, positional_input, src_key_padding_mask) implement GoF or FDA? Looking forward to your reply, thanks!

1806610292 avatar Jul 22 '24 16:07 1806610292