
fix unused parameters

Open siyuanliii opened this issue 4 years ago • 2 comments

Minor fix for some unused parameters that cause problems in DDP (DistributedDataParallel) training.
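For context: DDP's gradient reducer expects every registered parameter to receive a gradient each backward pass, and errors out (or requires `find_unused_parameters=True`) when one does not. A minimal toy sketch, not from the DLA repo, of how such parameters can be spotted in a single process:

```python
import torch
import torch.nn as nn

class Toy(nn.Module):
    """Toy module with a registered but never-used submodule,
    mimicking the dead `project` layers described in this issue."""
    def __init__(self):
        super().__init__()
        self.used = nn.Linear(4, 4)
        self.project = nn.Linear(4, 4)  # registered, but its output never reaches the loss

    def forward(self, x):
        return self.used(x)  # self.project does not contribute to the output

model = Toy()
model(torch.randn(2, 4)).sum().backward()

# Parameters whose grad is still None are exactly what DDP complains about.
unused = [name for name, p in model.named_parameters() if p.grad is None]
print(unused)  # ['project.weight', 'project.bias']
```

Wrapping `Toy` in `DistributedDataParallel` without `find_unused_parameters=True` would raise an error for these two parameters; removing (or conditionally creating) the dead submodule, as this PR does, avoids the overhead of that flag.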

siyuanliii avatar Jan 04 '22 12:01 siyuanliii

Wow, I would have loved to see your commit a week earlier. Thanks! When I compared my rebuilt model with the original implementation via Netron and the named parameters, I also noticed the unnecessary projections on stages with a depth of one. The only things still missing are removing the unnecessary max pooling operations and adjusting the loading of the pre-trained network.

janthmueller avatar May 08 '23 08:05 janthmueller

Hi guys, I have solved this problem. The bug is in this [part](https://github.com/ucbdrive/dla/blob/master/dla.py#L206):

    def forward(self, x, residual=None, children=None):
        children = [] if children is None else children
        bottom = self.downsample(x) if self.downsample else x
        residual = self.project(bottom) if self.project else bottom
        if self.level_root:
            children.append(bottom)
        x1 = self.tree1(x, residual)
        if self.levels == 1:
            x2 = self.tree2(x1)
            x = self.root(x2, x1, *children)
        else:
            children.append(x1)
            x = self.tree2(x1, children=children)
        return x

`residual=None` is passed into `self.tree1(x, residual)` from the outer layer, but it is not used there: when `tree1` is itself a `Tree`, its `forward` immediately overwrites `residual` (see `residual = self.project(bottom) if self.project else bottom` above), so the projection computed by the parent is discarded. The projection is only actually consumed when `tree1` is a leaf block, i.e. when `levels == 1`. We can fix this bug [here](https://github.com/ucbdrive/dla/blob/master/dla.py#L199):

    if in_channels != out_channels and (levels in [1, ]):
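The proposed condition (`levels in [1, ]` is just `levels == 1`) only builds the projection on leaf trees, where the output of `project` is actually consumed. A stand-alone sketch comparing the two conditions; the helper names are mine, not from dla.py:

```python
def needs_project_original(in_channels, out_channels, levels):
    # Original dla.py: a project layer is created whenever the channel
    # counts differ, even on non-leaf trees that discard its output.
    return in_channels != out_channels

def needs_project_fixed(in_channels, out_channels, levels):
    # Proposed fix: only leaf trees (levels == 1) keep the projection,
    # since only a leaf block's forward() uses the residual argument.
    return in_channels != out_channels and levels == 1

# A non-leaf tree with a channel change: the original code registers a
# project layer whose output never reaches the loss -> unused parameters
# that break DDP training.
case = (64, 128, 2)
print(needs_project_original(*case))  # True  (dead layer created)
print(needs_project_fixed(*case))     # False (no dead parameters)
```

For leaf trees (`levels == 1`) with a channel change, both conditions agree, so the fix does not change the network's computed output, only which parameters get registered.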

Dawn-bin avatar Apr 22 '24 07:04 Dawn-bin