
Why is this piece of code inside a for-loop?

Open · UmarZein opened this issue 1 year ago · 1 comment

https://github.com/kfzyqin/Implementation-MolGAN-PyTorch/blob/d07a490c841df7644011fe87c796978427fa828b/layers.py#L44, i.e.:

for i in range(len(self.units)):
      in_units = list([x + in_features for x in self.units])

I don't see how it is any different from

in_units = list([x + in_features for x in self.units])

It isn't a big deal, performance-wise, but if I'm missing something, please do tell.
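
For what it's worth, here is a minimal standalone sketch (with made-up values for in_features and units) showing that the loop body never uses i, so after the loop in_units is the same as the single assignment (the list(...) wrapper around the comprehension is also redundant):

# Hypothetical values, just to illustrate the point
in_features = 32
units = [128, 64]

looped = None
for i in range(len(units)):
    looped = [x + in_features for x in units]  # identical result on every iteration

single = [x + in_features for x in units]

assert looped == single == [160, 96]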

UmarZein · Nov 07 '24 13:11

https://github.com/kfzyqin/Implementation-MolGAN-PyTorch/blob/d07a490c841df7644011fe87c796978427fa828b/layers.py#L45

I'm sorry for nitpicking again, but if this part is supposed to reflect itertools.pairwise(self.units), there is a clearer way of conveying it. More generally, the code around this part could go from:

class MultiGraphConvolutionLayers(Module):
    def __init__(self, in_features, units, activation, edge_type_num, with_features=False, f=0, dropout_rate=0.):
        super(MultiGraphConvolutionLayers, self).__init__()
        self.conv_nets = nn.ModuleList()
        self.units = units
        in_units = []
        if with_features:
            for i in range(len(self.units)):
                in_units = list([x + in_features for x in self.units])
            for u0, u1 in zip([in_features+f] + in_units[:-1], self.units):
                self.conv_nets.append(GraphConvolutionLayer(u0, u1, activation, edge_type_num, dropout_rate))
        else:
            for i in range(len(self.units)):
                in_units = list([x + in_features for x in self.units])
            for u0, u1 in zip([in_features] + in_units[:-1], self.units):
                self.conv_nets.append(GraphConvolutionLayer(u0, u1, activation, edge_type_num, dropout_rate))

into:

class MultiGraphConvolutionLayers(Module):
    def __init__(self, in_features, units, activation, edge_type_num, with_features=False, f=0, dropout_rate=0.):
        super(MultiGraphConvolutionLayers, self).__init__()
        assert with_features or f == 0  # with_features=False implies f=0, so this enforces that
        self.conv_nets = nn.ModuleList()
        for u0, u1 in zip([in_features+f] + units[:-1], units):
            self.conv_nets.append(GraphConvolutionLayer(u0, u1, activation, edge_type_num, dropout_rate))
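
As a sanity check, here is a small sketch (with hypothetical sizes) showing that the (u0, u1) pairs produced by that zip are exactly itertools.pairwise over [in_features + f] + units:

from itertools import pairwise  # Python 3.10+

# Hypothetical sizes, just to illustrate the pairing
in_features, f = 32, 8
units = [128, 64, 32]

dims = [in_features + f] + units
via_zip = list(zip(dims[:-1], dims[1:]))  # what the proposed constructor iterates over
via_pairwise = list(pairwise(dims))

assert via_zip == via_pairwise == [(40, 128), (128, 64), (64, 32)]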

This isn't a big deal, I just wanted to put this out there.

UmarZein · Nov 07 '24 14:11