MDMM
normalization absence
Hello! I found the following in `LeakyReLUConv2d`:
```python
class LeakyReLUConv2d(nn.Module):
    def __init__(self, ..., norm='None', ...):
        ...
        if 'norm' == 'Instance':
            model += [nn.InstanceNorm2d(n_out, affine=False)]
        ...
```
https://github.com/HsinYingLee/MDMM/blob/master/networks.py#L362
It seems that normalization is never applied in the `LeakyReLUConv2d` block: the condition compares the string literal `'norm'` to `'Instance'`, which is always false, instead of checking the `norm` argument.
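A minimal sketch of the suspected issue, with the layer construction reduced to plain strings so it runs standalone (the function names and the `build_norm_fixed` variant are hypothetical, not from the repository):

```python
def build_norm_buggy(norm='None'):
    layers = []
    # Bug as in the repo: compares two string literals, so this
    # branch can never be taken, whatever value norm has.
    if 'norm' == 'Instance':
        layers.append('InstanceNorm2d')
    return layers

def build_norm_fixed(norm='None'):
    layers = []
    # Hypothetical fix: compare the norm argument itself.
    if norm == 'Instance':
        layers.append('InstanceNorm2d')
    return layers

print(build_norm_buggy(norm='Instance'))  # []
print(build_norm_fixed(norm='Instance'))  # ['InstanceNorm2d']
```

So even when the module is constructed with `norm='Instance'`, no `InstanceNorm2d` layer is added.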
Does this affect model performance, given that `LeakyReLUConv2d` is used in the multi-domain encoder and the discriminators? Were the best results reported in the paper obtained with instance normalization turned on?
Best Regards, Aleksei Silvestrov