Parameters may not be registered
https://github.com/bguisard/SuperResolution/blob/7c88843170e015d954b0891d9d2854372753a70b/model.py#L118 https://github.com/bguisard/SuperResolution/blob/7c88843170e015d954b0891d9d2854372753a70b/model.py#L120 https://github.com/bguisard/SuperResolution/blob/7c88843170e015d954b0891d9d2854372753a70b/model.py#L123
I am not familiar with older versions of PyTorch, but the code above may be wrong with PyTorch 1.6+: modules stored in a plain Python list are not registered as submodules, so their parameters are not registered either. The forward pass still runs, but those parameters are not updated during backpropagation, so the loss oscillates and does not decrease.
It should look like this: https://github.com/DrRyanHuang/SuperResolution/blob/42b230d5877d5c8f0d5e07dda8ad63edcf5e2a70/model.py#L102
if upblock:
    # Loop for residual blocks
    self.rs = nn.ModuleList([ResidualBlock(64, device=device) for i in range(4)])
    # Loop for upsampling
    self.up = nn.ModuleList([UpsampleBlock(64, device=device) for i in range(2)])
else:
    # Loop for residual blocks
    self.rs = nn.ModuleList([ResidualBlock(64, device=device) for i in range(4)])
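The difference can be demonstrated with a minimal sketch (toy nn.Linear layers here, not the repo's ResidualBlock/UpsampleBlock classes):

```python
import torch.nn as nn

class PlainList(nn.Module):
    def __init__(self):
        super().__init__()
        # Plain Python list: the Linear layers are NOT registered as
        # submodules, so their parameters are invisible to the optimizer.
        self.rs = [nn.Linear(4, 4) for _ in range(2)]

class WithModuleList(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.ModuleList: the layers are registered, so their parameters
        # appear in model.parameters() and get gradient updates.
        self.rs = nn.ModuleList([nn.Linear(4, 4) for _ in range(2)])

print(len(list(PlainList().parameters())))       # 0 — optimizer never sees them
print(len(list(WithModuleList().parameters())))  # 4 — weight + bias per layer
```

This is why the forward pass still works (calling the layers directly is fine) while training silently fails: `optimizer = torch.optim.Adam(model.parameters())` simply receives an incomplete parameter list.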