Fix tuple error, mentioned in some of the issues
Summary
- Several GitHub issues report that the refinement step fails for non-Fourier models. This PR adds a fix for that.
- (minor) We also fix a typo in the website link to Geomagical.com
Issues
https://github.com/advimman/lama/issues/274 https://github.com/advimman/lama/issues/167
Problem
In the lama-regular model, the latent feature is a single tensor, while in lama-fourier and big-lama it is a tuple of two tensors. Our refinement code assumed the feature was always a tuple.
Solution
This PR adds a function that adapts the feature appropriately. We can't just keep it as-is, because PyTorch optimizers (like Adam) don't take a tuple as input.
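To illustrate the idea, here is a minimal sketch of such an adapter. The function name `as_feature_list` and the exact handling are assumptions for illustration, not the PR's actual code; it simply normalizes the latent feature to a list of tensors so the same optimizer setup works for every LaMa variant.

```python
# Hypothetical adapter (name and logic are illustrative,
# not copied from the PR).

def as_feature_list(feature):
    """Normalize a model's latent feature to a list.

    lama-fourier / big-lama produce a tuple of two tensors,
    while lama-regular produces a single tensor. PyTorch
    optimizers such as Adam expect an iterable of tensors,
    so a tuple is converted and a bare tensor is wrapped.
    """
    if isinstance(feature, (tuple, list)):
        return list(feature)   # tuple of tensors -> list of tensors
    return [feature]           # single tensor -> one-element list

# With such an adapter, the refinement code could do, e.g.:
#   optimizer = torch.optim.Adam(as_feature_list(latent), lr=lr)
# regardless of which LaMa variant produced `latent`.
```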
Visual Results
- Input Images
- Big LaMa (before and after refinement)
- LaMa-Fourier (before and after refinement; parameters tuned to this model could make results even better)
- LaMa-Regular (the one on which refinement was failing)