
Scaled cosine similarity can cause NaNs

Open bobcao3 opened this issue 1 year ago • 0 comments

https://github.com/crowsonkb/k-diffusion/blob/21d12c91ad4550e8fcf3308ff9fe7116b3f19a08/k_diffusion/models/image_transformer_v2.py#L111

I know it's mentioned in the paper that this version of scaling directly parametrize scale instead of exponent, however an unintended side effects is that when the scale goes close to zero it can get into negatives due to some larger random gradient updates, which causes a NaN.

The fix is simple: in our adaptation for our in-house models we changed it to `torch.sqrt(torch.abs(scale) + eps)`. The eps is added to preserve gradients (so the argument never reaches zero). A biased ReLU would probably also work, along with other nonlinear functions.
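A minimal sketch of the failure mode and the proposed fix, assuming a simplified stand-alone scale parametrization (function names here are illustrative, not the actual k-diffusion code):

```python
import torch

def scale_unsafe(scale: torch.Tensor) -> torch.Tensor:
    # Original parametrization: the learned scale goes under a sqrt directly,
    # so a gradient step that pushes it below zero yields NaN.
    return torch.sqrt(scale)

def scale_safe(scale: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Proposed fix: take the absolute value and add a small eps so the sqrt
    # stays defined for any input and its gradient never vanishes at zero.
    return torch.sqrt(torch.abs(scale) + eps)

# A scale knocked slightly negative by a large update:
scale = torch.tensor(-0.01)
print(torch.isnan(scale_unsafe(scale)))  # True: sqrt of a negative is NaN
print(torch.isnan(scale_safe(scale)))    # False: the fix keeps it finite
```

Since `abs` is even around zero, the sign of small perturbations no longer matters, and the `eps` floor keeps both the output and its gradient well-defined there.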

bobcao3 avatar Aug 23 '24 21:08 bobcao3