CodeFormer
Conflict in VQGAN codebook loss
In the code, β multiplies the term (z_q - z.detach()),
https://github.com/sczhou/CodeFormer/blob/e878192ee253cfcc8f19e29d3307c181501f53ae/basicsr/archs/vqgan_arch.py#L55
but I think it should instead multiply the commitment loss (z_q.detach() - z), which controls the learning of the encoder, as described in the CodeFormer and VQ-VAE papers.
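For concreteness, here is a minimal PyTorch sketch of the two placements — the function names and toy tensors are mine, not from the repo; the "as written" version reflects my reading of the linked line, and the other follows the VQ-VAE paper:

```python
import torch

def quantizer_loss_as_written(z, z_q, beta=0.25):
    # Placement as I read the linked line: beta scales the codebook term
    # (z_q - z.detach()), which only updates the codebook, while the
    # commitment term (z_q.detach() - z), which updates the encoder,
    # gets weight 1.
    codebook = torch.mean((z_q - z.detach()) ** 2)
    commitment = torch.mean((z_q.detach() - z) ** 2)
    return beta * codebook + commitment

def quantizer_loss_vqvae_paper(z, z_q, beta=0.25):
    # Placement from the VQ-VAE paper: the codebook term has weight 1 and
    # beta scales the commitment term that constrains the encoder output.
    codebook = torch.mean((z_q - z.detach()) ** 2)
    commitment = torch.mean((z_q.detach() - z) ** 2)
    return codebook + beta * commitment

# Toy check: the two placements weight the encoder constraint differently.
z = torch.randn(2, 4, requires_grad=True)
z_q = torch.randn(2, 4, requires_grad=True)
print(quantizer_loss_as_written(z, z_q), quantizer_loss_vqvae_paper(z, z_q))
```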
Although the VQ-VAE paper noted that "the results did not vary for values of β ranging from 0.1 to 2.0", β in this code is attached to the wrong term and set to 0.25, so the commitment loss carries weight 1 against 0.25 on the codebook term. The effective weight of the commitment loss, the real β, is therefore 1/0.25 = 4.0, outside the range the paper tested.
Is there any possibility that this affects the performance of the VQ-VAE?