
relu at the end of encoder

Open Bemfool opened this issue 3 years ago • 0 comments

After training, I noticed that none of the latent codes are less than 0, which is likely caused by the F.relu at the end of the encoder. I think it's unnecessary and harmful for the variational CoMA, so I replaced it with a LeakyReLU for the mean and a Sigmoid for the std, which works.
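For illustration, here is a minimal sketch of what the proposed change could look like. The module and layer names (`VariationalHead`, `fc_mu`, `fc_std`) are hypothetical, not taken from the pytorch_coma code; the point is only that the mean branch uses a LeakyReLU (so latent means can be negative) while the std branch uses a Sigmoid (so stds stay positive), instead of a single F.relu over the whole latent code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalHead(nn.Module):
    """Hypothetical final encoder layer illustrating the suggested fix:
    LeakyReLU on the mean, Sigmoid on the std, rather than F.relu on both."""

    def __init__(self, in_features: int, latent_dim: int):
        super().__init__()
        self.fc_mu = nn.Linear(in_features, latent_dim)
        self.fc_std = nn.Linear(in_features, latent_dim)

    def forward(self, x: torch.Tensor):
        mu = F.leaky_relu(self.fc_mu(x))     # mean can take negative values
        std = torch.sigmoid(self.fc_std(x))  # std constrained to (0, 1)
        return mu, std

head = VariationalHead(in_features=16, latent_dim=8)
mu, std = head(torch.randn(4, 16))
print(mu.shape, std.shape)
```

Note that a Sigmoid caps the std at 1; another common choice for a positive, unbounded std is Softplus, but the Sigmoid variant above matches what worked in my experiments.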

Bemfool avatar Dec 02 '22 06:12 Bemfool