pytorch_coma
ReLU at the end of the encoder
After training, I noticed that all latent codes are non-negative, which is likely caused by the `F.relu` at the end of the encoder. I think it is unnecessary and even harmful for the variational CoMA, since it clamps the mean of the latent distribution to be >= 0. So I replaced it with a LeakyReLU for the mean and a Sigmoid for the std, which works.
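For reference, a minimal sketch of the change I mean (the module and variable names `VariationalHead`, `fc_mu`, `fc_std` are mine for illustration, not taken from this repo's code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalHead(nn.Module):
    """Final encoder layer producing mean and std of q(z|x)."""
    def __init__(self, in_features, latent_dim):
        super().__init__()
        self.fc_mu = nn.Linear(in_features, latent_dim)
        self.fc_std = nn.Linear(in_features, latent_dim)

    def forward(self, x):
        # LeakyReLU leaves the mean effectively unconstrained in sign,
        # unlike F.relu, which forces every latent code to be >= 0.
        mu = F.leaky_relu(self.fc_mu(x))
        # Sigmoid bounds the std to (0, 1), keeping it strictly positive.
        std = torch.sigmoid(self.fc_std(x))
        return mu, std

def sample_latent(mu, std):
    # Standard reparameterization trick: z = mu + std * eps
    eps = torch.randn_like(std)
    return mu + std * eps
```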