
KL divergence

Open · rainyrainyguo opened this issue 7 years ago · 1 comment

Can you give some explanation of the KL divergence term? I am a little confused by:

```python
kl_loss = torch.mean(0.5 * torch.sum(torch.exp(logvar) + mu**2 - 1 - logvar, 1))
```

Thank you so much!

rainyrainyguo · Aug 28 '18

It is the KL divergence between two Gaussian distributions: the posterior q(z|h) = N(mu, var) and the prior p(z) = N(0, 1). For a diagonal Gaussian posterior this KL has a closed form, 0.5 * sum(exp(logvar) + mu^2 - 1 - logvar), summed over the latent dimensions, which is exactly what that line computes (the outer torch.mean averages over the batch). See the original VAE paper: https://arxiv.org/pdf/1312.6114.pdf
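For reference, here is a minimal sketch (with made-up toy tensors standing in for the encoder outputs `mu` and `logvar`; the shapes are illustrative) that cross-checks the closed-form expression against `torch.distributions.kl_divergence`:

```python
import torch
from torch.distributions import Normal, kl_divergence

# Toy stand-ins for the encoder outputs: batch of 4, latent dim 8
mu = torch.randn(4, 8)
logvar = torch.randn(4, 8)

# Closed-form KL from the issue: sum over latent dims, then mean over the batch
kl_loss = torch.mean(0.5 * torch.sum(torch.exp(logvar) + mu**2 - 1 - logvar, 1))

# Cross-check: KL( N(mu, sigma) || N(0, 1) ) with sigma = exp(0.5 * logvar)
q = Normal(mu, torch.exp(0.5 * logvar))
p = Normal(torch.zeros_like(mu), torch.ones_like(mu))
kl_ref = kl_divergence(q, p).sum(1).mean()

print(torch.allclose(kl_loss, kl_ref))  # True
```

Summing over latent dimensions before averaging gives the per-sample KL of the full diagonal Gaussian, since the KL of a diagonal Gaussian factorizes over dimensions.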

JianLiu91 · Nov 21 '18