Question about eq(9) in your paper.

Open randydkx opened this issue 3 years ago • 12 comments

Hi, thanks for your paper and code. I have a question about Eq.(9) in your paper: it seems that this equation is p(ci|xi), not p(xi|ci). I think p(xi|ci) should contain only a single Gaussian distribution, so that the integral over xi equals 1. Could you explain this for me?

randydkx avatar Apr 05 '22 08:04 randydkx
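A minimal numpy sketch of the point being raised, with illustrative names (`x`, `prototypes`, `tau`) that are assumptions, not taken from the PCL code: an exponential of prototype similarities normalized over the clusters sums to 1 over c, which is the defining property of a posterior p(c|x), not of a density p(x|c) on the X space.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=8)                # one embedding (hypothetical)
prototypes = rng.normal(size=(5, 8))  # C = 5 cluster prototypes (hypothetical)
tau = 0.1                             # temperature

# Softmax over prototype similarities, computed stably.
logits = prototypes @ x / tau
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# The normalization constant runs over the C clusters, so `probs` is a
# categorical distribution over c given x; nothing here integrates to 1
# over the continuous variable x.
print(probs.sum())
```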

I have the same question. I think this part of the paper is wrong, since Eq.(9) is not actually a valid probability form.

DongXiang-CV avatar Apr 30 '22 02:04 DongXiang-CV

@LiJunnan1992 Could you explain this? I am very confused about this.

Thank you very much!

DongXiang-CV avatar Apr 30 '22 20:04 DongXiang-CV

Each x_i is an independent sample, so you do not need to marginalize over x_i.

LiJunnan1992 avatar May 01 '22 01:05 LiJunnan1992

Thank you very much for your reply! However, in probability theory only a random variable has a PDF (probability density function); an independent sample cannot have one.

DongXiang-CV avatar May 01 '22 19:05 DongXiang-CV

@DongXiang-CV It's OK to have unnormalized probabilities, but what confuses me is that Eq.(9) looks like p(ci|xi), not p(xi|ci), with the prior fixed to be uniform, i.e. p(ci=k)=1/C for k=1,...,C. Could you explain this for me? @LiJunnan1992

randydkx avatar May 02 '22 00:05 randydkx

@randydkx What do you mean, it's OK to have unnormalized probabilities? If you compare this with a Gaussian mixture model, you will find that this part of the paper differs. The authors used this inverse likelihood because they also want to learn representations, not just do clustering, but the inverse likelihood yields unnormalized probabilities, which is not very reasonable compared with a Gaussian mixture model.

DongXiang-CV avatar May 02 '22 14:05 DongXiang-CV

> @DongXiang-CV It's ok to have unnormalized probabilities, but what confused me is that I think eq9 is p(ci|xi), not p(xi|ci), with prior probabilities fixed for every ci i.e. p(ci=k)=1/C for k =1,...,C. Could you explain it for me? @LiJunnan1992

I agree with you and I am also confused that Eq. 9 is p(c|x), not p(x|c).

Hzzone avatar May 02 '22 16:05 Hzzone

> @randydkx what do you mean it's ok to have unnormalized probabilities? If you look at the Gaussian mixture model, you will find this part in the paper has some differences. The reason why the authors did the inverse likelihood is that they also want to learn representation not just do the clustering, but this inverse likelihood results in this unnormalized probabilities which is not very reasonable compared to Gaussian mixture model.

In other words, p(x|c) should be an ordinary Gaussian distribution, not a categorical one, and the posterior p(c|x) is the categorical distribution over clusters c. The current p(x|c) in the paper is a categorical distribution over c, which is not reasonable since it is not a density on the X space; p(x|c) should simply be a Gaussian distribution.

DongXiang-CV avatar May 02 '22 16:05 DongXiang-CV
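For concreteness, here is a hedged sketch of the standard GMM view described above, with illustrative names (`x`, `mu`, unit-variance components) that are assumptions for the example: p(x|c) is a Gaussian density on the X space, and the posterior p(c|x) follows from Bayes' rule with a uniform prior p(c)=1/C.

```python
import numpy as np

rng = np.random.default_rng(1)
d, C = 8, 5
x = rng.normal(size=d)           # one sample (hypothetical)
mu = rng.normal(size=(C, d))     # C Gaussian component means (hypothetical)

# log p(x|c) for isotropic unit-variance Gaussians: a proper density on X.
sq_dist = ((x - mu) ** 2).sum(axis=1)
log_px_c = -0.5 * sq_dist - 0.5 * d * np.log(2 * np.pi)

# Bayes' rule with uniform prior p(c) = 1/C; the constant terms and the
# prior cancel in the normalization, leaving a softmax over -||x-mu_c||^2/2.
log_post = log_px_c - np.log(C)
posterior = np.exp(log_post - log_post.max())
posterior /= posterior.sum()

# `posterior` is the categorical p(c|x); `log_px_c` is the Gaussian log p(x|c).
print(posterior)
```

With isotropic Gaussians, this posterior is exactly a softmax over negative squared distances to the means, which is why a softmax over similarities reads as p(c|x) rather than p(x|c).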

> @DongXiang-CV It's ok to have unnormalized probabilities, but what confused me is that I think eq9 is p(ci|xi), not p(xi|ci), with prior probabilities fixed for every ci i.e. p(ci=k)=1/C for k =1,...,C. Could you explain it for me? @LiJunnan1992

> I agree with you and I am also confused that Eq. 9 is p(c|x), not p(x|c).

I am also confused with this. I think our questions are essentially the same.

DongXiang-CV avatar May 02 '22 16:05 DongXiang-CV

Thanks for your work! It seems that only with Eq.(9) can we get the loss term in Eq.(10). I wonder whether the assumption in Eq.(9) has appeared in other related literature before? @LiJunnan1992

wwangwitsel avatar May 06 '22 03:05 wwangwitsel

@wwangwitsel I agree with you; without Eq.(9) the whole mathematical model is not valid. I look forward to the author's answer. @LiJunnan1992

ZJZhang123 avatar Jul 06 '22 09:07 ZJZhang123

> Hi, thanks for your paper and code. I have a question about Eq.(9) in your paper: it seems that this equation is p(ci|xi), not p(xi|ci). I think p(xi|ci) contains only a single Gaussian distribution, so the integral over xi equals 1. Could you explain this for me?

Have you figured this out? I'm also confused about it. Can you help me?

Yun-Fu avatar Sep 24 '23 11:09 Yun-Fu