
BPR loss sigmoid vs softplus

Open YuZhang10 opened this issue 3 years ago • 2 comments

Hello, thanks for your implementation in the first place. I notice that in the BPR loss you used softplus instead of sigmoid, which differs from the original paper. Could you please explain why? Thx~

YuZhang10 avatar Mar 28 '23 15:03 YuZhang10

They are the same! With sigmoid, BPR loss = -log σ(pos - neg) = -log(1 / (1 + e^-(pos - neg))). With softplus, BPR loss = softplus(neg - pos) = log(1 + e^(neg - pos)) = -log(1 / (1 + e^-(pos - neg))). They are equal, since sigmoid(x) = 1 / (1 + e^(-x)) and softplus(x) = log(1 + e^x).

zbt78 avatar Apr 04 '23 02:04 zbt78
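The equivalence above is easy to check numerically. Here is a minimal stdlib-only sketch (the helper functions `sigmoid` and `softplus` are defined locally for illustration, not taken from the repo):

```python
import math

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    # softplus(x) = log(1 + e^x); log1p is used for numerical accuracy
    return math.log1p(math.exp(x))

# For a few (pos, neg) score pairs, the two BPR loss forms agree
for pos, neg in [(2.0, -1.0), (0.5, 0.3), (-1.0, 1.5)]:
    loss_via_sigmoid = -math.log(sigmoid(pos - neg))   # -log σ(pos - neg)
    loss_via_softplus = softplus(neg - pos)            # log(1 + e^(neg - pos))
    assert abs(loss_via_sigmoid - loss_via_softplus) < 1e-12
```

In practice, the softplus form is often preferred because `log(1 + e^x)` can be computed stably (e.g. via `log1p` or PyTorch's `F.softplus`), whereas taking the log of a sigmoid that has underflowed to 0 produces `-inf`.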