
About total_loss in prognn

Open songshuhan opened this issue 4 years ago • 5 comments

```python
total_loss = loss_fro \
             + args.gamma * loss_gcn \
             + args.alpha * loss_l1 \
             + args.beta * loss_nuclear \
             + args.phi * loss_symmetric
```

In ProGNN, shouldn't `total_loss` be `loss_differential + args.alpha * loss_l1 + args.beta * loss_nuclear`?

songshuhan avatar Apr 11 '22 07:04 songshuhan

Hi songshuhan,

`args.phi` is usually set to 0; for `gamma`, you can refer to Eq. (9) of the paper.
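For reference, the overall objective can be sketched in terms matching the coefficients in `total_loss` (this is a reconstruction from the code, not the paper's exact notation; $S$ is the learned adjacency, $A$ the original one):

$$
\min_{S,\,\theta}\;
\|S - A\|_F^2
+ \alpha \|S\|_1
+ \beta \|S\|_*
+ \gamma\, \mathcal{L}_{GNN}(\theta, S, X, y_L)
+ \lambda\, \mathrm{tr}\!\left(X^\top \hat{L}_S X\right)
+ \phi\, \|S - S^\top\|_F^2
$$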

ChandlerBang avatar Apr 14 '22 23:04 ChandlerBang

Thanks for your reply! But what I mean is: why doesn't `args.lambda_ * loss_smooth_feat` appear in `total_loss`?

songshuhan avatar Apr 15 '22 09:04 songshuhan

Hi,

`total_loss` is not used in backpropagation. The variable `loss_differential` at line 176 is the one used in backpropagation.
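To illustrate the distinction, here is a minimal sketch of the two quantities. All loss values and hyperparameter settings below are hypothetical placeholders (in ProGNN they are PyTorch tensors and `args.*` values); the point is only which terms drive the gradient step and which are merely reported:

```python
# Hypothetical scalar stand-ins for each loss term.
loss_fro, loss_gcn, loss_smooth_feat = 0.5, 1.2, 0.3
loss_l1, loss_nuclear, loss_symmetric = 2.0, 4.0, 0.1

# Assumed hyperparameter values, for illustration only.
gamma, lambda_, alpha, beta, phi = 1.0, 5e-4, 5e-4, 1.5, 0.0

# The quantity actually backpropagated when updating the adjacency:
# it includes the feature-smoothness term (lambda_ * loss_smooth_feat).
loss_differential = (loss_fro
                     + gamma * loss_gcn
                     + lambda_ * loss_smooth_feat
                     + phi * loss_symmetric)

# total_loss is only computed for reporting; the l1 and nuclear-norm
# terms are handled by proximal operators in the paper's algorithm,
# so they do not need to appear in the backpropagated loss.
total_loss = (loss_fro
              + gamma * loss_gcn
              + alpha * loss_l1
              + beta * loss_nuclear
              + phi * loss_symmetric)

print(round(loss_differential, 6))  # 1.70015
print(round(total_loss, 6))         # 7.701
```

The two sums share the Frobenius, GNN, and symmetry terms but differ in which regularizers they carry, which is why `loss_smooth_feat` appears in one and not the other.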

ChandlerBang avatar Apr 15 '22 12:04 ChandlerBang

Thanks for your reply! Could I have your WeChat? I'd like to ask you a few questions.

songshuhan avatar Apr 25 '22 06:04 songshuhan

Hi, sorry for the late reply. You can email me at [email protected] and I will send you my WeChat number.

ChandlerBang avatar Jul 20 '22 03:07 ChandlerBang