About total_loss in prognn
total_loss = loss_fro
+ args.gamma * loss_gcn
+ args.alpha * loss_l1
+ args.beta * loss_nuclear
+ args.phi * loss_symmetric
In prognn, shouldn't total_loss instead be
"loss_diffiential
+ args.alpha * loss_l1
+ args.beta * loss_nuclear" ?
Hi songshuhan,
args.phi is usually set to 0; for gamma you can refer to Eq (9) of the paper.
Thanks for your reply! But what I meant is: why does "args.lambda_ * loss_smooth_feat" not appear in total_loss?
Hi,
total_loss is only used for logging, not for backward propagation. The variable loss_differential in line 176 is the one that is actually backpropagated.
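To illustrate why the l1 and nuclear terms can live outside the backpropagated loss: Pro-GNN optimizes the non-smooth l1 and nuclear-norm penalties with proximal operators applied after the gradient step, rather than through autograd. Below is a minimal numpy sketch of those two proximal operators; the function names `prox_l1` and `prox_nuclear` and the toy matrix are illustrative, not taken from the repo.

```python
import numpy as np

def prox_l1(S, alpha):
    # Soft-thresholding: proximal operator of alpha * ||S||_1
    return np.sign(S) * np.maximum(np.abs(S) - alpha, 0.0)

def prox_nuclear(S, beta):
    # Singular-value thresholding: proximal operator of beta * ||S||_*
    U, sigma, Vt = np.linalg.svd(S, full_matrices=False)
    sigma = np.maximum(sigma - beta, 0.0)
    return (U * sigma) @ Vt

# One proximal-gradient iteration on the estimated adjacency S would be:
# 1) a gradient step on the differentiable terms (loss_fro, gamma * loss_gcn, ...),
#    which is what loss_differential.backward() handles;
# 2) the proximal operators for the non-smooth terms, applied directly:
S = np.array([[0.0, 0.9],
              [0.9, 0.0]])
S = prox_l1(S, 0.1)       # shrinks entries toward zero (promotes sparsity)
S = prox_nuclear(S, 0.1)  # shrinks singular values (promotes low rank)
```

So alpha * loss_l1 and beta * loss_nuclear only need to appear in total_loss for monitoring the full objective; their minimization happens in the proximal steps, not in backward().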
Thanks for your reply! Could I have your WeChat? I'd like to ask you a few questions.
Hi, sorry for the late reply. You can email me at [email protected] and I will email you the wechat number.