
About the time to end training

ZhenyuLiu-SYSU opened this issue 3 years ago · 3 comments

Hello! Thank you so much for doing such a good job. I would like to know when I should end training to get the best results. Is there any basis for that judgment?

Thank you!

ZhenyuLiu-SYSU avatar Apr 23 '22 01:04 ZhenyuLiu-SYSU

Hello Zhenyu, many thanks for your kind words! For best results, you may record the FID score every epoch. But a better FID score does not always mean a better translation, so I would suggest referring to FID, KID, and LPIPS together for a comprehensive evaluation.
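As a rough sketch of that per-epoch FID tracking (not part of the official repo: the folder paths and epoch range below are hypothetical, it assumes translated test images have already been generated for each checkpoint, and it relies on the `pytorch-fid` package being installed):

```python
# Sketch: track FID across saved checkpoints.
# Assumes `pip install pytorch-fid` and that fake images for each epoch
# were already dumped into per-epoch folders (paths below are hypothetical).
import subprocess

REAL_DIR = "datasets/winter2summer/testB"            # hypothetical: real target-domain images
FAKE_DIR_TEMPLATE = "results/epoch_{epoch}/fake_B"   # hypothetical: generated images per epoch

def fid_for_epoch(epoch: int) -> float:
    """Run the pytorch-fid CLI on one epoch's outputs and parse the score."""
    out = subprocess.run(
        ["python", "-m", "pytorch_fid", REAL_DIR, FAKE_DIR_TEMPLATE.format(epoch=epoch)],
        capture_output=True, text=True, check=True,
    )
    # pytorch-fid prints a line like "FID:  42.37"; take the last token.
    return float(out.stdout.strip().split()[-1])

if __name__ == "__main__":
    scores = {e: fid_for_epoch(e) for e in range(5, 205, 5)}  # e.g. every 5th epoch
    best = min(scores, key=scores.get)
    print(f"Lowest FID {scores[best]:.2f} at epoch {best}")
```

The lowest-FID epoch is only a starting point; as noted above, it is worth cross-checking the same checkpoints with KID and LPIPS before picking a final model.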

Usually I would suggest training for about 1M iterations. For instance, on a dataset containing 5,000 images, 200 epochs should be fine.
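The arithmetic behind that rule of thumb, assuming a batch size of 1 so each iteration consumes one image, is just a ceiling division (a toy calculation, not code from the repo):

```python
# Toy arithmetic: epochs needed to reach a target iteration count,
# assuming batch size 1 (one image per iteration).
def epochs_for_iterations(target_iters: int, dataset_size: int, batch_size: int = 1) -> int:
    iters_per_epoch = dataset_size // batch_size
    return -(-target_iters // iters_per_epoch)  # ceiling division

print(epochs_for_iterations(1_000_000, 5_000))  # -> 200
```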

JunlinHan avatar Apr 23 '22 04:04 JunlinHan

Thanks for your reply! I tried training the code on the winter2summer dataset and found that DCL's FID is higher than SimDCL's FID. This doesn't seem consistent with your paper, where DCL is reported to work better than SimDCL. Do you think this is normal?

Thanks!

ZhenyuLiu-SYSU avatar Apr 24 '22 04:04 ZhenyuLiu-SYSU

Oh, I guess that's OK. Datasets may have some bias; I wouldn't be surprised if the difference is not very significant (say, 100 for SimDCL but 130+ for DCL).

JunlinHan avatar Apr 24 '22 15:04 JunlinHan