
Why retrain at about epoch 95 with data augmentation disabled?

Open DeppMeng opened this issue 8 years ago • 3 comments

Hi, I am trying to reproduce the ResNet ImageNet results with MXNet.

I noticed that you have successfully reproduced the ResNet results from Facebook, but you mentioned that

when epoch is about 95, cancel the scale/color/aspect augmentation during training; this can be done by commenting out just 6 lines of code, like this:

I don't quite understand why this procedure is needed to get similar results, because as far as I know Torch/PyTorch doesn't need it. Does that mean that if we skip this procedure, the result will suffer (come out lower than the paper claims)?

Thanks
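For readers unfamiliar with the procedure being asked about, here is a minimal sketch of what "disable scale/color/aspect augmentation after a cutoff epoch" could look like. The function and augmentation names are illustrative assumptions, not the repository's actual 6 lines of code:

```python
def build_augmentations(epoch, aug_cutoff_epoch=95):
    """Return the augmentation names active at a given epoch.

    Mirrors the procedure described above: after the cutoff epoch,
    the scale/color/aspect augmentations are dropped (the
    "comment out 6 lines" step), leaving only the basic ones.
    All names here are hypothetical placeholders.
    """
    augmentations = ["random_crop", "horizontal_flip"]  # kept for the whole run
    if epoch < aug_cutoff_epoch:
        # These are the augmentations disabled near the end of training.
        augmentations += ["random_scale", "color_jitter", "aspect_ratio_jitter"]
    return augmentations

# Early in training: full augmentation is active.
print(build_augmentations(epoch=50))
# After epoch 95: only the basic crop/flip remain.
print(build_augmentations(epoch=96))
```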

DeppMeng avatar Jan 30 '18 07:01 DeppMeng

I have the same question; any clues so far?

jamesdeep avatar Mar 06 '19 09:03 jamesdeep

@DeppMeng I think data augmentation may change the overall distribution of the dataset. When trained with a small learning rate, the network may end up fitting that distorted distribution, so the final low-LR epochs are run on undistorted data.

By the way, do you have the training log / script for PyTorch? Thanks :)

ky-du avatar Dec 10 '19 11:12 ky-du

Sorry, I am no longer doing research in this field :)

DeppMeng avatar Dec 11 '19 07:12 DeppMeng