goldentimecoolk

Results 22 comments of goldentimecoolk

Yeah, I agree. It would be better to look into the logs. I'm looking forward to it. :-)

Hi, thanks for your quick response. I trained with the original code and the default settings in the `.sh` file, but only got 79% top-1. In the paper, this seems to be...

> @wuwusky youxiu, Have you compared your re-implementation with the original paper on CUB datasets?

youxiu, that's impressive (666)! What's your implementation result on CUB-200?

I have the same problem, getting ~52% accuracy on the bird test set. I guess `data_transforms` might be one tricky part, because parameters including mean, std, eigval, and eigvec are computed on ImageNet...
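To make it concrete, here is a minimal sketch of the kind of pipeline I mean (assuming torchvision; the mean/std values are the commonly cited ImageNet statistics, and the eigval/eigvec would feed an ImageNet-derived PCA "lighting" augmentation, none of which were computed on CUB-200):

```python
import torchvision.transforms as transforms

# Minimal sketch of a typical training transform (assumes torchvision).
# These normalization statistics come from ImageNet, not from CUB-200.
imagenet_mean = [0.485, 0.456, 0.406]
imagenet_std = [0.229, 0.224, 0.225]
# eigval/eigvec would go into an AlexNet-style PCA "lighting" augmentation
# (also ImageNet-derived); torchvision has no built-in for it, so it is omitted here.

train_transform = transforms.Compose([
    transforms.RandomResizedCrop(448),  # 448 is just an illustrative crop size
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=imagenet_mean, std=imagenet_std),
])
```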

I used their pretrained model and got 80%+ accuracy on CUB-200 just now. As for the transform, do you mean the params they provide based on ImageNet don't need to be modified?

I have 4 TITAN Xp GPUs and changed the batch size to 16, but when I use `PreActResNet101` there is still a CUDA out-of-memory error. When I merely change this model...

I got it. I noticed that I cropped the images to 448\*448, and that was the reason. When I changed it to 32\*32, very little GPU memory was used. But...
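For anyone hitting the same thing, here is a rough sketch of how the crop size affects peak GPU memory (assuming a CUDA machine; torchvision's `resnet50` is just a stand-in for the actual model, since activation memory grows roughly with the spatial resolution):

```python
import torch
import torchvision.models as models

# Rough sketch (assumes a CUDA GPU and torchvision): activation memory grows
# roughly with H*W, so a 448x448 crop needs far more memory than a 32x32 one.
# resnet50 is only a stand-in for the model actually used in the repo.
model = models.resnet50().cuda()

for size in (32, 448):
    torch.cuda.reset_peak_memory_stats()
    x = torch.randn(16, 3, size, size, device="cuda")  # batch size 16
    model(x).sum().backward()
    model.zero_grad(set_to_none=True)
    peak_mib = torch.cuda.max_memory_allocated() / 1024**2
    print(f"{size}x{size}: {peak_mib:.0f} MiB peak")
```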

> I got it. I noticed that I cropped the images to 448*448, and that was the reason. When I changed it to 32*32, very little GPU memory was used....

Because images in CIFAR-10 are all 32*32.
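A quick way to confirm this, assuming torchvision is installed:

```python
from torchvision import datasets, transforms

# Sanity check: CIFAR-10 images are 32x32 RGB.
ds = datasets.CIFAR10(root="./data", train=False, download=True,
                      transform=transforms.ToTensor())
img, _ = ds[0]
print(img.shape)  # torch.Size([3, 32, 32])
```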