group_normalization
GPU memory problem with Group Normalization
Hi! Thanks for your great work. I built a U-Net model with PyTorch. In my tests, using BN (Batch Normalization) consumed about 1.8 GB of GPU memory, but switching to GN (Group Normalization) raised that to about 2.2 GB. The larger the inputs, the bigger the gap becomes. I can't figure out why GN costs so much more. Could you give me some advice about it?
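For anyone trying to reproduce the gap, a minimal sketch is below. It is not the original poster's U-Net; it assumes a simple two-conv block, 64 channels, a 256x256 input, and 32 groups for GroupNorm, and measures peak allocated memory for a forward + backward pass with each normalization layer.

```python
# Minimal sketch (assumed block, not the poster's model): compare peak GPU memory
# of nn.BatchNorm2d vs nn.GroupNorm on a small conv block.
import torch
import torch.nn as nn

def conv_block(channels, norm_factory):
    # Two 3x3 convs, each followed by the chosen normalization layer and ReLU.
    return nn.Sequential(
        nn.Conv2d(channels, channels, 3, padding=1),
        norm_factory(channels),
        nn.ReLU(inplace=True),
        nn.Conv2d(channels, channels, 3, padding=1),
        norm_factory(channels),
        nn.ReLU(inplace=True),
    )

def peak_memory_mb(norm_factory, channels=64, size=256, batch=4):
    torch.cuda.reset_peak_memory_stats()
    model = conv_block(channels, norm_factory).cuda()
    x = torch.randn(batch, channels, size, size, device="cuda", requires_grad=True)
    model(x).sum().backward()      # forward + backward, like a training step
    torch.cuda.synchronize()
    return torch.cuda.max_memory_allocated() / 1024 ** 2

if __name__ == "__main__":
    bn = lambda c: nn.BatchNorm2d(c)
    gn = lambda c: nn.GroupNorm(num_groups=32, num_channels=c)  # 32 groups is an assumption
    print(f"BatchNorm peak: {peak_memory_mb(bn):.1f} MB")
    print(f"GroupNorm peak: {peak_memory_mb(gn):.1f} MB")
```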
I ran into this problem as well.