SnowflakeNet
About training duration
Hi, I wonder how you define your training strategy (e.g., the batch size in the PCN case and hardware info) and how many epochs it takes to converge? Looking forward to your reply!
Hi, we use 4 NVIDIA RTX 2080 Ti GPUs for training, the batch size (at least 64) is set to fully occupy GPU memory, and it takes about 300-400 epochs to converge.
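For reference, here is a minimal sketch of what such a multi-GPU setup could look like in PyTorch. This is not SnowflakeNet's actual training script; the model, data, and loss are hypothetical placeholders, and it only illustrates the settings mentioned above (data-parallel training over all visible GPUs, batch size 64, roughly 300-400 epochs).

```python
# Hypothetical sketch of the described training configuration, not the authors' code.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder completion network and dummy point-cloud tensors (hypothetical).
model = nn.Sequential(nn.Linear(3, 128), nn.ReLU(), nn.Linear(128, 3))
dataset = TensorDataset(torch.randn(256, 3), torch.randn(256, 3))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.DataParallel(model).to(device)   # splits each batch across all visible GPUs
loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()                    # stand-in for a Chamfer-distance loss

for epoch in range(400):                    # ~300-400 epochs to converge, per the reply above
    for partial, complete in loader:
        partial, complete = partial.to(device), complete.to(device)
        optimizer.zero_grad()
        loss = criterion(model(partial), complete)
        loss.backward()
        optimizer.step()
```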
Copy that! It would be a good reference.