FlowFormer-Official
The training stage is so slow.
I am training on 4x V100 (32 GB) GPUs with the "small_things_eval.py" config file and the "C+T+K+S" dataset strategy. The largest batch size that fits is 4, and I have set the DataLoader with pin_memory=True and num_workers=16. Even so, training is extremely slow. Is there any way to speed up the training stage?
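For reference, a minimal sketch of the DataLoader setup described above. The dataset here is a dummy placeholder (FlowFormer's real C+T+K+S loaders are in the repo's datasets code); only the loader arguments mirror the question, and `persistent_workers` is an extra assumption that avoids re-spawning worker processes every epoch:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy stand-in for the optical-flow dataset; shapes are illustrative only.
images = torch.zeros(64, 3, 368, 496)
flows = torch.zeros(64, 2, 368, 496)
dataset = TensorDataset(images, flows)

loader = DataLoader(
    dataset,
    batch_size=4,             # max that fits on 4x V100 32GB, per the question
    shuffle=True,
    pin_memory=True,          # page-locked host memory for faster GPU transfer
    num_workers=16,           # parallel worker processes for data loading
    persistent_workers=True,  # assumption: keep workers alive across epochs
)

for imgs, flo in loader:
    pass  # training step would go here
```

With this setup, one quick diagnostic is to time the loop body with and without the actual forward/backward pass: if iterating the bare loader is already slow, the bottleneck is data loading rather than the model.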
How long does it take to train 1 epoch?