Training on a smaller GPU?
Hi, is there any way to train this on a smaller GPU setup for us mere mortals? On an 8 GB 2070 with the batch size set to 1, it still seems to run out of memory.
Thanks for any help.
I have to add this to the training loop to get it to fit on my 24 GB card (dataset dependent):
if txt.size(1) <= 1 or txt.size(1) > 550:
    # skip empty texts and anything longer than 550 tokens
    continue
This essentially skips data with long (or empty) utterances.
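For anyone unsure where this goes: presumably at the top of the per-batch loop in train.py, before the forward pass, so oversized batches never reach the model. A minimal runnable sketch of the check itself, using dummy tensors in place of real batches:

import torch

# dummy text batches of shape (batch, text_len); stand-ins for what the
# real data loader in train.py would yield
texts = [torch.randint(0, 100, (1, n)) for n in (30, 600, 120)]

for txt in texts:
    # skip empty texts and anything past 550 tokens, since long
    # utterances are what exhaust GPU memory
    if txt.size(1) <= 1 or txt.size(1) > 550:
        continue
    print("keeping utterance of length", txt.size(1))
    # ... in train.py: forward pass, loss, backward, optimizer step ...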
You are less likely to face this issue when training from scratch, but you can also remove any utterance that is longer than 6 seconds.
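If you'd rather prune those clips up front than skip them at train time, a small script can filter the filelists by audio duration. This is only a sketch: it assumes the pipe-separated audio_path|text|speaker_id filelist format, and the paths and 6-second threshold are examples to adjust:

import soundfile as sf

def filter_filelist(in_path, out_path, max_seconds=6.0):
    # keep only entries whose audio is at most max_seconds long
    kept = []
    with open(in_path, encoding="utf-8") as f:
        for line in f:
            audio_path = line.rstrip("\n").split("|")[0]
            info = sf.info(audio_path)  # reads the header only, no decoding
            if info.frames / info.samplerate <= max_seconds:
                kept.append(line)
    with open(out_path, "w", encoding="utf-8") as f:
        f.writelines(kept)

# example paths; point these at your own filelists
filter_filelist("filelists/train.txt", "filelists/train_filtered.txt")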