train with multi-gpus
@lllyasviel @williamyang1991 @eltociear @camenduru
hello bro,
Is it possible to train the tutorial_train_sd21.py with multi-GPUs?
I have modified
trainer = pl.Trainer(gpus=1, precision=32, callbacks=[logger])
to
trainer = pl.Trainer(gpus=4, accelerator='dp', precision=32, callbacks=[logger])
but it doesn't work.
Can someone help me?
What was the error?
I get the same problem
You'll need to set a distributed strategy so that multiple GPUs don't access the same files. The following arguments work for me:
trainer = pl.Trainer(strategy="ddp", accelerator="gpu", devices=2, precision=32, callbacks=[logger])
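For context, here is a minimal sketch of how the trainer section of tutorial_train_sd21.py looks with those arguments applied. This assumes pytorch_lightning >= 1.6, where `strategy`, `accelerator`, and `devices` replace the older `gpus`/`accelerator='dp'` style; the `model`, `dataloader`, and `logger` names follow the tutorial script (`logger` is its ImageLogger callback), so adapt them to your setup:

```python
import pytorch_lightning as pl

# DDP launches one training process per GPU and syncs gradients between them,
# which avoids the file-access conflicts seen with the legacy 'dp' mode.
trainer = pl.Trainer(
    strategy="ddp",        # distributed data parallel
    accelerator="gpu",
    devices=4,             # number of GPUs to use
    precision=32,
    callbacks=[logger],    # ImageLogger callback from the tutorial script
)
trainer.fit(model, dataloader)
```

Note that under DDP the effective batch size is the per-GPU batch size multiplied by the number of devices, so you may want to adjust the learning rate or batch size accordingly.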
Solved my problem, thanks!