SimpleTuner
Training a Flux LoRA with 2 x 4090
Hello, would 2 x NVIDIA 4090s be enough to train a LoRA for the Flux model, or would more VRAM be needed during training? Is it possible to split the required memory across the two GPUs?
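For context, SimpleTuner launches training through Hugging Face Accelerate, which can shard optimizer state and gradients across both GPUs via DeepSpeed ZeRO. A minimal sketch of an Accelerate config enabling this (the ZeRO stage and mixed-precision choice here are assumptions for illustration, not values taken from SimpleTuner's documentation):

```yaml
# ~/.cache/huggingface/accelerate/default_config.yaml (sketch)
compute_environment: LOCAL_MACHINE
distributed_type: DEEPSPEED
num_machines: 1
num_processes: 2          # one process per 4090
mixed_precision: bf16     # assumption; pick what the model/training script supports
deepspeed_config:
  zero_stage: 2           # shards optimizer state + gradients across the 2 GPUs
  offload_optimizer_device: none
```

With a config like this, `accelerate launch` starts one process per GPU and ZeRO partitions the optimizer memory between them; note that ZeRO stage 2 does not split the model weights themselves, so each GPU still needs to hold a full copy of the (quantized or LoRA-frozen) base model.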