
Multi GPU Support?

AIisCool opened this issue 2 years ago • 2 comments

Loving the results, but I'm maxing out my GPU's 24 GB of VRAM. Can this be run on multiple GPUs, so that it continues on the second GPU instead of running out of VRAM? Are there any changes I have to make to this:

python inference.py \
--input inputs/general \
--config configs/model/cldm.yaml \
--ckpt weights/general_full_v1.ckpt \
--reload_swinir --swinir_ckpt weights/general_swinir_v1.ckpt \
--steps 50 \
--sr_scale 4 \
--image_size 512 \
--color_fix_type wavelet --resize_back \
--output results/general

?
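For context, splitting a model across two GPUs (model parallelism) is not a flag `inference.py` exposes; it would require editing the model-loading code. A minimal, hypothetical sketch of the idea in PyTorch, using a toy two-stage network rather than DiffBIR's actual architecture, and falling back to CPU when two GPUs are absent:

```python
# Hypothetical sketch: naive model parallelism by placing submodules on
# different devices. The network here is a toy stand-in, NOT DiffBIR's model.
import torch
import torch.nn as nn

class TwoStageNet(nn.Module):
    """Toy model whose two stages live on (possibly) different devices."""
    def __init__(self, dev0, dev1):
        super().__init__()
        self.dev0, self.dev1 = dev0, dev1
        self.stage1 = nn.Linear(16, 32).to(dev0)
        self.stage2 = nn.Linear(32, 8).to(dev1)

    def forward(self, x):
        x = self.stage1(x.to(self.dev0))
        # Hand the intermediate activation over to the second device.
        return self.stage2(x.to(self.dev1))

# Use two GPUs when available, otherwise run everything on CPU.
if torch.cuda.device_count() >= 2:
    dev0, dev1 = "cuda:0", "cuda:1"
else:
    dev0 = dev1 = "cpu"

net = TwoStageNet(dev0, dev1)
out = net(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 8])
```

Note that this only moves weights around; activations still cross devices at every stage boundary, so it trades speed for memory.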

AIisCool avatar Sep 09 '23 21:09 AIisCool

Hey, we've updated for tiled inference to avoid OOM. You can try this new feature to solve the problem.
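The idea behind tiled inference is to run the model on overlapping crops and average the results, so peak VRAM scales with the tile size rather than the full image. A minimal sketch of the scheme using NumPy and a hypothetical `process(tile)` stand-in for the model call; DiffBIR's actual tiled implementation differs in detail:

```python
# Sketch of tiled processing with overlap averaging. `process` is a
# hypothetical stand-in for the expensive per-tile model call.
import numpy as np

def process(tile):
    # Placeholder for the model: here, just scale the tile.
    return tile * 2.0

def positions(n, tile, stride):
    """Tile start offsets covering [0, n), clamped so the last tile fits."""
    ps = list(range(0, n - tile + 1, stride))
    if ps[-1] != n - tile:
        ps.append(n - tile)
    return ps

def tiled_run(img, tile=64, stride=48):
    h, w = img.shape
    out = np.zeros_like(img)
    weight = np.zeros_like(img)
    for y in positions(h, tile, stride):
        for x in positions(w, tile, stride):
            out[y:y+tile, x:x+tile] += process(img[y:y+tile, x:x+tile])
            weight[y:y+tile, x:x+tile] += 1.0
    # Average overlapping contributions.
    return out / weight

img = np.ones((128, 128))
res = tiled_run(img)
print(res.min(), res.max())  # 2.0 2.0
```

Smaller tiles lower the memory peak but increase the number of model calls, so the tile size is the knob for trading VRAM against runtime.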

ziyannchen avatar Nov 08 '23 08:11 ziyannchen

> Hey, we've updated for tiled inference to avoid OOM. You can try this new feature to solve the problem.

Hi, can this model be trained with 24 GB of memory?

Chantec avatar May 22 '24 12:05 Chantec