Muawiz Umar

Comments of Muawiz Umar

Use the following flags when launching the webUI: `--disable-xformers --opt-sdp-attention`.
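As a sketch, assuming the stock AUTOMATIC1111 stable-diffusion-webui launcher script, the flags could be passed like this (the `webui.sh` path and the `COMMANDLINE_ARGS` variable are assumptions based on that project's conventions):

```shell
# Option 1: pass the flags directly to the launcher script
./webui.sh --disable-xformers --opt-sdp-attention

# Option 2: set them persistently in webui-user.sh so every launch picks them up
export COMMANDLINE_ARGS="--disable-xformers --opt-sdp-attention"
```

`--opt-sdp-attention` switches attention to PyTorch's scaled-dot-product attention path, so xformers is not needed alongside it.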

I appreciate your amazing work! For me, torch-nightly takes 9-18 seconds per inference on the first 3 warm-up inferences, while stable torch takes 1-1.5 minutes per inference on the first 3 inferences am...

I have tested this slowdown on an H100 and an RTX 4090. With stable torch the slowdown is around 1 minute per inference; with torch-nightly it is around 3-7 seconds.