Jordain
I'm also experiencing similar issues. Using WSL. Wondering if the reason it's slow is because of this warning? WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for: PyTorch...
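In case it helps anyone else debugging that warning, here's a quick check (a rough sketch, assuming you run it in the same WSL environment/venv the UI uses) to see which PyTorch build is actually installed and whether xFormers is importable; `python -m xformers.info` will also report whether its C++/CUDA extensions loaded:

```python
# Compare the installed PyTorch/CUDA build against what xFormers was built for.
# Run inside the same environment (WSL venv) the UI launches with.
import torch

print("torch:", torch.__version__, "CUDA:", torch.version.cuda)

try:
    import xformers
    print("xformers:", xformers.__version__)
except ImportError:
    print("xformers is not importable in this environment")
```

If the versions don't line up, reinstalling xFormers built against your PyTorch/CUDA combo usually clears the warning.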
I find the quality doesn't improve much at 16, 8, 4; the best batch sizes to use are 4, 2, 1. Takes about 5 minutes to generate. Here...
Yeah, I have the same issue. If I switch from fixed to randomize and generate, it still keeps the same image. If I switch from randomize to fixed, it will...
You might not have enough VRAM to get it to run. Even with 24GB of VRAM it was a lot to handle. Some people were able to get it to run with...
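If anyone wants to check how much VRAM they actually have free before trying, here's a minimal sketch (assumes a single CUDA GPU at index 0 and a working PyTorch install):

```python
# Print free vs. total VRAM on GPU 0 using PyTorch.
import torch

if torch.cuda.is_available():
    free, total = torch.cuda.mem_get_info(0)
    print(f"free: {free / 1024**3:.1f} GiB / total: {total / 1024**3:.1f} GiB")
else:
    print("No CUDA device visible to PyTorch")
```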
Running into the same error.
Thanks @Wraithnaut, will try that.