
Lack of MPS support.

Open Vargol opened this issue 2 years ago • 3 comments

Hi, any chance you could update sgm/modules/diffusionmodules/model.py to use another memory-efficient attention function rather than hard-coding xFormers? From my brief play on Colab, the one built into PyTorch 2.0 and later (and used by Diffusers) is as good as xFormers, if not better, and it is compatible with MPS.
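For illustration, a minimal sketch of the kind of swap I mean, using PyTorch 2.0's built-in `torch.nn.functional.scaled_dot_product_attention`. The function and tensor layout here are illustrative assumptions, not the repo's actual code:

```python
import torch
import torch.nn.functional as F

def attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    # q, k, v: (batch, heads, seq_len, head_dim)
    if hasattr(F, "scaled_dot_product_attention"):  # PyTorch >= 2.0
        # Dispatches to a memory-efficient/flash kernel where available,
        # and works on MPS as well as CUDA.
        return F.scaled_dot_product_attention(q, k, v)
    # Fallback for older PyTorch: plain attention.
    scale = q.shape[-1] ** -0.5
    attn = torch.softmax((q @ k.transpose(-2, -1)) * scale, dim=-1)
    return attn @ v
```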

Also, can you update your requirements to use a newer tokenizers? tokenizers==0.12.1 is pretty much the only version that requires building from source on Apple Silicon computers (there is no arm64 wheel).

It would also be nice if you could stop assuming everyone is on CUDA, so we don't have to change every bit of example code to get rid of the hard-coded 'cuda' devices. :-)
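Something like this device pick would do; the `pick_device` helper is just a hypothetical example, not from the repo's code:

```python
import torch

def pick_device() -> torch.device:
    """Prefer CUDA, then MPS (Apple Silicon), then CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
# model = model.to(device)  # instead of a hard-coded model.cuda()
```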

Vargol avatar Nov 22 '23 11:11 Vargol

Totally agree !!! :)

yanomano avatar Nov 25 '23 15:11 yanomano

I did successfully run this repo's code on MPS back in the day (see https://github.com/Stability-AI/generative-models/pull/79 and my other PRs that haven't been accepted).

Those might help you hack some support in.

akx avatar Dec 04 '23 06:12 akx

Do we have a working M1/M2/M3 branch?

scottonly2 avatar May 25 '24 06:05 scottonly2