
Add macOS GPU (MPS) support for inference

Open huiofficial opened this issue 2 years ago • 3 comments

Use torch.backends.mps.is_available() to check whether a GPU is available to PyTorch on macOS.
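For example, a minimal sketch of that device check (illustrative only, not the actual inference.py code):

```python
import torch

# On Apple Silicon, prefer the Metal (MPS) backend when PyTorch supports it;
# otherwise fall back to the CPU.
if torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

print(device)
```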

Use .to(xx.device) to move tensors and modules onto the right device for each op.

Use os.environ['PYTORCH_ENABLE_MPS_FALLBACK'] = '1' to work around the error that operator 'aten::grid_sampler_3d' is not currently implemented for the MPS device (unsupported ops then fall back to the CPU).
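Putting the suggestions together, a rough sketch might look like the following (the variable names are illustrative, not taken from the actual PR; the fallback flag must be set before torch is imported):

```python
import os

# Enable CPU fallback BEFORE importing torch; otherwise ops without an MPS
# kernel (e.g. aten::grid_sampler_3d) raise NotImplementedError.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

import torch

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Move tensors/modules with .to(device) instead of hard-coding .cuda(),
# so the same code runs on MPS or CPU.
x = torch.randn(2, 3).to(device)
model = torch.nn.Linear(3, 4).to(device)
y = model(x)  # runs on MPS when available; unsupported ops fall back to CPU
```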

I also optimized the imports in inference.py.

huiofficial avatar Apr 29 '24 14:04 huiofficial

How fast is it?

yukiarimo avatar May 16 '24 05:05 yukiarimo

> How fast is it?

I tested it on my M1 and it ran about 2x faster.

bruno-cunha avatar May 17 '24 05:05 bruno-cunha

Thanks! It'd be great if you could give it a review.

huiofficial avatar May 17 '24 15:05 huiofficial