Underscore

Results 4 comments of Underscore

> Thanks @itlackey. I guess it should be a few changed lines (for the transformers loader):
>
> 1. `model = model.to("xpu")` in `modules.models.huggingface_loader`
> 2. `return input_ids.to(torch.device('xpu'))` in `modules.text-generation.encode`....
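
For context, a minimal sketch of what those two changes amount to, assuming `intel_extension_for_pytorch` is installed so that `torch.xpu` exists. The model name and the standalone `encode` function here are placeholders for illustration, not the webui's actual code:

```python
# Hypothetical sketch of the two quoted changes, not the webui's actual code.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401  registers the 'xpu' device with torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# 1. Move the model to the XPU after loading (the huggingface_loader change)
if hasattr(torch, "xpu") and torch.xpu.is_available():
    model = model.to("xpu")

# 2. Move the tokenized prompt to the XPU before generation (the encode() change)
def encode(prompt: str) -> torch.Tensor:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return input_ids.to(torch.device("xpu"))
    return input_ids
```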

`print(f"generations: input_ids set! model class: {shared.model.__class__.__name__} | has xpu {hasattr(torch, 'xpu')}")` in text-generation/modules prints: ![image](https://github.com/oobabooga/text-generation-webui/assets/47636331/db730a1b-7cb8-4abc-86b4-6ef571aa88bc) (using a GGUF model, though I'm trying to get CBLAS set up right now though,...

@oobabooga ![image](https://github.com/oobabooga/text-generation-webui/assets/47636331/4445fc1b-9653-48ec-a035-998d47678f87) It seems the error has something to do with what Yorizuka said. `hasattr(torch, 'xpu')` returned false in my previous message, so it's not detecting PyTorch XPU whatsoever. These...
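
A quick way to check what is going on here, i.e. whether the current environment exposes an XPU backend at all, is a sketch like the one below. It assumes the Intel extension is what should provide `torch.xpu`:

```python
# Diagnostic sketch: check whether this Python environment exposes torch.xpu at all.
import importlib.util
import torch

print("torch version:", torch.__version__)
print("intel_extension_for_pytorch installed:",
      importlib.util.find_spec("intel_extension_for_pytorch") is not None)

if hasattr(torch, "xpu"):
    # torch.xpu only exists when an XPU-enabled build / Intel extension is present
    print("XPU available:", torch.xpu.is_available())
    if torch.xpu.is_available():
        print("XPU device:", torch.xpu.get_device_name(0))
else:
    print("torch has no 'xpu' attribute -- PyTorch cannot see the Intel GPU")
```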

> Same problem. Python 3.12.4, CUDA 12.4/12.5. ENV variable CUDA_PATH set to the CUDA installation. Commenting out `cdll_args["winmode"] = ctypes.RTLD_GLOBAL` in llama_cpp.py fixed it.

Solved for me as well. Seems to be...
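
For reference, a simplified sketch of the kind of library-loading code this workaround touches (not copied verbatim from llama_cpp.py); the reported fix is simply not passing `winmode=ctypes.RTLD_GLOBAL` to `ctypes.CDLL` on Windows:

```python
# Simplified sketch of a Windows shared-library load similar to llama_cpp.py's,
# showing the line the workaround comments out. Paths here are placeholders.
import ctypes
import os
import sys

def load_llama_dll(lib_path: str):
    cdll_args = {}
    if sys.platform == "win32":
        # Make the CUDA runtime DLLs discoverable if CUDA_PATH is set
        cuda_path = os.environ.get("CUDA_PATH")
        if cuda_path:
            os.add_dll_directory(os.path.join(cuda_path, "bin"))
        # The reported workaround: comment out (or skip) this flag
        # cdll_args["winmode"] = ctypes.RTLD_GLOBAL
    return ctypes.CDLL(lib_path, **cdll_args)
```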