Enju
> I had the same issue. The quickstart seems to install the CPU-only version of PyTorch by default. You need the CUDA-enabled version of PyTorch. Use pip/conda to uninstall...
> Has anyone found a workaround to this?

Another attempt with the Hugging Face transformers library worked; it was maybe a bit complicated, and I also had to use a CPU version of a package.
> num_gpus=0)

Doesn't work either: `AssertionError: Torch not compiled with CUDA enabled`
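For context, a rough sketch of the CPU-only attempt being discussed; the `num_gpus` argument is the one from the quote above, and the model name is just a placeholder:

```
import galai as gal

# CPU-only attempt: pass num_gpus=0 so no GPU is requested.
# On a CPU-only PyTorch build this still raises the
# "AssertionError: Torch not compiled with CUDA enabled" reported above.
model = gal.load_model(name="mini", num_gpus=0)  # "mini" is a placeholder model name
```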
> @Naugustogi any chance you can provide the full stack trace?

It happened after I started the program normally with inference:

```
import galai as gal
model = gal.load_model(name =...
```
> Thanks @Naugustogi. The traceback shows `galai` 1.0.0. Can you try with 1.1.2?

I'm not sure where to get that; in this repo, it's just version 1.0.0 (3 weeks ago).
> @Naugustogi

Alright, 1.1.2 doesn't work either. It won't even show me any error; after starting, it just returns the main folder.
> What do you mean? If you are running it as a script, you need to wrap the last line in `print()`.

OK, it worked.
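A minimal sketch of what that looks like, assuming the quickstart-style galai API (`gal.load_model` / `model.generate`): when run as a script, the return value is not echoed the way it is in a REPL, so it has to be printed explicitly.

```
import galai as gal

model = gal.load_model(name="mini")  # model name is just an example

# In an interactive session the result would be shown automatically;
# as a script, wrap the call in print() to see the generated text.
print(model.generate("The Transformer architecture"))
```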
Removing `torch_dtype=torch.half` also helps.
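For anyone going the Hugging Face transformers route mentioned above, a rough sketch (the `facebook/galactica-125m` checkpoint is just an example): leaving out `torch_dtype=torch.half` keeps the weights in float32, which a CPU-only PyTorch build can run.

```
from transformers import AutoTokenizer, OPTForCausalLM

# Example checkpoint; the larger Galactica checkpoints follow the same pattern.
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-125m")
# Note: no torch_dtype=torch.half here, so the weights stay in float32 (CPU-friendly).
model = OPTForCausalLM.from_pretrained("facebook/galactica-125m")

input_ids = tokenizer("The Transformer architecture", return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```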
How do I upgrade any model to the new ggjt v2 format? I'm using gpt4-x-alpaca-13b-native-ggml-model-q4_0 (I'm now able to compile with CMake).