Бондар Нікіта Андрійович
> Transformers/Accelerate does this in CPU mode too? Ouch.
>
> edit: Hey.. so a shot in the dark, did you try with https://github.com/zphang/transformers.git@68d640f7c368bcaaaecfc678f11908ebbd3d6176
>
> That transformer would use...
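For anyone else wanting to try that suggestion, installing that exact commit of the zphang/transformers fork would look roughly like this (a sketch using pip's standard git+URL@commit syntax; the commit hash is the one linked above):

```powershell
# Install the zphang/transformers fork pinned to the commit referenced in the comment above
pip install git+https://github.com/zphang/transformers.git@68d640f7c368bcaaaecfc678f11908ebbd3d6176
```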
```powershell
> python server.py --share --model oasst-sft-1-pythia-12b --cpu --load-in-8bit
Loading oasst-sft-1-pythia-12b...
Loading checkpoint shards:  33%|████████████████████████████████████████████████▎ | 1/3 [07:52
```
> Not by fresh github account with shady name.

Lol, why are you so rude? Yes, I'm new here, but that's no reason to blame me... I simply don't know...
Thank you

> Hello, you can find this 13B one here: https://huggingface.co/samwit/alpaca13B-lora
>
> Otherwise, there is the 7B one here: https://huggingface.co/tloen/alpaca-lora-7b
>
> Please note these are LoRA models...
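Since those are LoRA adapters rather than standalone checkpoints, they have to be applied on top of the matching LLaMA base model. A minimal sketch of doing that with the webui, assuming a build that has LoRA support (the --lora flag) and that "llama-7b-hf" and "alpaca-lora-7b" are the folder names you used under models/ and loras/ (those names are assumptions, not from the thread):

```powershell
# Hypothetical example: apply the tloen/alpaca-lora-7b adapter on top of a LLaMA-7B base model
python server.py --model llama-7b-hf --lora alpaca-lora-7b
```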