Engui
> Removing the `--wbits 4 --groupsize 128` flags worked for me with the OPT model. I was getting the same error.

I got another error:

```
Loading anon8231489123_gpt4-x-alpaca-13b-native-4bit-128g...
Loading checkpoint shards:   0%|...
```
If you don't have the HuBERT model checkpoint, this problem occurs. You have to download the model checkpoint.
Join the Discord server; the model checkpoint is in the pre-trained-model channel.
> Turns out this line https://github.com/oobabooga/text-generation-webui/blob/main/extensions/api/script.py#L77 must be incorrect, because if I change it to
>
> ```
> response = json.dumps({
>     'results': [{
>         'text': answer
>     }]...
> ```
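For context, here is a minimal, self-contained sketch of the response shape that quoted change produces. It only shows the JSON construction; the extension's request-handler code around it is omitted, and the `answer` variable is a hypothetical stand-in for the generated text.

```python
import json

# Hypothetical stand-in for the model's generated text.
answer = "Example generated text"

# Build the response body in the shape the quoted change uses:
# a top-level "results" list whose first entry carries the text.
response = json.dumps({
    'results': [{
        'text': answer
    }]
})

print(response)  # {"results": [{"text": "Example generated text"}]}
```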