
Support for AWQ quantization in TGI

nigue3025 opened this issue 1 year ago · 1 comment

Hi. I tried the 13B version in TGI and it works fine with bitsandbytes quantization. When trying AWQ quantization in TGI, however, it fails with the error "Cannot load 'awq' weight, make sure the model is already quantized". I am wondering whether AWQ is too new for this model when deploying via TGI, or if there is any suggestion or comment? Thanks
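For context, TGI's `--quantize bitsandbytes` quantizes full-precision weights on the fly at load time, whereas `--quantize awq` expects a checkpoint whose weights were already quantized with AWQ, which is what the error message is pointing at. A minimal sketch of producing such a checkpoint with the AutoAWQ library follows; the model ID, output path, and quantization settings are illustrative assumptions, not confirmed in this thread.

```python
# Sketch: pre-quantize a model with AutoAWQ so that TGI's `--quantize awq` can load it.
# The model ID and quant_config values below are illustrative assumptions.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "yentinglin/Taiwan-LLM-v2.0-13B-chat"  # hypothetical source model ID
quant_path = "taiwan-llm-13b-awq"                   # local output directory
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load the full-precision model and its tokenizer.
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Run AWQ calibration/quantization and save the quantized checkpoint.
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```

TGI should then be able to serve the saved directory when launched with `--model-id` pointing at `quant_path` and `--quantize awq`.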

nigue3025 · May 14 '24

For quantized models, I have only tried AWQ with vLLM. You can find the -awq model on my Hugging Face page.
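For reference, loading an AWQ-quantized checkpoint in vLLM looks roughly like the sketch below. The model ID is a placeholder assumption; check the author's Hugging Face page for the actual -awq repository name.

```python
# Sketch: running an AWQ-quantized model with vLLM.
from vllm import LLM, SamplingParams

# Placeholder model ID -- substitute the actual -awq repo from the author's Hugging Face page.
llm = LLM(model="yentinglin/Taiwan-LLM-13B-awq", quantization="awq")

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["請簡單自我介紹。"], params)
print(outputs[0].outputs[0].text)
```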

adamlin120 · May 16 '24