Taiwan-LLM
Support for AWQ quantization in TGI
Hi, I tried the 13B version in TGI and it works fine with bitsandbytes quantization. But when trying AWQ quantization in TGI, it fails with the error "Cannot load `awq` weight, make sure the model is already quantized". I am wondering if AWQ is too new for this model when deploying with TGI, or is there any suggestion or comment? Thanks.
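For context, my understanding is that `--quantize awq` in TGI only loads weights that were already quantized offline; it does not quantize on the fly, which is what the error message suggests. Below is a minimal sketch of pre-quantizing with the AutoAWQ library, assuming `pip install autoawq`; the paths are placeholders, not the actual Taiwan-LLM checkpoint names.

```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "path/to/Taiwan-LLM-13B"      # placeholder: fp16 base checkpoint
quant_path = "path/to/Taiwan-LLM-13B-awq"  # placeholder: output directory

# Standard 4-bit AWQ settings
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load the fp16 model and its tokenizer
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Calibrate and quantize, then save a checkpoint that TGI should be able
# to load with `--quantize awq`
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```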
For quantized models, I have only tried AWQ with vLLM. You can find the -awq model on my Hugging Face.
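For reference, here is a minimal sketch of loading an AWQ checkpoint via vLLM's Python API; the model id below is a placeholder, so substitute the actual -awq repo from the Hugging Face profile.

```python
from vllm import LLM, SamplingParams

# Placeholder model id; replace with the real -awq checkpoint
llm = LLM(model="yentinglin/Taiwan-LLM-13B-awq", quantization="awq")

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["你好，請介紹台灣的夜市文化。"], params)
print(outputs[0].outputs[0].text)
```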