James Liu

Results: 1 issue by James Liu

Is there an easy way to dequantize the model weights back to fp16 format? Specifically for Exllama, I'm looking for something similar to what [bitsandbytes does](https://github.com/huggingface/blog/blob/main/hf-bitsandbytes-integration.md), reproduced below: > The...
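
For reference, here is a minimal sketch of the absmax-style quantize/dequantize round trip described in the linked bitsandbytes blog post. This is only an illustration of that scheme, not Exllama's internal weight format; the function names and per-row scaling here are assumptions for the example.

```python
import torch

def absmax_quantize(weight_fp16: torch.Tensor):
    # Per-row absmax scale, as in vector-wise int8 quantization.
    w = weight_fp16.float()
    scale = w.abs().amax(dim=1, keepdim=True).clamp(min=1e-8)
    q = torch.round(w / scale * 127).to(torch.int8)
    return q, scale.half()

def absmax_dequantize(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    # Reverse the scaling to recover an fp16 approximation of the weights.
    return (q.float() * scale.float() / 127).half()

if __name__ == "__main__":
    w = torch.randn(4, 8, dtype=torch.float16)
    q, scale = absmax_quantize(w)
    w_hat = absmax_dequantize(q, scale)
    print((w - w_hat).abs().max())  # small residual quantization error
```

Dequantizing an Exllama (GPTQ-style) checkpoint would additionally need its group-wise scales and zero points, so the actual recovery step depends on that format rather than the simple per-row scale shown here.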