divine-taco

2 comments of divine-taco

c4ai-command-r-plus works if you bump ExLlamaV2 up to 0.0.18, which may also fix support for c4ai-command-r-v01. Notably, I've not been able to get either model working inside text-generation-webui with regular Transformers. Whilst...
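If you're on an older build, a minimal upgrade sketch (assuming ExLlamaV2 was installed from PyPI, where the package name is `exllamav2`; adjust if you built from source):

```shell
# Upgrade ExLlamaV2 to at least 0.0.18, the version reported
# above to load c4ai-command-r-plus
pip install --upgrade "exllamav2>=0.0.18"

# Confirm the installed version
pip show exllamav2
```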

I ran into an issue with DeepSeek-R1-UD-Q2_K_XL from unsloth/DeepSeek-R1-GGUF:

```
llama_model_load: error loading model: missing tensor 'blk.0.attn_k_b.weight'
llama_model_load_from_file_impl: failed to load model
```