Blake

Results: 12 comments by Blake

This also assumes you've downloaded the 'complete package' 7z for your platform (i.e. you have a Python distro in a subfolder called **runtime** inside your RVC folder). If you've manually cloned...
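As a sketch of what that layout implies for launching scripts (the folder and file names here are assumptions based on the package layout described above):

```python
from pathlib import Path
from typing import Optional

def packaged_python(rvc_root: str) -> Optional[Path]:
    """Return the bundled interpreter if the 'complete package' layout exists.

    The 7z package ships its own Python distro in <RVC>/runtime, so scripts
    are launched with runtime/python.exe instead of any system Python.
    Returns None for a manually cloned repo without a 'runtime' subfolder.
    """
    candidate = Path(rvc_root) / "runtime" / "python.exe"
    return candidate if candidate.is_file() else None
```

When this returns a path, a script would be started as `runtime\python.exe your_script.py` from the RVC folder; when it returns None, you're on a manual clone and using your own environment.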

> `pip install packaging` then `pip install flash-attn --no-build-isolation`.

No joy on this: it was missing `wheel`. Installing the `wheel` package and retrying still errored.
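Since `--no-build-isolation` builds flash-attn against the current environment rather than an isolated one, the build tooling has to be importable there first. A minimal pre-flight check, as a sketch (the module names come from the errors above, not from flash-attn itself):

```python
import importlib.util

def missing_build_prereqs(mods=("packaging", "wheel")):
    """Return which build-time prerequisites aren't importable.

    With `pip install flash-attn --no-build-isolation`, pip skips the
    isolated build environment, so anything the build needs (here
    'packaging' and 'wheel') must already be installed in the active
    environment before the install is attempted.
    """
    return [m for m in mods if importlib.util.find_spec(m) is None]
```

If this returns a non-empty list, `pip install` those names first, then retry the flash-attn install.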

I think I'm finding this too. I was actually having issues with LoRAs being weak on Flux, and this fixed that aspect of it, but I'm noticing generation is noticeably...

> The error you're encountering is due to the `split_torch_state_dict_into_shards` function not being available in huggingface-hub version 0.20.3. This function is included starting from version 0.23.0.
>
> To resolve...
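The version gate in that quote can be checked mechanically. This sketch compares dotted version strings without depending on the `packaging` library; the 0.23.0 threshold is taken from the quoted comment:

```python
def version_tuple(v: str):
    """Parse a simple dotted version like '0.20.3' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def has_shard_helper(installed: str, required: str = "0.23.0") -> bool:
    """True if this huggingface-hub version should expose
    split_torch_state_dict_into_shards (added in 0.23.0 per the comment)."""
    return version_tuple(installed) >= version_tuple(required)
```

So an environment on 0.20.3 fails this check and needs huggingface-hub upgraded before that function is available.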

Turning on Automatic FP16 did indeed make the weights stronger in the fp8 model. They are practically the same now.

![flux1-dev-fp8 safetensors c 16](https://github.com/user-attachments/assets/355b8871-3154-4c10-ba9a-b01fae5015a1)
![flux1-dev-Q8_0 gguf c 16](https://github.com/user-attachments/assets/ab828d82-6077-42c3-a194-6bbc36b1104c)

So is...

Another example of 230e3911 vs ba01ad37:

![230e3911](https://github.com/user-attachments/assets/d0afcab4-c90e-4311-b509-30b2f9b4a70d)
![ba01ad37](https://github.com/user-attachments/assets/80639827-5c4e-40b9-9b7c-78d2b591c174)

Changing to LoRA 16bit fixes the issue, but the output differs slightly from the 8bit version. Switching to _flux1-dev-fp8.safetensors_ LoRA 8bit and...

Just to add, this is fixed now.

> As of the latest commit, they are not working for me at all, I have the correct VAE, encoder etc it appears...

It's over in the black-forest-labs HF repo: [ae.safetensors](https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/ae.safetensors)

> I did in fact have the smaller file, thank you

No problem - glad it fixed it. I _think_ the smaller one worked up until a few days ago....

Ah, good shout on ollama - I did have it installed, but I'd changed the port it listens on; updating the config fixed that. I still can't run tui...