Blake
This also assumes you've downloaded the 'complete package' 7z for your platform (i.e. you have a Python distro in your RVC folder, in a subfolder called **runtime**). If you've manually cloned...
> `pip install packaging` then `pip install flash-attn --no-build-isolation`.

No joy on this: it was missing `wheel`. Installing the `wheel` package and retrying just errored.
I think I'm hitting this too. I was actually having issues with LoRAs being weak on Flux, and this fixed that aspect of it, but I'm noticing generation is noticeably...
> The error you're encountering is due to the `split_torch_state_dict_into_shards` function not being available in huggingface-hub version 0.20.3. This function is included starting from version 0.23.0.
>
> To resolve...
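A quick way to check whether an installed huggingface-hub is new enough is to compare its version against 0.23.0. This is a minimal sketch; the helper name is mine, not part of huggingface-hub:

```python
def supports_shard_split(version: str) -> bool:
    """Return True if this huggingface-hub version ships
    split_torch_state_dict_into_shards (added in 0.23.0)."""
    # Compare only the numeric major.minor components of the version string.
    major, minor = (int(p) for p in version.split(".")[:2])
    return (major, minor) >= (0, 23)

print(supports_shard_split("0.20.3"))  # False: too old, upgrade needed
print(supports_shard_split("0.23.0"))  # True
```

If it comes back too old, upgrading with `pip install -U "huggingface-hub>=0.23.0"` should be enough.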
Turning on Automatic FP16 did indeed make the weights stronger in the fp8 model. They are practically the same now. So is...
Another example of 230e3911 vs ba01ad37. Changing to LoRA 16bit fixes the issue, but the output differs slightly from the 8bit version. Switching to _flux1-dev-fp8.safetensors_ with LoRA 8bit and...
Just to add, this is fixed now.

> As of the latest commit, they are not working for me at all. I have the correct VAE, encoder etc. it appears...
It's over in the black-forest-labs HF repo at [https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main ae.safetensors](https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/ae.safetensors)
> I did in fact have the smaller file, thank you

No problem - glad it fixed it. I _think_ the smaller one worked up until a few days ago...
Ah, good shout on ollama - I did have it installed, but I'd changed the port it listens on; updating the config fixed that. I still can't run tui...
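Since the fix here was a port mismatch, a small reachability check can confirm where ollama is actually listening. A generic sketch: 11434 is ollama's default port, and the function name is mine:

```python
import socket

def ollama_reachable(host: str = "127.0.0.1", port: int = 11434,
                     timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable.
        return False
```

If this returns False on 11434 while the server is running, the daemon is probably bound elsewhere (e.g. via `OLLAMA_HOST` or a changed config), and clients need to point at the same port.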