[enhancement]: Add support for HiDream
Is there an existing issue for this?
- [x] I have searched the existing issues
Contact Details
No response
What should this feature add?
HiDream is a new, trending image-generation foundation model with 17B parameters that achieves state-of-the-art image quality within seconds. I think we should support it.
Alternatives
No response
Additional Context
No response
Have heard more and more about HiDream recently. It's a hefty model though, so it might not be fully accessible for folks until we land on quantized versions.
Are you interested in contributing support for it?
There are already quantized versions that fit 16GB cards (and 24GB cards):
- https://huggingface.co/azaneko/HiDream-I1-Full-nf4 (Full version, recommends 50 steps)
- https://huggingface.co/azaneko/HiDream-I1-Dev-nf4 (28 steps)
- https://huggingface.co/azaneko/HiDream-I1-Fast-nf4 (16 steps)
The Full, Dev and Fast versions are the same size, but the distilled versions (Dev and Fast) need fewer steps.
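For reference, here is a rough sketch of what loading one of these checkpoints could look like, assuming HiDream ends up exposed through diffusers' generic `DiffusionPipeline` interface. The repo id, whether the nf4 repos above actually load this way, and the offload call are illustrative assumptions, not a description of how InvokeAI would implement support:

```python
# Rough sketch only. Assumes HiDream is loadable through diffusers'
# generic DiffusionPipeline interface; the nf4 repos above may in fact
# require their own loader, so treat the repo id and API as illustrative.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "azaneko/HiDream-I1-Dev-nf4",  # Dev variant; swap in Full/Fast as needed
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # offloading helps a 17B model fit on 16-24GB cards

# Step counts per variant: Full ~50, Dev ~28, Fast ~16
image = pipe(
    prompt="a lighthouse at dusk, volumetric fog",
    num_inference_steps=28,
).images[0]
image.save("hidream_dev_nf4.png")
```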
Got here via a Google search while checking whether InvokeAI supports HiDream. Would be amazing to see it added.
By the way, there is an FP8 quant that fits on 24GB and 32GB cards and provides higher quality than the nf4 version (mentioning it here so that, if support for HiDream is added, both nf4 and FP8 quants will hopefully be recognized correctly):
https://huggingface.co/shuttleai/HiDream-I1-Full-FP8
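To illustrate the "recognized correctly" point, below is a rough sketch of how a model probe might tell an nf4 (bitsandbytes) checkpoint from an FP8 one by inspecting a safetensors file. The key suffix and dtype checks are assumptions based on how bitsandbytes and FP8 checkpoints are typically serialized, not InvokeAI's actual probing code:

```python
# Rough illustration, not InvokeAI's probe logic. Assumes bitsandbytes-style
# nf4 checkpoints carry "quant_state.bitsandbytes__nf4" keys and FP8
# checkpoints store float8 tensors; both are assumptions about the files above.
import torch
from safetensors import safe_open

def guess_quant_format(path: str) -> str:
    with safe_open(path, framework="pt", device="cpu") as f:
        keys = list(f.keys())
        if any(k.endswith("quant_state.bitsandbytes__nf4") for k in keys):
            return "nf4"
        for k in keys[:8]:  # sample a few tensors; enough for a format guess
            if f.get_tensor(k).dtype in (torch.float8_e4m3fn, torch.float8_e5m2):
                return "fp8"
    return "unquantized/unknown"

print(guess_quant_format("hidream_i1_full_fp8.safetensors"))  # hypothetical local file
```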
I would like to second this.
any update?
No contributor interested in implementing a node and model support for this has come forward yet.
I have been getting really good results out of HiDream in Comfy and would love to have it in Invoke, where it would be easier to work with.
HiDream is amazing. I think adding it would be great.
Currently this seems more like gatekeeping. Why not add it with an experimental label and let the community iron it out? On the other hand, how does InvokeAI's support model actually work? People invest their own time free of charge as an open-source community and Invoke makes money from it, which is fine, but when it comes to adding support for a new model, do we rely solely on the community? That sounds a bit concerning and not sustainable. Imho, this is a fast track to losing the support of that same community. I hope I have it totally wrong.