Removing LoRAAttnProcessor causes many dependencies to fail
Describe the bug
https://github.com/huggingface/diffusers/pull/8623 removed the obsolete LoRAAttnProcessor, which is in principle a good thing, but it was done without considering where that feature is currently in use, so it breaks many (and I mean many) community pipelines.
It also breaks some core libraries, such as Hugging Face's own https://github.com/huggingface/optimum library, which is used to export models to ONNX and to enable the Olive backend.
My suggestion is to add a dummy LoRAAttnProcessor class so that imports of it become a no-op for packages that still reference it.
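A minimal sketch of what such a dummy class could look like (this is an assumption about the shape of the shim, not the actual diffusers implementation; the constructor and call signatures here are illustrative only):

```python
class LoRAAttnProcessor:
    """No-op placeholder so that
    `from diffusers.models.attention_processor import LoRAAttnProcessor`
    does not raise ImportError in downstream packages.
    """

    def __init__(self, *args, **kwargs):
        # Accept and ignore any legacy constructor arguments
        # (e.g. hidden_size, rank) that older callers may pass.
        pass

    def __call__(self, attn, hidden_states, *args, **kwargs):
        # Pass hidden states through unchanged; LoRA weights are
        # handled elsewhere in current diffusers versions.
        return hidden_states
```

Packages that merely import the name would then keep working, while any code that actually invokes it gets identity behavior instead of a hard crash.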
Reproduction
N/A
Logs
> Failed to import optimum.onnxruntime.modeling_diffusion because of the following error (look up to see its traceback):
> Failed to import optimum.exporters.onnx.__main__ because of the following error (look up to see its traceback):
> cannot import name 'LoRAAttnProcessor' from 'diffusers.models.attention_processor' (/home/vlado/dev/sdnext/venv/lib/python3.12/site-packages/diffusers/models/attention_processor.py)
System Info
diffusers==0.30.0.dev0
Who can help?
@yiyixuxu @sayakpaul @DN6
Will look into it.