diffusers
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch.
**Is your feature request related to a problem? Please describe.** I want to use and test the T2I adapter with my safetensors model. I used to use diffusers models,...
**Is your feature request related to a problem? Please describe.** The [original controlnet training tutorial](https://github.com/lllyasviel/ControlNet/blob/main/docs/train.md#other-options) mentions the options

* `only_mid_control=True`: _This can be helpful when your computation power is limited...
# What does this PR do? Updates all references to `torch.FloatTensor` to `torch.Tensor`. `FloatTensor` is essentially deprecated: every tensor type is just a `torch.Tensor` now. FloatTensor...
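A minimal illustration (assuming PyTorch is installed) of why the annotation change is safe: the concrete class of any tensor is `torch.Tensor`, so type hints should name it directly rather than the legacy typed class:

```python
import torch

def scale(x: torch.Tensor, factor: float) -> torch.Tensor:
    # annotate with torch.Tensor; torch.FloatTensor is a legacy typed class
    # kept only for backward compatibility
    return x * factor

x = torch.ones(2)               # default dtype is float32
print(type(x) is torch.Tensor)  # True: the concrete class is torch.Tensor
print(scale(x, 2.0))
```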
https://github.com/huggingface/diffusers/blob/d8d208acdea43a274bd74ed27119a4f95e8e0946/src/diffusers/models/attention_processor.py#L517 https://github.com/huggingface/diffusers/blob/d8d208acdea43a274bd74ed27119a4f95e8e0946/src/diffusers/models/attention_processor.py#L1202 hi! Is the `__call__` method of the `AttnProcessor2_0` class missing the parameter `ip_adapter_masks`? When I use it, it prints the warning **`cross_attention_kwargs ['ip_adapter_masks'] are not expected...
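A rough, self-contained sketch of the filtering behavior behind that warning: kwargs not accepted by the processor's `__call__` signature are dropped with a warning. `AttnProcessorSketch` and `filter_kwargs` are hypothetical stand-ins, not the diffusers implementation:

```python
import inspect
import warnings

class AttnProcessorSketch:
    # hypothetical stand-in for AttnProcessor2_0; note that __call__
    # does not accept ip_adapter_masks
    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, temb=None):
        return hidden_states

def filter_kwargs(processor, cross_attention_kwargs):
    """Drop kwargs the processor's __call__ does not accept, warning as in the issue."""
    accepted = set(inspect.signature(processor.__call__).parameters)
    unexpected = [k for k in cross_attention_kwargs if k not in accepted]
    if unexpected:
        warnings.warn(
            f"cross_attention_kwargs {unexpected} are not expected by "
            f"{type(processor).__name__} and will be ignored."
        )
    return {k: v for k, v in cross_attention_kwargs.items() if k in accepted}

kwargs = filter_kwargs(AttnProcessorSketch(), {"ip_adapter_masks": [None], "temb": None})
print(sorted(kwargs))  # ['temb']
```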
# What does this PR do? Sets the maximum number of parallel T4 jobs used in the slow pipeline tests so that we don't exhaust the pool. Fixes # (issue) ## Before submitting -...
**Is your feature request related to a problem? Please describe.** When I pass the `out_dim` argument to `__init__` in the [Attention block](https://github.com/huggingface/diffusers/blob/b69fd990ad8026f21893499ab396d969b62bb8cc/src/diffusers/models/attention_processor.py#L114), it raises a shape error, because the `query_dim...
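The mismatch can be reproduced in miniature. The names below are hypothetical, but the failure has the same form: a residual branch carrying `query_dim` features cannot be added to an output projected to a different `out_dim`:

```python
import torch
from torch import nn

# minimal sketch (names hypothetical): the residual branch carries
# query_dim features, so projecting the attention output to a different
# out_dim breaks the residual addition
query_dim, out_dim = 64, 32
to_out = nn.Linear(query_dim, out_dim)

hidden_states = torch.randn(1, 10, query_dim)
attn_output = to_out(hidden_states)              # shape (1, 10, 32)

try:
    hidden_states = attn_output + hidden_states  # (1,10,32) + (1,10,64)
except RuntimeError as err:
    print(f"shape error: {err}")
```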
We are training text_to_image on Google Cloud Platform; the JupyterLab instance has 2 GPUs (NVIDIA Tesla P100) with 32 GB of memory in total (16 GB each). I tried using accelerate for...
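For reference, a minimal multi-GPU launch sketch, assuming the standard `examples/text_to_image/train_text_to_image.py` script and a 2-GPU machine; the memory-saving flags shown are the usual first resort on 16 GB cards, and the dataset/model arguments the script requires are omitted here:

```shell
# sketch: spread training across both GPUs with Accelerate,
# keeping per-GPU memory low via fp16, a small batch size,
# gradient accumulation, and gradient checkpointing
accelerate launch --multi_gpu --num_processes=2 \
  train_text_to_image.py \
  --mixed_precision=fp16 \
  --train_batch_size=1 \
  --gradient_accumulation_steps=4 \
  --gradient_checkpointing
```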
Currently, `StableCascadeCombinedPipeline` is missing `_optional_components`, so the combined pipeline cannot build the correct list of expected modules; i.e., `_get_signature_keys` (https://github.com/huggingface/diffusers/blob/c1c42698c955959d7ef34af129428f64c6e363bf/src/diffusers/pipelines/pipeline_utils.py#L1555) will not include the optional components in...
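A toy sketch of the introspection involved; `StableCascadeCombinedSketch` and `get_signature_keys` are simplified stand-ins for the real pipeline and `_get_signature_keys`, which is why a missing `_optional_components` attribute makes defaulted modules disappear from the expected set:

```python
import inspect

class StableCascadeCombinedSketch:
    """Hypothetical stand-in; module names are illustrative only."""
    # comment this attribute out and feature_extractor drops out of `expected`
    _optional_components = ["feature_extractor"]

    def __init__(self, prior, decoder, feature_extractor=None):
        pass

def get_signature_keys(cls):
    # rough sketch: split __init__ parameters into required and defaulted,
    # then fold _optional_components back into the expected-module set
    params = inspect.signature(cls.__init__).parameters
    optional = {n for n, p in params.items() if p.default is not inspect.Parameter.empty}
    required = {n for n in params if n != "self"} - optional
    expected = required | set(getattr(cls, "_optional_components", []))
    return expected, optional

expected, optional = get_signature_keys(StableCascadeCombinedSketch)
print(sorted(expected))  # ['decoder', 'feature_extractor', 'prior']
```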
### Describe the bug I accidentally introduced a bug in this [PR](https://github.com/huggingface/diffusers/pull/5181) by adding a condition on [this line](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_flax.py#L238), which is needed when `use_memory_efficient_attention=True`. Filing this bug to remind myself...
# What does this PR do? Check here: https://github.com/huggingface/diffusers/pull/7703#discussion_r1584501268.