A bug in AdaLayerNorm
Describe the bug
In diffusers/blob/main/src/diffusers/models/normalization.py, `torch.chunk` is called without a `dim` argument, so it uses the default `dim=0` and splits the embedding along the batch dimension. As a result, the shapes of `scale` and `shift` do not match the shape of `x`.
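For context, here is a minimal sketch (standalone tensors, not the diffusers code itself) of why the default `dim=0` causes the mismatch — it splits along the batch axis instead of the feature axis:

```python
import torch

bs, dim = 32, 100
# Simulated output of the timestep-embedding linear layer: shape (bs, 2 * dim)
emb = torch.randn(bs, 2 * dim)

# Default dim=0 splits along the batch axis: each half is (bs // 2, 2 * dim)
scale_bad, shift_bad = emb.chunk(2)

# Splitting along dim=1 gives halves whose feature size matches x: (bs, dim)
scale, shift = emb.chunk(2, dim=1)

print(scale_bad.shape)  # torch.Size([16, 200]) -- cannot broadcast against (32, 100)
print(scale.shape)      # torch.Size([32, 100]) -- matches x
```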
Reproduction
Call `AdaLayerNorm` on a batched input; the shapes of `scale` and `x` do not match, and the forward pass raises a `RuntimeError`.
Logs
No response
System Info
This is still in the main branch.
Who can help?
No response
A reproducible code snippet, please.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Closing because of inactivity.
Here is a reproducible code snippet:
import torch
from diffusers.models.normalization import AdaLayerNorm

bs = 32
dim = 100
num_steps = 1000

x = torch.randn(bs, dim)                # (bs, dim) input
t = torch.randint(0, num_steps, (bs,))  # per-sample timesteps

norm = AdaLayerNorm(dim, num_steps)
output = norm(x, t)                     # raises RuntimeError: shape mismatch
Log
File "normalization.py", line 47, in forward
x = self.norm(x) * (1 + scale) + shift
RuntimeError: The size of tensor a (100) must match the size of tensor b (200) at non-singleton dimension 1
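One way the forward pass could be patched is to chunk along the feature dimension. The sketch below mirrors the module layout described in the issue (embedding → SiLU → linear producing `2 * dim` features → LayerNorm); it is an illustration of the fix, not the upstream diffusers implementation:

```python
import torch
import torch.nn as nn


class AdaLayerNormFixed(nn.Module):
    """Sketch of AdaLayerNorm with chunk performed along the feature dim."""

    def __init__(self, embedding_dim: int, num_embeddings: int):
        super().__init__()
        self.emb = nn.Embedding(num_embeddings, embedding_dim)
        self.silu = nn.SiLU()
        self.linear = nn.Linear(embedding_dim, embedding_dim * 2)
        self.norm = nn.LayerNorm(embedding_dim, elementwise_affine=False)

    def forward(self, x: torch.Tensor, timestep: torch.Tensor) -> torch.Tensor:
        emb = self.linear(self.silu(self.emb(timestep)))
        # Chunk along the last dim so scale/shift each have `embedding_dim`
        # features, matching x -- with the default dim=0 this line is where
        # the reported RuntimeError originates.
        scale, shift = torch.chunk(emb, 2, dim=-1)
        return self.norm(x) * (1 + scale) + shift


bs, dim, num_steps = 32, 100, 1000
x = torch.randn(bs, dim)
t = torch.randint(0, num_steps, (bs,))
out = AdaLayerNormFixed(dim, num_steps)(x, t)
print(out.shape)  # torch.Size([32, 100])
```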