Woosung Joung
I appreciate that you made my snippet part of the new PR. Thx @xjamundx !
@luchaoqi I believe this is true. The following condition accounts for this behavior: `if hasattr(self, "store_attn_map") and encoder_hidden_states is not None:` Since `FluxSingleTransformerBlock` does not pass `encoder_hidden_states`, the attention map is not stored for those blocks.
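To make the gating concrete, here is a minimal, hypothetical sketch (not the repo's actual processor, names like `ToyAttnProcessor`/`attn_map` are illustrative) showing how a condition like the one above would skip storage whenever `encoder_hidden_states` is `None`:

```python
import torch


class ToyAttnProcessor:
    """Illustrative only; stores a map only when text states are provided."""

    def __call__(self, attn_module, hidden_states, encoder_hidden_states=None):
        # Cross-attention maps can only be built when encoder_hidden_states exists,
        # so single-stream blocks (which pass None) are skipped by this check.
        if hasattr(attn_module, "store_attn_map") and encoder_hidden_states is not None:
            # Hypothetical storage: a real processor would keep softmax(QK^T) instead.
            attn_module.attn_map = torch.einsum(
                "bid,bjd->bij", hidden_states, encoder_hidden_states
            )
        return hidden_states


class DummyAttn:
    store_attn_map = True


attn = DummyAttn()
proc = ToyAttnProcessor()
x = torch.randn(1, 4, 8)

proc(attn, x, encoder_hidden_states=None)                    # single-block path: nothing stored
print(hasattr(attn, "attn_map"))                             # False

proc(attn, x, encoder_hidden_states=torch.randn(1, 3, 8))    # dual-block path: map stored
print(attn.attn_map.shape)                                   # torch.Size([1, 4, 3])
```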
@Dumeowmeow I believe so. I left a comment on #15 about visualizing the missing part. Hope it helps!
@PhysicalMouse Yup, same as mine.