gwjr
You can work around this by patching the `_build_causal_attention_mask()` function in `CLIPTextTransformer` as follows:

```python
def _build_causal_attention_mask(self, bsz, seq_len, dtype):
    # lazily create causal attention mask, with full attention between...
```
This has now been patched in OCLP - see [patchset](https://github.com/dortania/OpenCore-Legacy-Patcher/pull/777/commits) and accompanying [blogpost](https://khronokernel.github.io/macos/2021/12/08/5K-UEFI.html). It's not immediately obvious, though, how to bridge that across to Kryptonite.