llama.cpp
wip RoPE
Almost-working RoPE support.
There is one problem with this:
src0: (64,16,2,1)
src1: (2,1,1,1)
out: (64,16,2,1)
pre rope ttnn: Shape([1, 1, 2, 64])
Always | FATAL | Mismatched tensor shapes for node 'Qcur-0' (ROPE): GGML wants [64, 16, 2, 1], TTNN generates Shape([1, 2, 32, 64])
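For reference, here is a minimal sketch of the shape mapping I believe is in play, assuming GGML's ne ordering (ne[0] is the innermost, fastest-varying dim) maps to TTNN's outermost-first shape by simple reversal. The helper name `ggml_ne_to_ttnn_shape` is hypothetical, not an actual function in the backend:

```cpp
// Hypothetical helper: convert GGML's ne[] dims to a TTNN-style 4-D shape.
// GGML writes dims innermost-first; TTNN shapes are written outermost-first.
#include <array>
#include <cstdint>
#include <cstdio>

static std::array<int64_t, 4> ggml_ne_to_ttnn_shape(const std::array<int64_t, 4> & ne) {
    // GGML (ne0, ne1, ne2, ne3) -> TTNN [ne3, ne2, ne1, ne0]
    return { ne[3], ne[2], ne[1], ne[0] };
}

int main() {
    // src0 of the failing ROPE node: (64, 16, 2, 1)
    const auto expected = ggml_ne_to_ttnn_shape({ 64, 16, 2, 1 });
    // Prints [1, 2, 16, 64]: the shape GGML wants back from ROPE,
    // versus the Shape([1, 2, 32, 64]) the backend actually produces.
    printf("[%lld, %lld, %lld, %lld]\n",
           (long long) expected[0], (long long) expected[1],
           (long long) expected[2], (long long) expected[3]);
    return 0;
}
```

With src0 = (64, 16, 2, 1) the reversal gives [1, 2, 16, 64], so the generated Shape([1, 2, 32, 64]) differs only in the third dim (32 vs 16). Since TTNN's TILE_LAYOUT pads the last two dims up to multiples of 32, this may be the tile-padded shape leaking out as the logical shape, which could be a lead for the fix.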
I tried to fix it, but my attempts only made things worse. Please take a look and fix it.