
wip RoPE

Open · alex-s168 opened this issue · 0 comments

Almost-working RoPE support.

There is one problem with this:

src0: (64,16,2,1)
src1: (2,1,1,1)
out: (64,16,2,1)
Shape([1, 1, 2, 64])
pre rope ttnn
                 Always | FATAL    | Mismatched tensor shapes for node 'Qcur-0' (ROPE): GGML wants [64, 16, 2, 1], TTNN generates Shape([1, 2, 32, 64])
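
For what it's worth, my reading of the mismatch (an assumption, not something I have verified in the backend code): GGML stores shapes as ne[0..3] with ne[0] the innermost, fastest-varying dimension, while TTNN prints shapes outermost-first, so GGML's [64, 16, 2, 1] should correspond to a row-major shape of [1, 2, 16, 64]. TTNN's Shape([1, 2, 32, 64]) differs only in 16 vs 32, which looks like padding to TTNN's 32x32 tile size rather than a wrong axis permutation. Below is a minimal sketch of that ne[]-to-row-major mapping; the function name and hard-coded values are illustrative, not identifiers from the repo.

// Minimal sketch, not code from the repo: GGML's ne[0] is the innermost
// (fastest-varying) dimension, so the row-major shape is ne[] reversed.
#include <array>
#include <cstdint>
#include <cstdio>

static std::array<std::int64_t, 4> ggml_ne_to_row_major(const std::array<std::int64_t, 4> & ne) {
    // reverse ne[] so the slowest-varying dimension comes first
    return { ne[3], ne[2], ne[1], ne[0] };
}

int main() {
    // src0 / out from the log above: GGML ne[] = (64, 16, 2, 1)
    const std::array<std::int64_t, 4> ne = { 64, 16, 2, 1 };
    const auto rm = ggml_ne_to_row_major(ne);

    // prints [1, 2, 16, 64]; TTNN reports Shape([1, 2, 32, 64]), i.e. only
    // the second-to-last dimension differs (16 -> 32), consistent with
    // padding to 32x32 tiles rather than a wrong permutation.
    std::printf("[%lld, %lld, %lld, %lld]\n",
                (long long) rm[0], (long long) rm[1],
                (long long) rm[2], (long long) rm[3]);
    return 0;
}

If that is indeed what is happening, the shape check probably needs to compare against the unpadded logical shape (or the padded output needs to be sliced back down) instead of the tile-padded shape.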

I tried to fix it myself, but my attempts only made it worse. Could someone else please take a look and fix it?

alex-s168 · Mar 12 '25, 21:03