Misc. bug: webui: extremely sluggish typing performance in the textarea with long-context conversations
Name and Version
version: 4690 (4078c77f) built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-server
Command line
llama-server -m <any model>
Problem description & steps to reproduce
Typing into the user textarea becomes extremely sluggish once the conversation above it contains a lot of context.
- Write a very long message (16K+ tokens) and send.
- Try typing into the textarea: performance is sluggish. Are we re-rendering a component like ChatScreen on each keystroke and doing expensive computations over the previous messages? I can't easily spot it myself if we are; a sketch of the pattern I have in mind follows below this list.
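In case it helps narrow things down, here is a minimal sketch of what I mean. This is not the actual webui code; component names like ChatScreen, MessageList, ChatInput, and the renderMarkdown placeholder are illustrative assumptions. The idea is to keep the draft text as local state in the input component and memoize the rendered messages, so a keystroke only re-renders the textarea instead of re-processing every previous message.

```tsx
import React, { memo, useMemo, useState } from 'react';

interface Message {
  id: number;
  role: 'user' | 'assistant';
  content: string;
}

// Placeholder for whatever heavy per-message processing the webui does
// (markdown rendering, syntax highlighting, etc.).
function renderMarkdown(content: string): React.ReactNode {
  return content;
}

// Expensive per-message work is memoized so it only runs when that
// message's content changes, not on every parent render.
const ChatMessage = memo(function ChatMessage({ msg }: { msg: Message }) {
  const rendered = useMemo(() => renderMarkdown(msg.content), [msg.content]);
  return <div className={`msg msg-${msg.role}`}>{rendered}</div>;
});

// The message list only re-renders when the `messages` array identity changes.
const MessageList = memo(function MessageList({ messages }: { messages: Message[] }) {
  return (
    <div className="messages">
      {messages.map((m) => (
        <ChatMessage key={m.id} msg={m} />
      ))}
    </div>
  );
});

// The textarea owns its draft state, so typing re-renders only this component.
function ChatInput({ onSend }: { onSend: (text: string) => void }) {
  const [draft, setDraft] = useState('');
  return (
    <textarea
      value={draft}
      onChange={(e) => setDraft(e.target.value)}
      onKeyDown={(e) => {
        if (e.key === 'Enter' && !e.shiftKey) {
          e.preventDefault();
          onSend(draft);
          setDraft('');
        }
      }}
    />
  );
}

function ChatScreen({ messages, onSend }: { messages: Message[]; onSend: (text: string) => void }) {
  return (
    <>
      <MessageList messages={messages} />
      <ChatInput onSend={onSend} />
    </>
  );
}
```

If instead the draft text lives in ChatScreen state (or in a store the whole screen subscribes to), every keystroke re-renders the full message list, which would match the slowdown growing with conversation length.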
First Bad Commit
Likely the commit that introduced the new React UI, but I'm not entirely sure.
Relevant log output
@ngxson I haven't had time to investigate the root cause, but this is still a pretty big problem; after a few turns of code writing, the interface becomes noticeably slow.
Hello
On Thu, Mar 20, 2025, 10:59 a.m., Xuan-Son Nguyen < @.***> wrote:
> Closed #11813 (https://github.com/ggml-org/llama.cpp/issues/11813) as completed via #12299 (https://github.com/ggml-org/llama.cpp/pull/12299).