koesn
> Change the function in modules/chat.py
>
> ```python
> def replace_character_names(text, name1, name2):
>     text = text.replace('{{user}}', name1).replace('{{char}}', name2)
>     return text.replace('', name1).replace('', name2)
> ```
>
> to...
Why is this still unsupported? I'm running LM Studio with a tensor split of 0,35 to dedicate a GPU, so I can fully offload Mistral with 32k context onto a 3060. I hope there's...
@dhiltgen Thank you, CUDA_VISIBLE_DEVICES works. Finally.
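For anyone else pinning a process to one card this way, a minimal sketch of setting CUDA_VISIBLE_DEVICES from Python (the device index "0" is just an example; it must be set before any CUDA library initializes):

```python
import os

# Restrict this process to GPU 0. This must happen before torch,
# llama-cpp, or any other CUDA-backed library loads, or it has no effect.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# Downstream code now sees only the selected card, re-indexed as device 0.
print(os.environ["CUDA_VISIBLE_DEVICES"])
```

Setting it in the shell before launching (`CUDA_VISIBLE_DEVICES=0 ./start.sh`) works the same way.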
Seems MindMac has already been abandoned by its dev.
I was struggling with this, but found a fix. It happens because the system prompt and the last message are not added to chat_dialogue in the completions.py script. I added a bit of...
> I worked around it by specifying the ChatML template for the model in the Python myself. It now runs well, but output lengths are still a bit inconsistent, sometimes...
That's because the last message is not added to chat_dialogue in the completions.py script. I added a bit of code to append it; see the [completions.py patch](https://github.com/Koesn/text-generation-webui/blob/main/extensions/openai/completions.py). Check if it helps.
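The shape of the fix, as a rough sketch (the helper name and the exact structure of chat_dialogue here are assumptions for illustration, not the webui's actual code):

```python
def build_chat_dialogue(history, system_prompt, last_user_message):
    """Hypothetical helper showing the bug and the fix.

    Without the system-prompt and trailing appends below, the model
    never sees the system prompt or the final user turn.
    """
    chat_dialogue = []
    if system_prompt:
        chat_dialogue.append({"role": "system", "content": system_prompt})
    for user_msg, assistant_msg in history:
        chat_dialogue.append({"role": "user", "content": user_msg})
        chat_dialogue.append({"role": "assistant", "content": assistant_msg})
    # The fix: append the last user message, which was being dropped.
    chat_dialogue.append({"role": "user", "content": last_user_message})
    return chat_dialogue
```

With this, the final list ends with the current user turn, which is what the completion endpoint actually needs to answer.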