IndexError on Windows for llm chat
This is Python 3.12.0 on Windows 11 in a venv with "pip install llm llm-gpt4all".
llm -m Meta-Llama-3-8B-Instruct "What's a double dual?" works, but "llm chat -m Meta-Llama-3-8B-Instruct" results in:
(llm) C:\th\llm>llm chat -m Meta-Llama-3-8B-Instruct
Traceback (most recent call last):
File "
I am also seeing this error on Windows 11 (from Command Prompt or PowerShell) when using llm chat -m modelname for both a local Meta-Llama-3-8B-Instruct model and the groq-openai-llama3 API.
Both models work fine when passing a prompt string as a command line argument.
@stungeye @th
This appears to be a limitation in the pyreadline3 package, which llm only uses on Windows, which is why the problem does not show up on Linux installs. The pyreadline3 implementation is behind the official GNU Readline package.
To work around this for now, I added the missing key indexes to a fork of pyreadline3.
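For illustration, the crash pattern is likely of this general shape (a hypothetical sketch, not the actual pyreadline3 code; every name below is invented): a key-dispatch table covers only part of the key-code range, so an unmapped key raises IndexError, and adding the missing indexes removes the crash.

```python
# Hypothetical sketch of the failure mode (NOT actual pyreadline3 code):
# a dispatch table indexed by key code covers only part of the key range,
# so an unmapped key (e.g. an arrow or function key) raises IndexError.
handlers = ["self_insert"] * 128          # covers plain ASCII codes only

def dispatch(key_code: int) -> str:
    return handlers[key_code]             # IndexError for key_code >= 128

def dispatch_patched(key_code: int) -> str:
    # Adding the missing indexes (as the fork does) or falling back to a
    # no-op handler both avoid the crash for out-of-range key codes.
    if key_code < len(handlers):
        return handlers[key_code]
    return "ignore"
```

This is only meant to show why a plain one-shot prompt works while interactive chat fails: the bug is triggered by reading keys interactively, not by running the model.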
you can uninstall pyreadline3:
pip uninstall pyreadline3
and then install the fork:
pip install git+https://github.com/jplusc/pyreadline3.git
(details at https://github.com/jplusc/pyreadline3). If you are using a virtual environment, make sure you install the fork into the right one.
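Since the fork replaces a package inside the active environment, it can help to confirm which readline implementation Python is actually importing (a minimal check, assuming you run it with the venv's python; on Windows, pyreadline3 provides the readline module):

```python
import importlib.util

# Locate the readline module Python would import. On Windows with
# pyreadline3 installed, the path should point into the venv's
# site-packages; on Linux/macOS it is the GNU Readline binding.
spec = importlib.util.find_spec("readline")
if spec is None:
    print("no readline module found")
else:
    print(spec.origin)
```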
llm will now work as expected on Windows:
C:\llm>llm chat
Chatting with gpt-3.5-turbo
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> Five ludicrous names for a pet lobster
1. Sir Pinchy Pincerson
2. Clawd McPinchface
3. Lobsterella McPincherson
4. Crustacean the Barbarian
5. Larry the Lobster King
> what if they were actually fiddler crabs?
1. Captain Crabby McPincherson
2. Fiddlesticks the Fiddler
3. Crabby Patty
4. Sir Pinch-a-lot
5. Claws McGraw
> exit
C:\llm>
@jplusc Thanks! I can confirm that using your fork of pyreadline3 fixes the issue and allows me to run llm in chat mode.