
IndexError on Windows for llm chat

th opened this issue 1 year ago • 3 comments

This is Python 3.12.0 on Windows 11 in a venv with "pip install llm llm-gpt4all".

llm -m Meta-Llama-3-8B-Instruct "What's a double dual?" works, but "llm chat -m Meta-Llama-3-8B-Instruct" results in:

(llm) C:\th\llm>llm chat -m Meta-Llama-3-8B-Instruct
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\th\llm\Scripts\llm.exe\__main__.py", line 7, in <module>
  File "C:\th\llm\Lib\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\th\llm\Lib\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "C:\th\llm\Lib\site-packages\click\core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\th\llm\Lib\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\th\llm\Lib\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\th\llm\Lib\site-packages\llm\cli.py", line 345, in chat
    readline.parse_and_bind("\e[D: backward-char")
  File "C:\th\llm\Lib\site-packages\pyreadline3\rlmain.py", line 112, in parse_and_bind
    self.mode._bind_key(key, func)
  File "C:\th\llm\Lib\site-packages\pyreadline3\modes\basemode.py", line 181, in _bind_key
    keyinfo = make_KeyPress_from_keydescr(key.lower()).tuple()
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\th\llm\Lib\site-packages\pyreadline3\keysyms\common.py", line 138, in make_KeyPress_from_keydescr
    raise IndexError("Not a valid key: '%s'" % keydescr)
IndexError: Not a valid key: '\e[d'
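For anyone else hitting this: the failure comes from whichever module Python resolves when llm does "import readline". A quick way to check which backend your environment actually uses (a minimal sketch using only the standard library) is:

```python
import importlib.util

# Find the module that "import readline" would load in this environment.
spec = importlib.util.find_spec("readline")

if spec is None:
    print("no readline backend available")
else:
    # On Windows with pyreadline3 installed this path points into
    # site-packages (pyreadline3's readline shim); on Linux/macOS it
    # is the stdlib extension built against GNU Readline or libedit.
    print(spec.origin)
```

If the printed path points into site-packages, llm chat is going through pyreadline3 and is affected by this bug.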

th avatar Apr 25 '24 16:04 th

I am also seeing this error on Windows 11 (from Command Prompt or Powershell) when using llm chat -m modelname for both a local Meta-Llama-3-8B-Instruct model and the groq-openai-llama3 API.

Both models work fine when passing a prompt string as a command line argument.

stungeye avatar Apr 26 '24 16:04 stungeye

@stungeye @th

This appears to be a limitation in the pyreadline3 package (which is only used by the Windows version of llm, which is why the problem does not show up on Linux installs).

The pyreadline3 implementation lags behind the official GNU Readline package.

To work around this for now, I added the missing key definitions to a fork of pyreadline3.

You can uninstall pyreadline3:

pip uninstall pyreadline3

and then install the fork:

pip install git+https://github.com/jplusc/pyreadline3.git

(Details at https://github.com/jplusc/pyreadline3. If you are using a virtual environment, make sure you're in the right one.)

llm will now work as expected on Windows:

C:\llm>llm chat
Chatting with gpt-3.5-turbo
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> Five ludicrous names for a pet lobster
1. Sir Pinchy Pincerson
2. Clawd McPinchface
3. Lobsterella McPincherson
4. Crustacean the Barbarian
5. Larry the Lobster King
> what if they were actually fiddler crabs?
1. Captain Crabby McPincherson
2. Fiddlesticks the Fiddler
3. Crabby Patty
4. Sir Pinch-a-lot
5. Claws McGraw
> exit

C:\llm>
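Until a fix lands upstream, another option is for calling code to treat the binding as best-effort. A defensive sketch (not llm's actual code; `safe_parse_and_bind` is a hypothetical helper) would look like:

```python
def safe_parse_and_bind(binding: str) -> bool:
    """Try a readline key binding; return False if the active
    backend cannot parse it (hypothetical helper, not llm code)."""
    try:
        # Resolves to pyreadline3 on Windows, GNU readline elsewhere.
        import readline
    except ImportError:
        return False
    try:
        readline.parse_and_bind(binding)
        return True
    except (IndexError, ValueError):
        # pyreadline3 raises IndexError for escape-sequence key
        # descriptions such as "\e[D" that GNU readline accepts.
        return False
```

With a guard like this, an unsupported key description would cost an arrow-key binding but leave llm chat usable.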

jplusc avatar Jul 01 '24 23:07 jplusc

@jplusc Thanks! I can confirm that using your fork of pyreadline3 fixes the issue and allows me to run llm in chat mode.

stungeye avatar Jul 09 '24 12:07 stungeye