
Local LLM support

Open sarkaramitabh300 opened this issue 1 year ago • 8 comments

Hi, can you please provide a guide or support for using local LLM models, such as Ollama with Llama 3.1 8B or 70B?

sarkaramitabh300 avatar Aug 12 '24 04:08 sarkaramitabh300

https://github.com/InternLM/lagent/pull/228/files

Harold-lkk avatar Aug 12 '24 05:08 Harold-lkk

@Harold-lkk @sarkaramitabh300

Is there any example model.py that uses local Ollama models? I changed and added an ollama.py to lagent's llms; now I need an example for MindSearch's model.py.

MyraBaba avatar Aug 12 '24 07:08 MyraBaba

@Harold-lkk

I am serving Llama 3.1 8B via Ollama and get the errors below:

```
JSONDecodeError: Expecting value: line 1 column 1 (char 0). Skipping this line.
(the line above repeats 8 times)
ERROR:root:Exception in sync_generator_wrapper: local variable 'response' referenced before assignment
Traceback (most recent call last):
  File "/home/bc/Projects/ODS/MindSearch/mindsearch/app.py", line 69, in sync_generator_wrapper
    for response in agent.stream_chat(inputs):
  File "/home/bc/Projects/ODS/MindSearch/mindsearch/agent/mindsearch_agent.py", line 235, in stream_chat
    print(colored(response, 'blue'))
UnboundLocalError: local variable 'response' referenced before assignment
```
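For what it's worth, the `UnboundLocalError` is a follow-on symptom of the `JSONDecodeError`s: every streamed chunk fails to parse, so the loop body never assigns `response`, and a later reference to it blows up. A minimal sketch of that pattern with a defensive fix (the `stream_chat` name here is illustrative, not the actual MindSearch implementation):

```python
import json


def stream_chat(raw_lines):
    """Parse a stream of JSON lines, skipping malformed ones.

    Initializing `response` before the loop means it is defined even
    when every line fails to decode, which is exactly the case that
    triggers the UnboundLocalError in the traceback above.
    """
    response = None  # guard: defined even if no line parses
    for line in raw_lines:
        try:
            response = json.loads(line)
        except json.JSONDecodeError:
            # Matches the "Skipping this line." behavior in the log.
            continue
        yield response
    if response is None:
        raise RuntimeError("no valid JSON received from the model server")
```

The root cause is still upstream: Ollama's native endpoint streams a different wire format than the one the client expects, so fixing the URL/format mismatch matters more than the guard.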

MyraBaba avatar Aug 12 '24 09:08 MyraBaba

Has anyone been successful with Llama 3.1 8B via Ollama?

MyraBaba avatar Aug 12 '24 10:08 MyraBaba

Are there any updates about supporting Ollama?

benlyazid avatar Aug 26 '24 11:08 benlyazid

The front end uses Streamlit. Edit the `model_name` of `internlm_client` in models.py to `internlm2` and the `url` to `http://127.0.0.1:11434`, then run:

```
python -m mindsearch.app --lang en --model_format internlm_client --search_engine DuckDuckGoSearch
```
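For reference, the edit described above would look roughly like this in models.py. The field names and the `LMDeployClient` class are assumptions based on how the repository's other config entries are structured; check them against your checkout:

```python
# models.py -- sketch only; field names assumed, verify locally
internlm_client = dict(
    type=LMDeployClient,           # existing client class, left unchanged
    model_name='internlm2',        # changed as suggested above
    url='http://127.0.0.1:11434',  # Ollama's default serving port
    # keep the remaining fields (meta template, generation params) as-is
)
```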

Brzjomo avatar Aug 29 '24 15:08 Brzjomo

Thank you

benlyazid avatar Sep 02 '24 15:09 benlyazid

Was anyone able to run this successfully with different Ollama models, or with any other local setup? Are follow-ups working correctly? I tried, but only had success with the internlm2 model; the rest run into errors. Also, internlm2 7B seems to hallucinate and get stuck in loops.

kunwar-vikrant avatar Sep 23 '24 14:09 kunwar-vikrant