Could Not Pull Model
Describe the bug
After cloning the repo (commit 0b695f3eff6a8595631eabb2b64b8598f5cc43ed) and running the Docker container, I can reach the hosted web page, but issuing any query produces the following error on the page:
Model nomic-embed-text:v1.5 does not exist and could not be pulled: Post "http://host.docker.internal:11434/api/pull": dial tcp 172.17.0.1:11434: connect: connection refused
and the following shows in the console output:
llocalsearch-backend-1 | 2024/04/06 02:33:05 WARN Model does not exist, pulling it model=nomic-embed-text:v1.5
llocalsearch-backend-1 | 2024/04/06 02:33:05 ERROR Model does not exist and could not be pulled model=nomic-embed-text:v1.5 error="Post \"http://host.docker.internal:11434/api/pull\": dial tcp 172.17.0.1:11434: connect: connection refused"
To Reproduce Steps to reproduce the behavior:
- Go to the web frontend at http://localhost:3000/ after cloning the repo and running docker-compose up
- Click on any of the recommended queries or enter your own and hit enter
- See the error displayed on the web page and the error shown on the console from which the container was spun up.
Expected behavior The query is answered normally.
Screenshots N/A
Additional context
Tried switching to the tag v0.1 and had same issue but with different output.
Web page showed:
Model nomic-embed-text:v1.5 does not exist and could not be pulled: Post "http://192.168.0.109:11434/api/pull": dial tcp 192.168.0.109:11434: connect: no route to host
Console showed:
llocalsearch-backend-1 | 2024/04/06 02:46:12 WARN Model does not exist, pulling it model=nomic-embed-text:v1.5
llocalsearch-backend-1 | 2024/04/06 02:46:15 ERROR Model does not exist and could not be pulled model=nomic-embed-text:v1.5 error="Post \"http://192.168.0.109:11434/api/pull\": dial tcp 192.168.0.109:11434: connect: no route to host"
Looks like the backend isn't able to connect to ollama. Can you show me your ollama start command, maybe it's not listening on the right interface?
Btw fantastic bug report :)
Thanks for the fast response, I appreciate it!
FYI, I'm a professional developer, but frankly I was just trying to plug and play this, so I apologize that I haven't dug too deeply into my issue before reaching out. I was hoping it might be an easy fix. Regardless, I'm happy to work through this, as it seems like a great project you've got going here!
How can I best answer your question regarding the "ollama start command"? Where can I find this info? For what it's worth, I'm not doing anything past whatever is provided by default.
OH! I apologize... I just saw the requirements section of the readme (D'OH!), which states "A running Ollama server, reachable from the container". Sorry about that; I'm sure this is the issue. I'll give running one a shot and report back.
Yep, that was it... I wasn't running the Ollama server. Thanks again for your help. Closing the issue.
No problem :). I know the setup process isn't there yet. I'm working on a couple of tutorials, especially for the whole Ollama and networking part :)
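In the meantime, a quick way to confirm the backend container can actually reach Ollama is a small probe like the sketch below. It mirrors the manual check of hitting Ollama's root endpoint; the "Ollama is running" banner is what Ollama currently returns there, and the host.docker.internal URL is just an example matching the compose config above:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
	"time"
)

// checkOllama fetches the Ollama root endpoint and verifies the
// "Ollama is running" health banner. Any dial error (connection
// refused, no route to host, ...) is wrapped and returned.
func checkOllama(baseURL string) error {
	client := &http.Client{Timeout: 3 * time.Second}
	resp, err := client.Get(baseURL + "/")
	if err != nil {
		return fmt.Errorf("ollama unreachable: %w", err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return fmt.Errorf("reading response: %w", err)
	}
	if !strings.Contains(string(body), "Ollama is running") {
		return fmt.Errorf("unexpected response: %q", string(body))
	}
	return nil
}

func main() {
	// Example URL only; use whatever OLLAMA_HOST is set to in your setup.
	if err := checkOllama("http://host.docker.internal:11434"); err != nil {
		fmt.Println("check failed:", err)
		return
	}
	fmt.Println("Ollama is reachable")
}
```

Running this from inside the backend container (or any container on the same network) should reproduce exactly the dial errors shown in the logs above when Ollama isn't listening on a reachable interface.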
Commenting here because I'm seeing a very similar issue, though not exactly the same!
My Ollama server is reachable from the Llocalsearch backend container:
~ # env
HOSTNAME=1654d00d8c40
SHLVL=1
HOME=/root
OLLAMA_HOST=http://host.docker.internal:11434
CHROMA_DB_URL=http://chromadb:8000
TERM=xterm
SEARXNG_DOMAIN=http://searxng:8080
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
MAX_ITERATIONS=30
PWD=/root
~ # nc host.docker.internal 11434
GET / HTTP/1.1
Host: host.docker.internal
HTTP/1.1 200 OK
Content-Type: text/plain; charset=utf-8
Date: Sat, 20 Apr 2024 01:08:07 GMT
Content-Length: 17
Ollama is running
As you can infer from above, I also updated the docker-compose config to set the OLLAMA_HOST envvar. When I visit the webui, the model I have installed in Ollama (Llama3) is shown in the settings dialog! However, when I select it from the dropdown, close the modal, and try to make a query, I am greeted with an error:
Model does not exist and could not be pulled: model is required.
The error console for the backend container gives a little hint:
backend-1 | 2024/04/20 01:10:04 WARN Model does not exist, pulling it model=""
backend-1 | 2024/04/20 01:10:04 ERROR Model does not exist and could not be pulled model="" error="model is required"
So to me it looks like no model name is getting passed through when I select the model in the frontend. Am I missing a config step here? I'm doing this on Windows 10 (God help me!)
Let me know if I can provide more useful information; in the meantime I'll continue investigating on my own.
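If the root cause is an empty model name reaching the backend, a defensive guard there would at least turn the cryptic pull attempt into a clear validation error. A sketch of that idea (the ClientSettings type here is hypothetical, loosely mirroring the settings blob the backend logs, not the project's actual code):

```go
package main

import (
	"errors"
	"fmt"
)

// ClientSettings loosely mirrors the settings the backend logs;
// only the field relevant to this bug is shown.
type ClientSettings struct {
	ModelName string
}

// validateModel rejects an empty model name up front, instead of
// letting the pull request fail later with "model is required".
func validateModel(s ClientSettings) error {
	if s.ModelName == "" {
		return errors.New("no model selected: ModelName is empty, refusing to pull")
	}
	return nil
}

func main() {
	fmt.Println(validateModel(ClientSettings{ModelName: "llama3:latest"})) // <nil>
	fmt.Println(validateModel(ClientSettings{}))                           // validation error
}
```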
Following up to capture what I've seen. This does seem to be a bit different from the OP's issue, so I'll open a new issue tomorrow and move this commentary there; for now I just want to record what I see.
I tried using docker-compose.dev.yaml instead of docker-compose.yaml, and the main functionality worked:
- The search occurred.
- Sources were added.
- The vector DB was searched and a response was formulated!
I notice however that when using this compose file, the settings dialog doesn't appear when I click the model name in the upper left corner of the webui. (I can select a model, but I can't specify any parameters).
What I see:
What I'd expect:
Regardless of the compose file used, I notice that the backend does receive the correct model name (line 1 of the output below), but subsequently seems to drop it (lines 2 and 3).
backend-1 | 2024/04/20 06:07:51 INFO Client settings settings="{ContextSize:8192 MaxIterations:30 ModelName:llama3:latest Prompt:What is \"LLocalSearch\"? Session:default Temperature:0 ToolNames:[] WebSearchCategories:[] AmountOfResults:4 MinResultScore:0.5 AmountOfWebsites:10 ChunkSize:300 ChunkOverlap:100 SystemMessage: 1. Format your answer in markdown.\n2. You have to use your tools to answer questions.\n3. You have to provide sources / links you've used to answer the question.\n4. You may use tools more than once.\n5. Answer in the same language as the question.}"
backend-1 | 2024/04/20 06:07:51 WARN Model does not exist, pulling it model=""
backend-1 | 2024/04/20 06:07:51 ERROR Model does not exist and could not be pulled model="" error="model is required"