[BUG] Ollama embedding error 500
I decided to open individual, specific tickets instead of a single one (#2944) because, after a bit of investigation, the failures on the different embeddings have nothing in common.
Test performed on Flowise 2.0.1
Case: Flowise (Docker) => Ollama (no Docker, installed directly on Windows)
The Ollama endpoint is at http://192.168.178.76:11434. In this test I keep the LocalAI chat model and only use the Ollama embedding model (installed on Windows, no Docker).
The issue is shown in this screenshot:
while the Flowise error detail displayed in the Docker logs is:
Now let's repeat the same request in Postman. The endpoint is the same one I use in Flowise, http://192.168.178.76:11434, as visible here. The Postman test is successful.
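For anyone who wants to reproduce the check without Postman, here is a minimal sketch using Node's built-in fetch (Node 18+) against Ollama's `/api/embeddings` endpoint. The model name `nomic-embed-text` is an assumption; substitute whatever model you pulled. It also prints the vector length, which is useful for the dimension question further down:

```js
// test-embeddings.mjs — run with: node test-embeddings.mjs
// Reproduces the Postman request against Ollama's /api/embeddings endpoint.
// NOTE: the model name below is an assumption; use the model you pulled.
const res = await fetch('http://192.168.178.76:11434/api/embeddings', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'nomic-embed-text', prompt: 'hello world' }),
});
console.log('HTTP status:', res.status);
const { embedding } = await res.json();
// The vector length is the dimension to configure in the vector store.
console.log('embedding dimension:', embedding.length);
```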
It looks like others have already reported the same problem (error 500); see these Discord users:
I made further tests here after the findings of #2978 and #2977.
The reported tests use the following Ollama model. Ollama runs on a Windows machine (no Docker).
The model page doesn't state the embedding dimension; it is very likely 512 or 768 (the dimension check sketched above can confirm it). As soon as I changed the vector dimension in the vector store, the error changed to this:
Or to this in the debug log:
```
2024-08-08 12:36:59 [ERROR]: [server]: Error: TypeError: fetch failed
Error: TypeError: fetch failed
    at buildFlow (/usr/local/lib/node_modules/flowise/dist/utils/index.js:483:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async utilBuildChatflow (/usr/local/lib/node_modules/flowise/dist/utils/buildChatflow.js:227:36)
    at async createInternalPrediction (/usr/local/lib/node_modules/flowise/dist/controllers/internal-predictions/index.js:7:29)
```
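For what it's worth, a bare "fetch failed" from Node's fetch (undici) with no HTTP status usually indicates a network-level failure (connection refused, unreachable host, DNS) rather than a 500 returned by Ollama. A minimal sketch to distinguish the two from inside the container (container name is an assumption):

```js
// ping-ollama.mjs — copy into the container and run, e.g.:
//   docker exec -it <flowise-container> node /tmp/ping-ollama.mjs
// Distinguishes "Ollama answered with an error" from "the container
// cannot reach the Ollama host at all".
try {
    const res = await fetch('http://192.168.178.76:11434/api/tags');
    console.log('reachable, HTTP status:', res.status);
} catch (err) {
    // For undici's "fetch failed", err.cause holds the underlying
    // socket error (ECONNREFUSED, EHOSTUNREACH, ...).
    console.error('not reachable:', err.cause ?? err);
}
```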
I haven't tried Ollama on Windows, but I was able to get it working with both Flowise and Ollama on Docker, using http://host.docker.internal:8000/:
https://docs.flowiseai.com/integrations/langchain/chat-models/chatollama#additional
I used the model bge-m3 and it works.
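To make the base-URL difference concrete, here is a sketch of the same embedding call via host.docker.internal. It assumes Docker Desktop and Ollama's default port 11434; adjust the port to whatever you published (the comment above used 8000):

```js
// From inside a container on Docker Desktop, host.docker.internal resolves
// to the host machine, so the LAN IP is not needed.
// Port 11434 is Ollama's default; adjust if you mapped a different one.
const base = 'http://host.docker.internal:11434';
const res = await fetch(`${base}/api/embeddings`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'bge-m3', prompt: 'ping' }),
});
console.log('HTTP status:', res.status);
```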