
[BUG] Ollama embedding error 500

Open · IzzyHibbert opened this issue 1 year ago · 2 comments

I decided to open separate, specific tickets instead of a single one (#2944), because after a bit of investigation the failures on the different embeddings have nothing in common.

Tests performed on Flowise 2.0.1.

Case: Flowise (Docker) => Ollama (no Docker, running directly on Windows)

The Ollama endpoint is at http://192.168.178.76:11434. In this test I keep the LocalAI chat model and only use the Ollama embedding model (installed on Windows, no Docker).

The issue is shown in this picture: Screenshot 2024-08-07 alle 09 48 43. The Flowise error detail displayed in the Docker logs is here: Screenshot 2024-08-07 alle 09 49 31.

Now let's repeat the same request in Postman. The endpoint is the same one I use in Flowise, http://192.168.178.76:11434, and is visible in the screenshot below. The Postman test is successful.
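For anyone who wants to reproduce the Postman check without Postman, here is a minimal TypeScript sketch of the same request (Node 18+ for the built-in fetch). It assumes the classic Ollama /api/embeddings endpoint; the model name "nomic-embed-text" is only a placeholder, not necessarily the model used in my test:

```ts
// Sketch: send the same embeddings request Postman sends, directly to Ollama.
const OLLAMA_URL = "http://192.168.178.76:11434";

async function testEmbedding(): Promise<void> {
  const res = await fetch(`${OLLAMA_URL}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // "nomic-embed-text" is an assumption: substitute the model you pulled.
    body: JSON.stringify({ model: "nomic-embed-text", prompt: "hello world" }),
  });
  if (!res.ok) {
    // An HTTP 500 here would point at Ollama itself rather than Flowise.
    throw new Error(`Ollama returned HTTP ${res.status}`);
  }
  const data = (await res.json()) as { embedding: number[] };
  console.log(`Got an embedding of length ${data.embedding.length}`);
}

testEmbedding().catch(console.error);
```

If this succeeds from the machine (or container) where Flowise runs, the problem is more likely in how Flowise reaches the endpoint than in Ollama itself.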

Screenshot 2024-08-07 alle 09 50 53

It looks like others have already reported the same problem (error 500); see these Discord users:

Screenshot 2024-08-08 alle 10 20 14

Screenshot 2024-08-08 alle 10 19 24

Screenshot 2024-08-08 alle 10 19 08

IzzyHibbert commented Aug 08 '24 08:08

I ran further tests here after the findings in #2978 and #2977.

The reported tests use the following Ollama model. Ollama runs on a Windows machine (no Docker).

The model page doesn't state the vector dimension; it is very likely 512 or 768 (a quick way to check the real value is sketched after the debug log below). As soon as I changed the vector dimension in the vector store, the error changed to this:

Screenshot 2024-08-08 alle 14 37 04

Or this in the debug log:

2024-08-08 12:36:59 [ERROR]: [server]: Error: TypeError: fetch failed
Error: TypeError: fetch failed
    at buildFlow (/usr/local/lib/node_modules/flowise/dist/utils/index.js:483:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async utilBuildChatflow (/usr/local/lib/node_modules/flowise/dist/utils/buildChatflow.js:227:36)
    at async createInternalPrediction (/usr/local/lib/node_modules/flowise/dist/controllers/internal-predictions/index.js:7:29)
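Rather than guessing between 512 and 768, the actual dimension can be read off a single embedding returned by Ollama and used to configure the vector store. A minimal sketch, assuming the /api/embeddings endpoint and with the URL and model name as placeholders to adjust to your setup:

```ts
// Sketch: ask Ollama for one embedding and read its length, so the vector
// store dimension can be set to the real value instead of guessing.
const BASE = "http://192.168.178.76:11434";

async function embeddingDimension(model: string): Promise<number> {
  const res = await fetch(`${BASE}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: "dimension probe" }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status} from Ollama`);
  const { embedding } = (await res.json()) as { embedding: number[] };
  return embedding.length;
}

embeddingDimension("bge-m3").then((d) => console.log(`dimension: ${d}`));
```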


IzzyHibbert commented Aug 08 '24 12:08

I haven't tried Ollama on Windows, but I was able to get it working when both Flowise and Ollama run in Docker, using http://host.docker.internal:8000/

https://docs.flowiseai.com/integrations/langchain/chat-models/chatollama#additional
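For reference, here is a minimal sketch of what such a configuration looks like at the LangChain level, which Flowise builds on (my understanding of the node internals; the exact import path depends on your LangChain version). host.docker.internal lets a dockerized Flowise reach a service on the host; 11434 is Ollama's default port and "bge-m3" is just an example model, so adjust both to your setup:

```ts
// Sketch: point the Ollama embeddings base URL at the Docker host.
import { OllamaEmbeddings } from "@langchain/community/embeddings/ollama";

const embeddings = new OllamaEmbeddings({
  model: "bge-m3",                              // example model
  baseUrl: "http://host.docker.internal:11434", // Docker -> host networking
});

async function main(): Promise<void> {
  const vector = await embeddings.embedQuery("connectivity check");
  console.log(`embedding length: ${vector.length}`);
}

main().catch(console.error);
```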

HenryHengZJ commented Aug 09 '24 11:08

I used the model bge-m3 and it works.

IzzyHibbert commented Oct 10 '24 10:10