rplescia
I thought it was weird too. I'm using the all-MiniLM-L6-v2 model that comes with the 'full' build. I can see the document chunks, and the embedding seems to run without any...
I have done more testing, and it affects some models but not others. I have tested all the built-in models with the same parameters and the same file; Nomic and...
> And by the way, when I tried
>
> `docker build -f Dockerfile.scratch -t infiniflow/ragflow:dev .`
>
> I got the following error:
>
> ```
> 61.91 npm...
> ```
To fix this issue, change one thing in Dockerfile.scratch, line 22: change "RUN curl -sL https://deb.nodesource.com/setup_14.x | bash -" to "RUN curl -sL https://deb.nodesource.com/setup_20.x | bash -". After that the build...
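For reference, here is a sketch of what the edited line would look like in context. The surrounding instructions are illustrative assumptions, not the actual contents of Dockerfile.scratch; only the NodeSource setup line is taken from the comment above.

```dockerfile
# ... earlier instructions in Dockerfile.scratch (hypothetical context) ...

# Line 22: switch from the EOL Node.js 14.x setup script to 20.x,
# since the 14.x script is no longer served by NodeSource
RUN curl -sL https://deb.nodesource.com/setup_20.x | bash -
RUN apt-get install -y nodejs  # assumed follow-up install step

# ... remaining build steps ...
```

The NodeSource setup script adds the apt repository for the requested Node.js major version; swapping `setup_14.x` for `setup_20.x` is the whole fix.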
> user management method, currently if I expose it to the internet, anyone can register
>
> PostgresSQL support

I agree about user management; this feature is very much needed. Even if...
This is also something I'm interested in: OAuth2 and LDAP, specifically, for me.
@KevinHuSh Hi, I'm still having this issue in 0.15.1. It happens when I use Ollama (llama3.1), and also when I use a **Together.ai** LLM (llama 3.3).
@KevinHuSh This is still a bug for me in 0.15.1 using either **Ollama** or **Together.ai** as an inference server. `2025-01-20 13:31:37,682 INFO 17 HTTP Request: POST http://ollamainference:11434/v1/chat/completions "HTTP/1.1 200 OK"`...
I tried it with Together.ai using llama3.3, and the same thing happens.
Support for Azure Blob storage would be greatly appreciated.