qifuxiao
I run it in Colab; here is my config.yml:

```yaml
models:
  - type: main
    engine: huggingface_hub
    model: baichuan-inc/Baichuan2-7B-Chat

rails:
  input:
    flows:
      - self check input
  output:
    flows:
      - self check output
```

jupyter...
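For the `self check input` and `self check output` flows to run, the config directory also needs prompts for the corresponding tasks, typically in a `prompts.yml` next to `config.yml`. A minimal sketch (the prompt wording here is illustrative, not the official text; adapt it to your policy):

```yaml
# prompts.yml — hypothetical sketch of the self-check prompts
prompts:
  - task: self_check_input
    content: |
      Your task is to check if the user message below complies with the policy.

      User message: "{{ user_input }}"

      Answer with "yes" (blocked) or "no" (allowed).

  - task: self_check_output
    content: |
      Your task is to check if the bot message below complies with the policy.

      Bot message: "{{ bot_response }}"

      Answer with "yes" (blocked) or "no" (allowed).
```

If these prompts are missing, the self-check rails have nothing to send to the model and typically fail at startup.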
I run phi3 with `ollama run phi3:3.8b`, and I test it with:

```shell
curl http://localhost:11434/api/generate -d '{
  "model": "phi3:3.8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

I get a response. Then...
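The same request can be issued from Python with only the standard library. This is a minimal sketch of the curl test above; `OLLAMA_URL` assumes the default Ollama port, and calling `generate()` requires a running server:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def build_generate_payload(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the same JSON body the curl test sends to /api/generate."""
    return {"model": model, "prompt": prompt, "stream": stream}

def generate(model: str, prompt: str) -> dict:
    """POST the payload and return the parsed JSON response.

    Only works against a running Ollama server; it is not called here.
    """
    data = json.dumps(build_generate_payload(model, prompt)).encode()
    req = request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

With `stream: false` the server returns one JSON object whose `response` field holds the full answer, matching what the curl command prints.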
### Describe your problem

I deployed RAGFlow successfully with docker compose and added models via Ollama; the test tasks work fine. Now I want to use the Xinference framework for model management, but when I add a model I get the error: Fail to access embedding model(bge-large-zh-v1.5).Connection error. I checked the ragflow-server logs:

```
NoneType: None
2025-02-28 10:04:43,961 INFO 14 172.19.0.6 - - [28/Feb/2025 10:04:43] "POST /v1/llm/add_llm HTTP/1.1" 200 -
```

...
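One common cause of this kind of connection error (an assumption here, not confirmed by the logs above): the ragflow-server container cannot reach a Xinference instance registered as `localhost`, because inside the container `localhost` refers to the container itself. On Docker Desktop the host is reachable as `host.docker.internal`; on Linux, the host's LAN IP or a shared Docker network is typically used instead. A small sketch of rewriting such a base URL:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_base_url(url: str) -> str:
    """Rewrite localhost/127.0.0.1 so a container can reach the Docker host.

    host.docker.internal is an assumption that holds on Docker Desktop;
    on plain Linux, substitute the host's actual IP.
    """
    parts = urlsplit(url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        port = f":{parts.port}" if parts.port else ""
        parts = parts._replace(netloc=f"host.docker.internal{port}")
    return urlunsplit(parts)
```

Checking that the rewritten URL is reachable from inside the container (e.g. with `docker exec <ragflow-container> curl <url>`) helps confirm whether this is the actual cause.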