MiniCPM
[Feature Request]: Support deployment with ollama
Feature request
I'm very excited about the model's performance, and I hope you can provide a way to deploy it with ollama.
We are actively collaborating with llama.cpp and ollama to support ollama calls as soon as possible. Thank you for your attention!
We have added llama.cpp support; see this section. Ollama still has some issues.
Could you try this one? https://ollama.com/roger/minicpm
It produces abnormal output.