How to use Ollama
I successfully used Groq, and now I want to use Ollama, but I don't know how to configure it in keys.ts.
Setting ollama: 'http://localhost:11434' didn't work.
@liuyu-2003 - try ollama: "http://localhost:11434/api/chat",
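For reference, here is a minimal sketch of what that keys.ts entry might look like, assuming the file exports a plain map of provider keys/endpoints; the exact shape depends on the project, so treat the surrounding structure (and the groq placeholder) as illustrative:

```ts
// keys.ts — illustrative shape; match your project's actual export.
// Cloud providers hold API keys, while the ollama entry is just the
// local chat endpoint, since a local Ollama server needs no key.
export default {
  groq: 'YOUR_GROQ_API_KEY',                  // placeholder, not a real key
  ollama: 'http://localhost:11434/api/chat',  // local Ollama chat endpoint
};
```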
Besides setting this, does anything else need to change? Which local Ollama model does it call by default, and how do I set the default model?
You can run 'ollama serve' to start the server, then 'ollama run llava-llama3'. The llava-llama3 model is then reachable via the URL above. If you haven't downloaded the model yet, run 'ollama pull llava-llama3' first.
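If it helps to see what that endpoint expects, here is a minimal TypeScript sketch of calling it directly. Ollama's /api/chat has no global default model: each request names the model in its body, so whatever model the app sends there is the one used. The function name is hypothetical, and llava-llama3 matches the example above:

```ts
// Sketch: POST to the local Ollama chat endpoint with an explicit model.
async function chatWithOllama(prompt: string): Promise<string> {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llava-llama3',                       // must be pulled locally first
      messages: [{ role: 'user', content: prompt }],
      stream: false,                               // one JSON reply, not a stream
    }),
  });
  const data = await res.json();
  return data.message.content;                     // the assistant's reply text
}

chatWithOllama('Hello!').then(console.log);
```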