Unable to configure an Ollama embedder & generator
Description
Hey everyone, I just cloned Verba locally and set the environment variables. I want to use Ollama for Embedding and Generation (using Llama3), but after running my Verba instance I can't see where to choose the Ollama generator model in the settings. Ollama is running at http://localhost:11434
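For reference, the Ollama-related lines in my .env look roughly like this (variable names taken from the example .env; the exact model tag may differ depending on which Llama3 build you pulled):

```
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3
```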
Did anyone have the same problem? Please let me know what I missed. Thanks!
Is this a bug or a feature?
Bug
Thanks for the issue! Does the Overview page show that both OLLAMA_URL and OLLAMA_MODEL are set?
@thomashacker Yes, they're both set, and I can now see them in the frontend, but they don't work for some reason. How can I use Ollama for both generation and embedding? In the example .env file only one model name is specified, and I can't use the same model for both embeddings and generation.
The second issue is that I can't switch between embedding models: once Ollama is selected as the embedder, I can't select another model.
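To rule out a connection problem, I also checked that Ollama is reachable from the machine running Verba and that the model is actually pulled. A rough sketch of what I ran (assuming the default Ollama port):

```python
import os
import requests

# Where Verba expects Ollama to be; fall back to the local default.
ollama_url = os.environ.get("OLLAMA_URL", "http://localhost:11434")

# /api/tags lists the models available to this Ollama instance.
resp = requests.get(f"{ollama_url}/api/tags", timeout=5)
resp.raise_for_status()

models = [m["name"] for m in resp.json().get("models", [])]
print("Ollama is reachable, models:", models)

# The value of OLLAMA_MODEL should appear in that list (e.g. "llama3:latest").
print("OLLAMA_MODEL set to:", os.environ.get("OLLAMA_MODEL"))
```

This lists the llama3 tag on my side, so the Ollama server itself looks fine.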
Are you able to change the Embedder and Generator in the RAG tab? Make sure to press Save after making changes. Could you also provide more information about what you mean by "can't select another model"?
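If switching models in the RAG tab still doesn't take effect after saving, it can also help to confirm that the selected model actually returns embeddings when queried directly. A quick sketch against Ollama's embeddings endpoint, assuming the default port and that the model name matches what you pulled:

```python
import requests

OLLAMA_URL = "http://localhost:11434"

# Ask Ollama for an embedding from the model Verba is configured to use.
resp = requests.post(
    f"{OLLAMA_URL}/api/embeddings",
    json={"model": "llama3", "prompt": "quick embedding sanity check"},
    timeout=30,
)
resp.raise_for_status()

embedding = resp.json().get("embedding", [])
print(f"Got an embedding of dimension {len(embedding)}")
```

If that returns a non-empty vector, the model can serve embeddings and the issue is more likely in the frontend selection than in Ollama itself.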