pramit-j2-sl
Same here, for Llama 3.1 through Ollama. In the UI it shows "No Chunks Available", and the logs show the same thing others mentioned: `verba-1 | ✔ Received query: hello verba-1 |...`
> I was able to get this to work by using mxbai-embed-large for embedding. It should be as simple as `ollama pull mxbai-embed-large`. I made both models available: "llama3.1:8b" and "mxbai-embed-large:latest"...
Okay, I understand the issue now: it probably requires the exact model name in the env variable. In my case these are `OLLAMA_MODEL=llama3.1:8b` and `OLLAMA_EMBED_MODEL=mxbai-embed-large:latest`. One out-of-context question: I see it searches only...
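For anyone landing here later, a minimal sketch of the full setup under the assumptions above. The `OLLAMA_URL` value and the docker-compose-style `.env` are assumptions for a typical containerized Verba deployment, not something confirmed in this thread; adjust to wherever your Ollama instance is reachable.

```sh
# Pull both models so the names referenced by Verba resolve exactly (tags included)
ollama pull llama3.1:8b
ollama pull mxbai-embed-large
ollama list   # confirm the exact local names, e.g. "mxbai-embed-large:latest"

# Append the Ollama settings to the .env read by Verba
# (OLLAMA_URL below is an assumption for a docker-compose setup)
cat >> .env <<'EOF'
OLLAMA_URL=http://host.docker.internal:11434
OLLAMA_MODEL=llama3.1:8b
OLLAMA_EMBED_MODEL=mxbai-embed-large:latest
EOF
```

After changing the `.env` you would also need to recreate the container (e.g. `docker compose up -d`) so the new values are picked up.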
@mikowals Thanks, that worked. Planning to add the following to the developer doc:

```
MOJO_PATH=$(modular config mojo.path) \
&& BASHRC=$( [ -f "$HOME/.bash_profile" ] && echo "$HOME/.bash_profile" || echo "$HOME/.bashrc" ...
```
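The block above is cut off in the comment. It looks like the standard Modular CLI shell-setup snippet; a sketch of how that snippet typically continues is below. The exported variable names and the final `source` line are assumptions based on the usual pattern, not taken from the truncated comment, so double-check against the actual developer doc.

```sh
MOJO_PATH=$(modular config mojo.path) \
&& BASHRC=$( [ -f "$HOME/.bash_profile" ] && echo "$HOME/.bash_profile" || echo "$HOME/.bashrc" ) \
&& echo 'export MODULAR_HOME="'$HOME'/.modular"' >> "$BASHRC" \
&& echo 'export PATH="'$MOJO_PATH'/bin:$PATH"' >> "$BASHRC" \
&& source "$BASHRC"
```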