richardstevenhack
Second this. Also provide support for OpenAI API-compatible local models and servers such as MSTY, LMStudio, AnythingLLM, etc. All that requires is the ability to change model names and make...
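To illustrate how small the ask is: an OpenAI-compatible local server such as Ollama, LM Studio, or AnythingLLM differs from the hosted API only in its base URL and model name. A minimal sketch of building such a request (the port and model name below are assumptions; Ollama's default OpenAI-compatible endpoint is `http://localhost:11434/v1`, LM Studio typically uses port 1234):

```python
import json
import urllib.request

# Assumed defaults -- substitute your own local server and model.
BASE_URL = "http://localhost:11434/v1"
MODEL = "llama3"  # whatever model you have pulled locally

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat-completions request.
    Pointing a client at a local server is just this: a different
    base URL and a different model name."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello")
```

Everything else in the protocol is identical, which is why letting the user edit those two strings is enough to support all of these servers.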
I second this. I looked at the docs about "Ollama integration", but all that does is set up the server endpoint. You can't select an Ollama model already downloaded where...
Yup, that's the idea.
> "I was able to symlink ollama models to Jan using https://github.com/sammcj/gollama"

I use gollama to link to LMStudio. How did you use it to link to Jan? Did you...
I agree. If everyone rallied around Ollama as the main AI server for PCs, and other programs concentrated on the UI and additional features on top, things would be easier....
I followed the above advice, and I now see that Jan has added an import option, `Keep Original Files & Symlink`: "You maintain your model files outside of...
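That "Keep Original Files & Symlink" option amounts to the same mechanism gollama uses: the model file stays where it is, and the app's model directory only holds a symlink pointing back at it. A rough illustration of the mechanism (the paths and filename here are made up; substitute your real model store and Jan's models directory):

```python
import os
import tempfile

# Hypothetical stand-ins for the real directories: src_dir plays the
# role of your existing model store, dst_dir the app's models folder.
src_dir = tempfile.mkdtemp()
dst_dir = tempfile.mkdtemp()

src = os.path.join(src_dir, "llama3.gguf")
with open(src, "w") as f:
    f.write("fake gguf bytes")  # placeholder for a real multi-GB model

# The original file is never copied or moved; the app's directory
# just gains a link that resolves to it.
dst = os.path.join(dst_dir, "llama3.gguf")
os.symlink(src, dst)

print(os.path.islink(dst))      # True -- the app sees a link...
print(os.readlink(dst) == src)  # True -- ...resolving to the original
```

The practical upside is that several frontends can share one on-disk copy of a large GGUF file instead of each importing its own duplicate.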
I'll reiterate: this does NOT work. See the attached screenshots.
I tried again, but this time I set the provider to Ollama instead of OpenAI-compatible, and that does work. So I think the problem is resolved. However, you might...
I have to say, this PO Token stuff is ridiculous. I tried following the PO Token Guide, and it simply doesn't reflect what I'm seeing in the Network tab of...
Well, this is weird. I just updated to the latest [today] version, and suddenly the video that was giving me a 403 is downloading... Did you guys just fix this?