rony432
Latest update is working, thank you. (Codespaces is very slow, as usual 👍 :) ) In the terminal, pull the models for Ollama first, run `export OPENAI_API_KEY=xxx`, then `python -m openui` in...
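For anyone following along, here is roughly what that sequence looks like; the model name and key value are placeholders, not taken from the original comment:

```bash
# Pull a model into Ollama first (llama3 is just an example name)
ollama pull llama3

# The thread suggests exporting an OpenAI key even when only Ollama is used;
# the value here is only a placeholder
export OPENAI_API_KEY=xxx

# Start the OpenUI backend
python -m openui
```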
For Docker, get into the container terminal and run `export OPENAI_API_KEY=xxx`. Kill Ollama, then re-run `ollama serve`, then try again. I got a 500 error when it could not find...
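A rough sketch of those steps, assuming Ollama runs inside the same container; the container name is hypothetical:

```bash
# Container name "openui" is a placeholder; substitute your actual container
docker exec -it openui /bin/bash

# Inside the container:
export OPENAI_API_KEY=xxx

# Restart Ollama so it serves requests again
pkill ollama
ollama serve &
```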
Note for the dev: increase the reply timeout when Ollama is active.
I have Ollama, but not on this machine (it's remote). I am looking to configure it as a custom endpoint. Is there a sample config?
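Not an official config, but if the backend goes through the standard Ollama client it should honor `OLLAMA_HOST`; a sketch, with the remote address as a placeholder:

```bash
# Remote address is a placeholder; adjust host and port to your setup.
# This assumes the backend uses the standard Ollama client, which reads OLLAMA_HOST.
export OLLAMA_HOST=http://192.168.1.50:11434

# Dummy OpenAI key as suggested earlier in the thread
export OPENAI_API_KEY=xxx

python -m openui
```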
Check this out: https://docs.crewai.com/how-to/LLM-Connections/#setting-up-ollama. In crewAI the user can change the endpoint globally or per agent. Please add support for changing the model in fabric too. Side quest: integration as tools for...
The update was very fast, thank you. A few improvement notes: there should be a `~/fabric.conf` file containing the endpoint addresses for the Ollama endpoints, e.g.

```
#--remoteOllamaServer
default="http://localhost:11434"  #default...
```
To act as an OpenAI-compatible endpoint, Ollama still needs a key. Try this in your curl request: `"OPENAI_API_KEY": "NA"`. It should get around the error. Question for the dev: does fabric...
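For illustration, a sketch of a curl request against Ollama's OpenAI-compatible endpoint with a throwaway key; the model name is a placeholder:

```bash
# Ollama exposes an OpenAI-compatible API under /v1; the Authorization header
# can carry any placeholder token, it just needs to be present for strict clients.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer NA" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "hello"}]
  }'
```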
> I run Fabric on Windows WSL, and run Ollama on native Windows. `fabric --listmodel` doesn't seem to list any models from Ollama.
>
> How do I set up...
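Not a confirmed fix, but the usual pattern for reaching a Windows-hosted Ollama from WSL is to make Ollama listen on all interfaces and point the WSL side at the Windows host's IP; a sketch:

```bash
# On the Windows side (PowerShell), let Ollama listen on all interfaces,
# then restart it:
#   setx OLLAMA_HOST 0.0.0.0

# On the WSL2 side, the Windows host is usually the nameserver in /etc/resolv.conf:
WIN_HOST=$(awk '/^nameserver/ {print $2; exit}' /etc/resolv.conf)
export OLLAMA_HOST="http://${WIN_HOST}:11434"

# Quick reachability check; this should return the installed models as JSON
curl "$OLLAMA_HOST/api/tags"
```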
I'm interested too.
With the new Codespace it's not working. The problem is the official OpenAI package; all projects that use it have problems with Ollama. Ollama was installed manually.

```
/workspaces/openui (main) $ ollama list
NAME ID...
```
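If the issue is the official `openai` client defaulting to api.openai.com, one thing worth trying is pointing it at Ollama's OpenAI-compatible endpoint via environment variables; whether OpenUI passes these through is an assumption, not something verified here:

```bash
# The v1 "openai" Python client reads these environment variables by default;
# whether OpenUI forwards them is an assumption on my part.
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_API_KEY="NA"   # any non-empty placeholder works

python -m openui
```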