Matthieu Beaumont
> @wrapss Models show up now but when trying to chat with them I get

Did you launch Ollama with `OLLAMA_ORIGINS=*`?
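For reference, a minimal way to set that from a Linux/macOS shell (systemd installs need the variable in the service environment instead, e.g. via `systemctl edit ollama`):

```sh
# Allow cross-origin requests to Ollama from any origin
OLLAMA_ORIGINS=* ollama serve
```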
Try opening http://ip:3000/api/localhost/ollama. What's the result? (Ctrl+F5 to clear the cache.)
Oh, I just understood: you're in production mode. See https://github.com/mckaywrigley/chatbot-ui/blob/main/app/api/localhost/ollama/route.ts, line 4.
Try removing the condition `if (process.env.NODE_ENV !== "production") {}`.
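For illustration, a sketch of the route with that guard removed. The fetch target below is an assumption, not the actual file contents; keep whatever the real handler does inside the block:

```ts
// app/api/localhost/ollama/route.ts (sketch only)
export async function GET() {
  // was: if (process.env.NODE_ENV !== "production") { ...handler body... }
  // Assumed body: proxy Ollama's model-list endpoint on its default port.
  const response = await fetch("http://localhost:11434/api/tags")

  return new Response(JSON.stringify(await response.json()), {
    headers: { "Content-Type": "application/json" }
  })
}
```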
@mckaywrigley is this condition really necessary?
Do you see the chat request arriving in the Ollama logs, or only the tags request?
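You can reproduce both requests by hand to compare (assuming Ollama on its default port; `llama3` is a placeholder model name):

```sh
# The "tags" request the UI makes to list models
curl http://localhost:11434/api/tags

# A minimal chat request, which is what should show up when you actually chat
curl http://localhost:11434/api/chat \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "hi"}]}'
```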
Which machine is Supabase installed on? The address you set in the .env has to match that.
So in your environment file, `NEXT_PUBLIC_SUPABASE_URL` is the IPv4 of the machine Supabase is installed on, not localhost?
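Something like this, where `192.168.1.50` is a placeholder for that machine's LAN IPv4 and `54321` is the default local Supabase API port:

```sh
# .env / .env.local
NEXT_PUBLIC_SUPABASE_URL=http://192.168.1.50:54321
```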
And what if you launch chatbot-ui with `npx next dev -H 0.0.0.0` and access it without the SSH tunnel?
You need the Supabase CLI to run `npm run db-types`.
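For example (Homebrew works on macOS/Linux; installing the CLI as a dev dependency with `npm install supabase --save-dev` also works):

```sh
brew install supabase/tap/supabase
npm run db-types
```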