heiqs

2 comments by heiqs

Post more info on your system, your hosts file, and your .env. Make sure 127.0.0.1 is mapped to localhost.
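
For reference, a minimal sketch of one way to verify that mapping (assumes a standard Python install; it just asks the system resolver, which consults the hosts file first on most systems, to resolve `localhost`):

```python
import socket

# Resolve "localhost" through the system resolver and check that it
# maps to the IPv4 loopback address, as the hosts file should provide.
resolved = socket.gethostbyname("localhost")
print(f"localhost resolves to {resolved}")
assert resolved == "127.0.0.1", "localhost is not mapped to 127.0.0.1"
```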

Why can't we run other models like llama3.2 when using ollama? `{"message": "400: Invalid value: Could not find provider for Llama3.1-8B-Instruct"}`. Only Llama3.1-8B-Instruct works when using ollama for inference.
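
One thing worth checking is whether the ollama server actually has the other models pulled locally. A small sketch, assuming ollama is running on its default port 11434 (the `/api/tags` endpoint lists the models the server has available):

```python
import json
import urllib.request

# Ask the local ollama server which models it has pulled.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])
```

If llama3.2 is missing from that list, `ollama pull llama3.2` fetches it; the 400 above can also come from the stack's provider config not having the model registered, depending on your setup.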