Comments of Vinicius Luiz Pacheco (1 result)
Hi there, what if I am using a local LLM, such as:

```python
llm_deepseek = Ollama(
    model="deepseek-r1:1.5b",
    base_url="http://localhost:11434",
    temperature=0.7,
)
```

... and I get the same error: `ERROR:root:Failed to get supported params:...`