3 comments by Jongeun Baek

I am using llama3 (70B or 8B) on Ollama. The config is below:

```yaml
llm:
  max_retries: 1000000
  max_tokens: 1024
  model: llama3:70b-instruct-fp16
  platform: open_webui
  temperature: 0
  top_p: 1
```
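For context, a minimal sketch of how these same parameters could be passed to Ollama through its official Python client (the `ollama` package). This is an illustration, not how the project itself consumes the YAML; mapping `max_tokens` to Ollama's `num_predict` option is my assumption.

```python
# Minimal sketch, assuming the official `ollama` Python client.
# Not necessarily how the original project loads the YAML config above.
import ollama

response = ollama.chat(
    model="llama3:70b-instruct-fp16",  # model name from the YAML config
    messages=[{"role": "user", "content": "Hello"}],
    options={
        "temperature": 0,      # deterministic sampling, per the config
        "top_p": 1,
        "num_predict": 1024,   # assumed Ollama equivalent of max_tokens
    },
)
print(response["message"]["content"])
```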

I use LangGraph and have customized it. I don't think it's possible to reproduce unless I give you the whole file. Is it possible to just turn it off? But here's a...