Jongeun Baek
Did you solve the problem?
bug: Langfuse was not able to parse the LLM model. The LLM call will be recorded without model name.
I am using llama3 (70B or 8B) on Ollama. My config is below:

```yaml
llm:
  max_retries: 1000000
  max_tokens: 1024
  model: llama3:70b-instruct-fp16
  platform: open_webui
  temperature: 0
  top_p: 1
```
I use LangGraph and have customized it. I don't think it's possible to reproduce unless I give you the whole file. Is it possible to just turn this off? But here's a...