maxwellVisual

Results: 1 comment of maxwellVisual

Me too. I'm using gpt-oss-20b with llama.cpp (llama-server); it seems llama-server has a stricter parser. LiteLLM error message:

```log
06:32:45 - LiteLLM Proxy:ERROR: common_request_processing.py:699 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.InternalServerError:...
```