Output from reasoning models not parsed correctly
LocalAI version: 3.9.0
Environment, CPU architecture, OS, and Version: Not relevant
Describe the bug
When using a reasoning model such as Qwen-3*, the response is not returned in the form expected from the OpenAI API. Instead of placing the content of the
To Reproduce
Run qwen3-*b and request a chat completion; a reproduction sketch follows below.
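A minimal reproduction sketch, assuming a LocalAI instance on its default port 8080 with a qwen3 model loaded; the model name `qwen3-4b` and the prompt are placeholders, not taken from the original report:

```python
# Reproduction sketch (assumptions: LocalAI on http://localhost:8080,
# a qwen3-*b model loaded under the placeholder name "qwen3-4b").
import json
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "qwen3-4b",  # placeholder: use whichever qwen3-*b model is installed
        "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    },
    timeout=600,
)
resp.raise_for_status()

# Print the raw message object so it is easy to see how the reasoning
# output was (or was not) parsed into the response fields.
print(json.dumps(resp.json()["choices"][0]["message"], indent=2))
```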
Expected behavior
LocalAI should return the response according to the OpenAI specification.
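For reference, a sketch of the message shape an OpenAI-compatible chat completion carries; keeping the chain-of-thought in a separate `reasoning_content` field is an assumption based on the convention used by other OpenAI-compatible servers for reasoning models, while the report itself only states that the response should follow the OpenAI specification:

```python
# Sketch of the expected choices[0].message shape. The final answer belongs
# in "content"; a separate "reasoning_content" field is an assumed convention,
# not something the original report spells out.
expected_message = {
    "role": "assistant",
    "content": "The sky appears blue because of Rayleigh scattering ...",
    "reasoning_content": "<the model's internal reasoning, if exposed at all>",
}
```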
Logs
N/A

Additional context
N/A