# Model not returning answers
Question
### Description

When using `qwen2.5-coder:14b` with OpenCode, the model attempts to call tools (like `todorantum`) for simple conversational inputs instead of responding normally.

### Steps to Reproduce

1. Set up OpenCode with the Ollama provider
2. Use model: `qwen2.5-coder:14b`
3. Send a simple greeting: `hi`
4. The model responds with a tool call instead of a direct response
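The behavior can also be checked against the Ollama endpoint directly, bypassing OpenCode, to see whether the tool-call JSON comes from the model itself. A minimal sketch, assuming the default OpenAI-compatible endpoint on `localhost:11434` (port and model tag mirror the report):

```ts
// Send the same greeting straight to Ollama's OpenAI-compatible endpoint.
// If the raw model already answers with tool-call JSON here, the problem
// lies in the model/template rather than in OpenCode.
const res = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "qwen2.5-coder:14b",
    messages: [{ role: "user", content: "hi" }],
  }),
});

const data = await res.json();
console.log(data.choices[0].message); // Expected: a plain greeting in `content`
```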
### Expected Behavior

```
User: hi
Assistant: Hello! How can I help you today?
```

### Actual Behavior

```
User: hi
Assistant: { "name": "todorantum", "arguments": { "input": "hi" } }
```

### Environment

- Provider: Ollama
- Model: qwen2.5-coder:14b
- Ollama Version: 0.13.5
- GPU: NVIDIA RTX 5070 Ti (16GB VRAM)
- OS: Arch Linux
### Model

```
architecture        qwen2
parameters          14.8B
context length      32768
embedding length    5120
quantization        Q4_K_M
```

### Capabilities

```
completion
tools
insert
```

### System

```
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.
```

### License

```
Apache License
Version 2.0, January 2004
...
```
"provider": { "ollama": { "npm": "@ai-sdk/openai-compatible", "name": "Ollama (local)", "options": { "baseURL": "http://localhost:11434/v1" }, "models": { "qwen2.5-coder:14b": { "name": "qwen2.5-coder:14b", "tools": true, "thinking": true, } }, } }
This issue might be a duplicate of existing issues. Please check:
- #5694: Local Ollama models are not agentic - Similar problem with Ollama models not functioning correctly
- #2728: Cannot use tools with qwen2.5-coder and Ollama - Directly related to the qwen2.5-coder model showing tool call JSON instead of executing
- #1034: Local Ollama tool calling either not calling or failing outright - Similar pattern where Ollama models generate tool calls but don't execute them properly
Feel free to ignore if your specific case is different from these.
What `num_ctx` did you set for the model?
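For context while answering, one way to test with an explicit context window is Ollama's native `/api/chat` endpoint, which accepts `options.num_ctx` per request. A sketch, with `32768` chosen only to mirror the model's reported context length, not as a recommended value:

```ts
// Query Ollama's native API with an explicit context window.
// num_ctx: 32768 is illustrative; it mirrors the model's reported context length.
const res = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "qwen2.5-coder:14b",
    messages: [{ role: "user", content: "hi" }],
    options: { num_ctx: 32768 },
    stream: false,
  }),
});

const data = await res.json();
console.log(data.message.content);
```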