Local Ollama models are not agentic
Description
If I use opencode with BigPickle it works. If I use it with any Ollama model, whether local or cloud, it is not able to see files. I know that Ollama does allow tool calling, and I have even programmed it myself, so there must be a bug in opencode's Ollama support. This is on Linux.
OpenCode version
1.0.164
Steps to reproduce
- Install Ollama.
- Run `ollama run nemotron-3-nano:30b-cloud`.
- Edit `~/.opencode/opencode.json` and add:

  ```json
  "ollama": {
    "npm": "@ai-sdk/openai-compatible",
    "name": "Ollama (local)",
    "options": { "baseURL": "http://localhost:11434/v1" },
    "models": {
      "nemotron-3-nano:30b-cloud": { "name": "nemotron-3-nano-cloud" }
    }
  }
  ```

- Run it and have it write a hello world program, or look at a hello world program. It can't do it. It should be able to.
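For completeness, here is a sketch of the full config file. It assumes the top-level `provider` key that the opencode docs use for custom providers, so treat the exact nesting as an assumption:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": { "baseURL": "http://localhost:11434/v1" },
      "models": {
        "nemotron-3-nano:30b-cloud": { "name": "nemotron-3-nano-cloud" }
      }
    }
  }
}
```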
Screenshot and/or share link
No response
Operating System
No response
Terminal
No response
This issue might be a duplicate of existing issues. Please check:
- #5187: Ollama: User message content arrives as empty array - model cannot see user input
- #1034: Local Ollama tool calling either not calling or failing outright
- #3029: ollama tool calling issues
- #1068: Tool use with Ollama models
- #234: Tool Calling Issues with Open Source Models in OpenCode
- #4428: Why is opencode not working with local llms via Ollama?
- #729: Opencode in linux cannot access tools or create files
- #590: not being able to write files when using local models
Feel free to ignore if none of these address your specific case.
Did you see our docs on Ollama? Did you set num_ctx?
By default Ollama gives the agent only about 4k tokens of context, which is unusable for coding.
I do not see any docs at all for this project. Most GitHub repos have a docs directory that contains this kind of info.
For example, Ollama has a docs directory that explains how to set num_ctx when using the API: https://github.com/ollama/ollama/blob/a013693f804b69f6af00f79623751277147972c5/docs/api.md?plain=1#L403
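For reference, the linked docs pass num_ctx in the request `options` of the native API. A minimal request body for `/api/generate` would look something like this (model name and value are illustrative):

```json
{
  "model": "nemotron-3-nano:30b-cloud",
  "prompt": "Why is the sky blue?",
  "options": {
    "num_ctx": 32768
  }
}
```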
Docs link in readme: https://github.com/sst/opencode?tab=readme-ov-file#documentation
Ollama docs: https://opencode.ai/docs/providers/#ollama
In any case, your point about num_ctx can be solved in the API call. Maybe add a setting in the Ollama parameters for context size, as sketched below.
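To illustrate the idea (purely hypothetical; a `num_ctx` entry like this does not exist in opencode today, and the OpenAI-compatible endpoint may not forward it):

```json
"options": {
  "baseURL": "http://localhost:11434/v1",
  "num_ctx": 32768
}
```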
Previously they didn't allow setting it in their OpenAI-compatible endpoints or in the AI SDK provider they maintain. I'll have to see if this was fixed; there used to be some blockers to doing this simply.
@rekram1-node FYI, I am currently using the VS Code chatgpt-copilot extension for local models, as opencode is not working locally. Initially that extension was also broken, but updating the Ollama provider to v2 and setting the base URL to localhost:11434/api got it working. See https://github.com/feiskyer/chatgpt-copilot/pull/642
I'll check this out, ty. @minger0 what package are you using? I think there are at least 3 packages.
@rekram1-node it is in the PR, see https://github.com/feiskyer/chatgpt-copilot/pull/642/files
Sweet, ty. I'll see if we can use that here easily.
btw loving this project!