
Local Ollama models are not agentic

Open iplayfast opened this issue 1 month ago • 11 comments

Description

If I use opencode with BigPickle, it works. If I use it with any Ollama model, whether local or cloud, it is not able to see files. I know that Ollama supports tool calling, and I have even programmed it myself, so there must be a bug in opencode's Ollama integration. This is on Linux.

OpenCode version

1.0.164

Steps to reproduce

  1. Install ollama

  2. ollama run nemotron-3-nano:30b-cloud

  3. Edit ~/.opencode/opencode.json and add:

         "ollama": {
           "npm": "@ai-sdk/openai-compatible",
           "name": "Ollama (local)",
           "options": { "baseURL": "http://localhost:11434/v1" },
           "models": {
             "nemotron-3-nano:30b-cloud": {
               "name": "nemotron-3-nano-cloud"
             }
           }
         }

  4. Run it and have it write a hello world program, or look at an existing one. It can't do it. It should be able to.
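For reference, here is what the provider block above might look like in a complete config file. The surrounding `provider` and `$schema` keys follow the shape shown in opencode's provider docs; treat the exact top-level layout as an assumption to check against the current docs:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": { "baseURL": "http://localhost:11434/v1" },
      "models": {
        "nemotron-3-nano:30b-cloud": {
          "name": "nemotron-3-nano-cloud"
        }
      }
    }
  }
}
```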

Screenshot and/or share link

No response

Operating System

No response

Terminal

No response

iplayfast, Dec 17 '25 15:12

This issue might be a duplicate of existing issues. Please check:

  • #5187: Ollama: User message content arrives as empty array - model cannot see user input
  • #1034: Local Ollama tool calling either not calling or failing outright
  • #3029: ollama tool calling issues
  • #1068: Tool use with Ollama models
  • #234: Tool Calling Issues with Open Source Models in OpenCode
  • #4428: Why is opencode not working with local llms via Ollama?
  • #729: Opencode in linux cannot access tools or create files
  • #590: not being able to write files when using local models

Feel free to ignore if none of these address your specific case.

github-actions[bot], Dec 17 '25 15:12

Did you see our docs on Ollama? Did you set num_ctx?

By default, Ollama gives the agent only around a 4k-token context, which is unusable for coding.

rekram1-node, Dec 17 '25 15:12

I do not see any docs at all for this project. Most GitHub repos have a docs directory that contains this info.
For example, Ollama has a docs directory that explains how to set num_ctx when using the API: https://github.com/ollama/ollama/blob/a013693f804b69f6af00f79623751277147972c5/docs/api.md?plain=1#L403
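As the linked Ollama API docs describe, num_ctx can be passed per-request in the `options` object of the native API. A minimal request body for `POST /api/generate` might look like this (the model name mirrors the one from the repro steps, and 8192 is just an illustrative value):

```json
{
  "model": "nemotron-3-nano:30b-cloud",
  "prompt": "Write a hello world program in Python.",
  "options": {
    "num_ctx": 8192
  }
}
```

Note this is the native Ollama endpoint; the OpenAI-compatible /v1 endpoint that opencode's baseURL points at does not accept this `options` field, which is the sticking point discussed in this thread.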

iplayfast, Dec 17 '25 18:12

Docs link in readme: https://github.com/sst/opencode?tab=readme-ov-file#documentation

Ollama docs: https://opencode.ai/docs/providers/#ollama

rekram1-node, Dec 17 '25 19:12

In any case, your point about num_ctx can be addressed in the API call. Maybe add a setting in the Ollama provider options for context size.

iplayfast, Dec 17 '25 19:12

Previously Ollama didn't allow setting it through their OpenAI-compatible endpoints or in the AI SDK provider they maintain. I'll have to see if this was fixed; there used to be some blockers to doing this simply.

rekram1-node, Dec 17 '25 20:12

@rekram1-node FYI, I am currently using the vscode chatgpt-copilot extension for local models, since opencode is not working locally. Initially that extension was also broken, but updating the Ollama provider to v2 and pointing it at localhost:11434/api got it working. See https://github.com/feiskyer/chatgpt-copilot/pull/642

minger0, Dec 17 '25 20:12

I'll check this out, thanks. @minger0, what package are you using? I think there are at least three packages.

rekram1-node, Dec 17 '25 20:12

@rekram1-node it's in the PR; see https://github.com/feiskyer/chatgpt-copilot/pull/642/files

minger0, Dec 17 '25 20:12

Sweet, thanks. I'll see if we can use it here easily.

rekram1-node, Dec 17 '25 21:12

btw loving this project!

iplayfast, Dec 20 '25 20:12