Ollama custom provider error: agent configured model is not valid
Description
Any agent using a custom Ollama model raises the message: "agent configured model is not valid"
If I use the agent anyway despite the warning, the following error is displayed:
ProviderModelNotFoundError: ProviderModelNotFoundError
Subagent calls / tasks are never executed, so the whole multi-agent system is broken.
It was previously working with the exact same config.
Please help :)
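For context, the Ollama provider is declared as a custom OpenAI-compatible provider in opencode.json, roughly along the lines of the example from the opencode docs. This is only a sketch of the kind of setup involved: the model ID mirrors the one my agents reference, while the other values (provider name, baseURL, display name) are placeholders that may differ from my actual config.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "gpt-oss:20b-opencode": {
          "name": "gpt-oss 20b (opencode)"
        }
      }
    }
  }
}
```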
Plugins
"mohak34/opencode-notifier@latest", "opencode-ignore", "opencode-beads", "@franlol/[email protected]", "[email protected]", "[email protected]", "@tarquinen/opencode-dcp@latest"
OpenCode version
1.1.23
Steps to reproduce
- open opencode
- select (with Tab) an agent whose model comes from the custom Ollama provider
Screenshot and/or share link
No response
Operating System
manjaro linux
Terminal
kitty
This issue might be a duplicate of existing issues. Please check:
- #7083: Using local Ollama models doesnt return any results (similar Ollama model configuration issues)
- #5623: Custom subagents with explicit model config throw ProviderModelNotFoundError (agent model configuration validation)
- #7958: ProviderModelNotFoundError for custom provider in GitHub Action (custom provider model validation failures)
- #5674: Custom OpenAI-compatible provider options not being passed to API calls (custom provider configuration issues)
- #6691: ProviderModelNotFoundError: minimax/MiniMax-M2.1 (similar model validation errors in different context)
Feel free to ignore if none of these address your specific case.
Yes, probably related to them. It seems to be a problem with a wide range of side effects. Still not working in 1.1.125.
OK, problem solved.
Since (probably) version 1.1.123, this line in an agent's markdown config:
`model: ollama/gpt-oss:20b-opencode`
no longer works. Double quotes are now required, so replacing it with
`model: "ollama/gpt-oss:20b-opencode"`
fixed the problem for me :)
I had to do this in every agent markdown file.
Hope it helps.
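For anyone else hitting this, here is roughly what one of the agent markdown files looks like after the fix. Only the quoted `model` line is the actual fix; the other frontmatter fields and the prompt body are illustrative placeholders, not my real agent.

```markdown
---
description: Reviews diffs and suggests fixes    # illustrative field
mode: subagent                                   # illustrative field
model: "ollama/gpt-oss:20b-opencode"             # the fix: value wrapped in double quotes
---

You are a code-review subagent. (Placeholder system prompt.)
```

My guess is that the unquoted form now trips over the second colon in `ollama/gpt-oss:20b-opencode`; quoting forces the whole value to be read as a single string.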