
Support per rule model fallbacks for outages and credit depletion

Open esafak opened this issue 5 months ago • 9 comments

Support the specification of multiple models per mode, in order of preference like so:

"mode": {
    "build": { "models": ["google/gemini-2.5-pro", "openai/o4-mini"] }
}

esafak avatar Jul 24 '25 00:07 esafak

I'm also quite interested in this, though I have nothing to add beyond that, unfortunately. The idea of being able to burn one subscription down to the end and then fall back to another is quite appealing; it would help me make the most of my limits.

Sewer56 avatar Sep 30 '25 16:09 Sewer56

This would be very useful, I use OpenCode with subscriptions to GitHub Copilot, Claude Code, and GLM. When I hit the usage limits on any of them mid-task, the workflow gets interrupted. Having fallback models would make the tool more reliable and efficient.

OscSer avatar Jan 02 '26 20:01 OscSer

this would definitely be a game changer, especially with recent (un)developments re: Claude and their gatekeeping

smashah avatar Jan 11 '26 03:01 smashah

Another use case for this feature would be using the same model from different providers. For example, zAI's GLM-4.7 is offered by Cerebras at insane speeds, but the tokens-per-minute limits are exhausted pretty quickly. If we could use Cerebras by default, fall back to Zen's GLM-4.7 when the limit is hit, and then return to Cerebras once the minute resets, the experience would be extremely smooth.
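Under the syntax proposed at the top of the thread, this could look something like the following (the provider prefixes here are illustrative, not confirmed provider IDs):

```json
"mode": {
    "build": {
        "models": ["cerebras/glm-4.7", "zai/glm-4.7"]
    }
}
```

The order of the array would encode the preference: try Cerebras first, and only fall back to the second entry when it is rate-limited.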

pdobrev avatar Jan 11 '26 08:01 pdobrev

Very useful feature

manascb1344 avatar Jan 14 '26 04:01 manascb1344

Hey! I’m looking into implementing this. I’ve poked around the repo and the plan seems pretty straightforward:

  1. Update PermissionObject in config.ts so models can take an array.
  2. In processor.ts, wrap the LLM calls in a loop that catches 429s or credit depletion errors.
  3. If a model hits a wall, it just cycles to the next one in the list and retries the request.

I'm ready to start on a PR for this if the approach looks good to you guys.

manascb1344 avatar Jan 14 '26 04:01 manascb1344

Looks sensible to me, @manascb1344. It's more or less what I'd expect.

Sewer56 avatar Jan 14 '26 20:01 Sewer56

+1 really need this!

elijahdev0 avatar Jan 15 '26 19:01 elijahdev0

I've made the PR for this!

manascb1344 avatar Jan 16 '26 10:01 manascb1344