opencode run hangs forever on API errors (breaks CLI/automation integrations)
Bug Description
When opencode run encounters an API error (e.g., a 429 rate limit), it logs the error but never exits; it hangs indefinitely. This breaks any CLI tool or automation system that spawns opencode as a subprocess.
Reproduction
# With an expired/rate-limited account:
timeout 30 opencode run "what is 1+1"
# Result: hangs for 30 seconds, exit code 124 (timeout)
Expected Behavior
opencode should fail fast with a non-zero exit code and a clear error message:
Error: API rate limited (429). Please check your subscription status.
Exit code: 1
Actual Behavior
opencode logs the error internally but continues running forever:
ERROR service=llm error={"statusCode":429,...}
# Then hangs indefinitely - no exit, no output to user
Impact
This breaks integrations such as zeroshot, which spawn CLI tools as subprocesses:
const { exec } = require('child_process');

// zeroshot spawns opencode
exec('opencode run "task"', (error, stdout) => {
  // This callback NEVER fires because opencode never exits
});
Workaround: We've added timeouts to our subprocess calls, but the real fix should be in opencode.
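For reference, a minimal sketch of that workaround using Node's child_process timeout options (the 30s budget is an arbitrary value we picked, not anything opencode defines):

const { exec } = require('child_process');

// Kill the child ourselves if opencode never exits on its own.
exec('opencode run "task"', { timeout: 30_000, killSignal: 'SIGKILL' }, (error, stdout, stderr) => {
  if (error) {
    // error.killed is set when our timeout fired and the process was killed.
    console.error(error.killed ? 'opencode timed out (likely hung)' : `opencode failed: ${error.message}`);
    return;
  }
  console.log(stdout);
});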
Environment
- opencode version: 1.1.16
- OS: Linux (Ubuntu 22.04)
- Auth: OpenAI OAuth
Suggested Fix
In the error-handling path, ensure the process exits with a non-zero code:
if (response.status === 429) {
  console.error('Error: API rate limited. Check subscription status.');
  process.exit(1); // <-- Currently missing
}
Or more generally: any unrecoverable API error should cause opencode run to exit, not hang.
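To sketch the general shape (isUnrecoverable and runPrompt are hypothetical names for illustration, not opencode's actual internals):

// Treat auth/quota/rate-limit errors as fatal for `opencode run`.
function isUnrecoverable(err) {
  return [401, 403, 429].includes(err?.statusCode);
}

async function runOnce(prompt) {
  try {
    await runPrompt(prompt); // placeholder for opencode's run pipeline
  } catch (err) {
    if (isUnrecoverable(err)) {
      console.error(`Error: API request failed (${err.statusCode}). Check your auth/subscription status.`);
      process.exit(1); // fail fast so callers see a non-zero exit code
    }
    throw err; // transient errors can keep whatever retry logic exists today
  }
}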
This issue might be a duplicate of existing issues. Please check:
- #4506: opencode run hangs when encountering certain errors instead of exiting (specifically mentions CI context and hanging without proper error exit)
- #5888: Opencode Hangs when used as CLI tool (hangs forever when requests fail, no clear error messages)
- #3525: Quota Limit Exceeded Error is not handled properly (API errors like rate limits causing the app to hang instead of exiting)
Feel free to ignore if none of these address your specific case.
A variant of this problem: when the LLM messes up an MCP tool call, opencode likewise doesn't return control properly.