
[Bug] JSON Parse Error with Zhipu GLM-4.7: Stream chunks are concatenated incorrectly

Open yangshenlong opened this issue 1 month ago • 7 comments

Description

I encountered a JSON Parse Error when using the Zhipu GLM-4.7 model. It appears that the stream parser fails to handle the boundary between two data chunks correctly, producing invalid JSON.

The Error: The error message shows that the end of the previous message and the start of the next data: chunk are stuck together without proper separation.

Error Log / Evidence:

AI_JSONParseError: JSON parsing failed: Text: ... "role":"assistant"data: {"id": ... Error message: JSON Parse error: Expected '}'

As seen in the log, the string "role":"assistant" is immediately followed by data:, missing the closing brace } and likely a newline character.
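The failure mode described above is consistent with a consumer that parses network chunks directly instead of buffering until a complete `data:` event (delimited by a blank line) is available. The sketch below is a hypothetical illustration of boundary-safe SSE buffering, not opencode's actual parser:

```typescript
// Boundary-safe SSE parsing sketch: buffer raw chunks and only JSON.parse
// complete `data:` events, which are delimited by a blank line ("\n\n").
// Anything after the last separator stays buffered until more data arrives.
function createSseParser(onEvent: (payload: unknown) => void) {
  let buffer = "";
  return (chunk: string) => {
    buffer += chunk;
    let sep: number;
    while ((sep = buffer.indexOf("\n\n")) !== -1) {
      const event = buffer.slice(0, sep);
      buffer = buffer.slice(sep + 2);
      for (const line of event.split("\n")) {
        if (line.startsWith("data: ") && line !== "data: [DONE]") {
          onEvent(JSON.parse(line.slice(6)));
        }
      }
    }
  };
}

const events: unknown[] = [];
const push = createSseParser((e) => events.push(e));
// Simulate a chunk boundary landing mid-JSON, as the log suggests:
push('data: {"choices":[{"delta":{"role":"assist');
push('ant"}}]}\n\ndata: {"choices":[{"delta":{"content":"hi"}}]}\n\n');
```

With this buffering, the split object is reassembled before parsing, and both events come out intact.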

Environment:

Model: Zhipu GLM-4.7 (configured via zhipu/glm-4.7)

Plugin: OMO / OpenCode

(screenshot attached)

Plugins

Z.ai/GLM-4.7; OhMyOpencode

OpenCode version

1.0.152

Steps to reproduce

  1. Configure the OMO plugin to use Zhipu GLM-4.7 as the default model for the 'Sisyphus' agent (or set it in the model configuration).
  2. Open the OMO interface and select the 'Sisyphus' mode.
  3. Input a task to trigger the agent (e.g., "Analyze this project" or "Write a simple snake game").
  4. Wait for the model to start streaming the response.
  5. The AI_JSONParseError occurs immediately when the model returns a response, interrupting the stream.

Screenshot and/or share link

No response

Operating System

Windows 11

Terminal

Windows Terminal in Trae

yangshenlong avatar Jan 10 '26 19:01 yangshenlong

This issue might be a duplicate of existing issues. Please check:

  • #2188: JSON parsing failed: Text - shows the same JSON parsing error pattern with incomplete JSON (Expected '}')
  • #5890: Incomplete JSON when writing out files - reports similar incomplete JSON issues when handling model responses
  • #2840: Empty SSE Events Leak Through Opencode Stream Processor - related to SSE stream handling and data concatenation
  • #3596: SSE Stream Bug: Out-of-Order thinking_delta via LiteLLM → AWS Bedrock - related to stream chunk ordering issues

Feel free to ignore if none of these address your specific case.

github-actions[bot] avatar Jan 10 '26 19:01 github-actions[bot]

I think this is the plugin failing to send back valid stuff when proxied through it, perhaps?

rekram1-node avatar Jan 11 '26 03:01 rekram1-node

I think this is the plugin failing to send back valid stuff when proxied through it, perhaps?

Thanks for the response!

To help you narrow down the issue, I'd like to clarify my setup details. I think this might be a specific parsing issue rather than a general network failure:

  1. Custom API Key: I am using my own API Key (configured in the zhipu provider), NOT the free built-in service provided by OpenCode. So the request should be going directly from my client to Zhipu.
  2. Proxy Environment: I am indeed using a local network proxy (VPN).
  3. Control Group (Crucial): Under the exact same proxy and custom API key settings, the Minimax model works perfectly fine. The connection is stable and the response is parsed correctly.

The JSON Parse Error only happens with GLM-4.7. This makes me suspect that the issue is specific to how the plugin handles GLM-4.7's data stream (perhaps the plugin isn't handling concatenated JSON chunks/sticky packets correctly), while Minimax's stream format is handled fine.

(P.S. English is not my first language, so please excuse any unnatural phrasing.)
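The "sticky packet" suspicion can be reproduced in isolation: if each raw chunk is parsed as if it were a complete `data:` event, a boundary that glues two events together fails with exactly the shape of error in the log. This is a hypothetical reproduction, independent of the plugin:

```typescript
// Naive per-chunk parsing (the suspected bug): each chunk is treated as a
// complete `data:` event and parsed immediately. A chunk that contains the
// tail of one object fused with the next `data:` line fails to parse,
// matching the "Expected '}'" error in the issue report.
function naiveParse(chunk: string): { ok: boolean; error?: string } {
  const payload = chunk.replace(/^data: /, "");
  try {
    JSON.parse(payload);
    return { ok: true };
  } catch (err) {
    return { ok: false, error: (err as Error).message };
  }
}

// Mirrors the logged text: `"role":"assistant"` fused with the next `data:`
const brokenChunk = 'data: {"role":"assistant"data: {"id":"1"}';
const result = naiveParse(brokenChunk);
```

A well-framed chunk parses fine through the same function, which is why the bug only surfaces with providers (or proxies) whose chunk boundaries happen to split events.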

yangshenlong avatar Jan 11 '26 04:01 yangshenlong

Try again with the latest version, 1.1.13; 1.0.152 is an old version.

DarkPocket avatar Jan 12 '26 03:01 DarkPocket

Oh, guys, make sure you are on the latest version; I think that's the issue here.

rekram1-node avatar Jan 12 '26 03:01 rekram1-node

I am currently on version 1.1.13. When using it with Oh My Opencode and connecting via /connect with OpenAI OAuth, the GPT-series models frequently run into the situation below.

(screenshot attached)

APE-147 avatar Jan 12 '26 13:01 APE-147

Are you running through some proxy or something that isn't handling streams correctly?

rekram1-node avatar Jan 12 '26 23:01 rekram1-node

After the version update, I uninstalled the opencode-openai-codex-auth plugin and switched to using /connect for OAuth authentication, which should not involve a proxy in between.

APE-147 avatar Jan 13 '26 04:01 APE-147

which should not involve a proxy in between.

I meant network proxies

rekram1-node avatar Jan 14 '26 16:01 rekram1-node

Thank you for your patient reply. On my machine the system proxy is not enabled (`scutil --proxy` returns an empty configuration), and I am connecting directly via /connect OAuth, so there should be no remote reverse proxy in between.

APE-147 avatar Jan 15 '26 03:01 APE-147