
[BUG] Context Loss Bug in v2.1.3/2.1.4

Open konian71 opened this issue 1 month ago • 2 comments

Preflight Checklist

  • [x] I have searched existing issues and this hasn't been reported yet
  • [x] This is a single bug report (please file separate reports for different bugs)
  • [x] I am using the latest version of Claude Code

What's Wrong?

After a few prompts, Claude Code loses ALL context data. All memory files, agents, and rules show 0 tokens in /context output, causing API requests to fail.

What Should Happen?

Context should remain loaded during entire session. Memory files, agents, and rules should show their actual token counts (typically hundreds to thousands of tokens each, not 0).

Error Messages/Logs

/context output showing complete context loss:

❯ /context
  ⎿   Context Usage
     ⛀ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   claude-opus-4-5-20251101 · 0k/200k tokens (0%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛁ Skills: 53 tokens (0.0%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛶ Free space: 155k (77.5%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛝ Autocompact buffer: 45.0k tokens (22.5%)
     
     Custom agents · /agents
     Project
     └ mariadb: 0 tokens
     └ mailer: 0 tokens
     └ javascript: 0 tokens
     └ model: 0 tokens
     └ browser: 0 tokens
     └ coder: 0 tokens
     └ docs: 0 tokens
     └ critical: 0 tokens
     └ views: 0 tokens
     └ style: 0 tokens
     └ analysis: 0 tokens
     
     Memory files · /memory
     └ /srv/CLAUDE.md: 0 tokens
     └ CLAUDE.md: 0 tokens
     └ .claude/rules/architecture.md: 0 tokens
     └ .claude/rules/basics.md: 0 tokens
     └ .claude/rules/css.md: 0 tokens
     └ .claude/rules/projektleiter.md: 0 tokens
     
     Skills · /skills
     Project
     └ todo: 29 tokens
     └ doit: 13 tokens
     └ status: 11 tokens

All memory files, agents, and rules show 0 tokens - complete context data loss.
API requests fail because no context is loaded.

Steps to Reproduce

  1. Use Claude Code 2.1.4 normally
  2. Wait until API requests start failing
  3. Run /context
  4. Observe: All memory files and agents show 0 tokens

Claude Model

Opus

Is this a regression?

Yes, this worked in a previous version

Last Working Version

2.0.58

Claude Code Version

2.1.4

Platform

Anthropic API

Operating System

Ubuntu/Debian Linux

Terminal/Shell

Other

Additional Information

Workaround: /clear temporarily fixes the problem by reloading context, but only for a few more prompts.

Permanent Fix: Downgrading to version 2.0.58 resolves the issue completely. Related to Issue #17453 (lag problem in 2.1.3); both issues are resolved by downgrading to 2.0.58.

konian71 avatar Jan 11 '26 21:01 konian71

Found 2 possible duplicate issues:

  1. https://github.com/anthropics/claude-code/issues/17453
  2. https://github.com/anthropics/claude-code/issues/15076

This issue will be automatically closed as a duplicate in 3 days.

  • If your issue is a duplicate, please close it and 👍 the existing issue instead
  • To prevent auto-closure, add a comment or 👎 this comment

🤖 Generated with Claude Code

github-actions[bot] avatar Jan 11 '26 22:01 github-actions[bot]

This is NOT a duplicate!

vs #17453:

  • #17453: Terminal UI becomes slow/laggy during typing and execution
  • This issue: Complete memory corruption - ALL context data shows 0 tokens
  • Different failure modes entirely

vs #15076: the API returns 500 errors after a cumulative context threshold - that is a SERVER-side error

This issue: LOCAL memory corruption, where Claude Code loses all context data in memory BEFORE making API calls

Evidence:

  • #15076: API error 500, request_id: null (server fails)
  • This: /context shows 0 tokens locally (client memory loss)
  • Different symptoms, different failure points, different versions

#15076 is an API/server issue. This is a Claude Code client memory bug.

This bug is unique: Complete context data loss in Claude Code's memory management. /context shows ALL files with 0 tokens:

  • All CLAUDE.md files: 0 tokens
  • All agents: 0 tokens
  • All rules: 0 tokens

This is memory corruption in version 2.1.3/2.1.4, NOT performance degradation or API errors.

Please keep open - this needs separate investigation.

konian71 avatar Jan 11 '26 22:01 konian71

Hey @konian71 The same still happens in 2.1.5. One small detail: I never overflow the context -> there is no need for compaction, because I am not even reaching 150k/200k when my issue appears. The rest looks similar, so I would keep your issue as the root one.

My personal input for the bug context:

What's Wrong?

During normal usage, after sending a message (especially with higher token counts, ~70-80k+), Claude Code displays a "connection failed" error suggesting internet issues (the internet is working fine). After pressing Esc to cancel, running /context shows the token count dropped from ~79k to ~2k - the entire conversation context is lost.

Debug logs show:

  1. Error streaming, falling back to non-streaming mode: Connection error. occurs
  2. Suspicious Summarizing all X messages (~0 tokens) entries appear - suggesting context is already empty when summarization triggers
  3. After user aborts with Esc: Error in non-streaming fallback: Request was aborted.
  4. Context is wiped instead of preserved

What Should Happen?

When a connection error occurs and the user cancels, the existing conversation context should be preserved. The error recovery path should not wipe the context.
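
The failure sequence in the logs (context already at ~0 tokens when summarization runs, then wiped after the aborted fallback) points at the error-recovery path mutating the message history without restoring it. A minimal sketch of what "should not wipe the context" could mean in code - all names here (ConversationContext, sendWithFallback) are hypothetical and illustrative, NOT Claude Code's actual internals:

```typescript
// Hypothetical sketch of an abort-safe fallback path.
type Message = { role: "user" | "assistant"; content: string };

class ConversationContext {
  constructor(public messages: Message[] = []) {}
}

// Attempt a request; if it fails (connection error, user abort),
// restore the pre-request snapshot instead of leaving the context
// cleared for a later summarization pass to find empty.
async function sendWithFallback(
  ctx: ConversationContext,
  send: (msgs: Message[]) => Promise<string>,
): Promise<string | null> {
  const snapshot = [...ctx.messages]; // snapshot BEFORE the attempt
  try {
    return await send(ctx.messages);
  } catch {
    ctx.messages = snapshot; // abort/error must not wipe history
    return null;
  }
}
```

With this pattern, even a request handler that clears the history mid-flight before throwing leaves the conversation intact after the error is handled.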

Error Messages/Logs

Example from debug log (ffd1311b-d886-4026-9784-427e6076a6b9.txt):
  2026-01-11T17:18:59.766Z [DEBUG] Summarizing all 5 messages (~0 tokens)
  2026-01-11T17:19:00.439Z [ERROR] Error streaming, falling back to non-streaming mode: Connection error.
  2026-01-11T17:19:02.262Z [DEBUG] Summarizing all 6 messages (~0 tokens)
  2026-01-11T17:19:28.064Z [ERROR] Error streaming, falling back to non-streaming mode: Connection error.
  2026-01-11T17:19:42.879Z [DEBUG] Streaming aborted by user: Request was aborted.
  2026-01-11T17:19:42.879Z [ERROR] Error in non-streaming fallback: Request was aborted.
❯ /context 
  ⎿   Context Usage
     ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁   claude-opus-4-5-20251101 · 86k/200k tokens (43%)
     ⛀ ⛀ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁   ⛁ System prompt: 2.9k tokens (1.4%)
     ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁   ⛁ System tools: 16.5k tokens (8.3%)
     ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁ ⛁   ⛁ MCP tools: 2.1k tokens (1.1%)
     ⛁ ⛁ ⛁ ⛀ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛁ Custom agents: 911 tokens (0.5%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛁ Memory files: 373 tokens (0.2%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛁ Skills: 1.9k tokens (1.0%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛝ ⛝ ⛝   ⛁ Messages: 61.3k tokens (30.7%)
     ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝   ⛶ Free space: 69k (34.5%)
     ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝   ⛝ Autocompact buffer: 45.0k tokens (22.5%)

❯ HERE I had a message of ~35k tokens. (200 > 35 + 86, so the limit was nowhere near reached!)



❯ /context 
  ⎿   Context Usage
     ⛁ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   claude-opus-4-5-20251101 · 2k/200k tokens (1%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛁ Skills: 1.9k tokens (1.0%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛶ Free space: 153k (76.5%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶   ⛝ Autocompact buffer: 45.0k tokens (22.5%)
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶
     ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛶ ⛝ ⛝ ⛝
     ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝
     ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝ ⛝

     MCP tools · /mcp
     └ mcp__voicemode__converse: 0 tokens
     └ mcp__voicemode__service: 0 tokens
     └ mcp__hetzner__exec: 0 tokens
     └ mcp__hetzner__sudo-exec: 0 tokens

     Custom agents · /agents

     User
     └ docsmaster: 0 tokens
     └ session-developer: 0 tokens

     Memory files · /memory
     └ ~/.claude/CLAUDE.md: 0 tokens


❯ hey, you here ?
● Yes, I'm here. Context was just compacted.

  Looking at that deployment log you shared - it actually succeeded. The log shows:
  - Build completed successfully
  - All containers started
  - "New container started" message at the end

  The "Deployment is Failed" header might be stale UI or a different deployment. The es0g0o00oco8cg0w8oo848cw deployment we triggered earlier finished successfully.

  What would you like to do next?

❯ what are we doing?
  ⎿  Request timed out. Check your internet connection and proxy settings
     Retrying in 9 seconds… (attempt 5/10)

Steps to Reproduce

  1. Have a conversation until context reaches ~70-80k tokens
  2. Send another message (just a normal flow, nothing specific or complex - normal everyday prompts)
  3. Wait for "connection failed" error to appear
  4. Press Esc to cancel
  5. Run /context - observe token count dropped to ~2k

Note: Intermittent - seems related to WSL2 networking or API timeouts

Claude Model

Opus

Is this a regression?

I don't know

Last Working Version

I don't know when EXACTLY this started. 1 month ago is the safest bet, but maybe even in January.

Claude Code Version

2.1.5

Platform

Anthropic API

Operating System

Other Linux

Terminal/Shell

WSL (Windows Subsystem for Linux)

Additional Information

  • Issue has been occurring for several days
  • Autocompact never triggers normally before this happens
  • The "0 tokens" summarization attempts suggest context is cleared during error recovery before the user sees it

KindEmily avatar Jan 12 '26 17:01 KindEmily

Hi @KindEmily

Thanks for the detailed debug logs and for confirming this is still happening in 2.1.5!

Your observation about the "Summarizing all X messages (~0 tokens)" entries is really interesting - it proves the context is already wiped BEFORE the summarization even runs. That's a key detail for the devs.

I downgraded to 2.0.58 and have been running stable ever since - no more context loss, no more random connection errors. Not ideal, but at least a working workaround until Anthropic fixes this.

Which version are you currently using? Did you also downgrade or are you still on 2.1.5?

konian71 avatar Jan 17 '26 14:01 konian71

Update: v2.1.12 seems to have fixed this issue. Can confirm it's working stably now.

konian71 avatar Jan 18 '26 15:01 konian71

This issue has been automatically locked since it was closed and has not had any activity for 7 days. If you're experiencing a similar issue, please file a new issue and reference this one if it's relevant.

github-actions[bot] avatar Jan 27 '26 14:01 github-actions[bot]