[CRITICAL] Context auto-compact stuck at 0% - Tool Unusable
Environment
- Claude Code Version (claude -v): 1.0.71 (Claude Code)
- OS: macOS 15.6
- Terminal: Warp
Bug Description
Context management completely broken - shows "Context left until auto-compact: 0%" persistently
Reproduction Steps
- Run claude in any project: the warning appears immediately
- /clear does not fix it
- Reinstalling does not fix it
- The tool is unusable
Impact
- Cannot use Claude Code at all
- Blocks all development work
- Multiple users affected (refs: #1168, #2283, #3375, #4660)
Attempted Fixes
- Cleared cache
- Reinstalled
- Used /clear command
- All failed
Expected Resolution
Need immediate hotfix
Found 3 possible duplicate issues:
- https://github.com/anthropics/claude-code/issues/1168
- https://github.com/anthropics/claude-code/issues/3375
- https://github.com/anthropics/claude-code/issues/2283
If your issue is a duplicate, please close it and 👍 the existing issue instead.
This issue will be automatically closed as a duplicate in 3 days if there are no additional comments. To prevent auto-closure, please 👎 this comment.
🤖 Generated with Claude Code
I started working on a smaller codebase and almost immediately got hit with "Context low (9% remaining) · Run /compact to compact & continue"... I may have to look elsewhere if this is not fixed. Thanks!
https://cap.link/qwm6s7frg8ngnnn
The new microcompact feature did not help.
====================================================== Previous Conversation Compacted ======================================================
⏺ Compact summary (ctrl+r to expand) <- COMPACT HAPPENS
⎿ Read production-criteria/FAR-140-CHECKIFEXISTS-FIX.json (885 lines)
⎿ Read production-criteria/FAR-140-FIXED-WORKFLOW.json (885 lines)
⎿ Read production-criteria/FAR-140-CONNECTION-FIX.json (885 lines)
⎿ Read production-criteria/FAR-140-FINAL-FIX.json (885 lines)
⎿ Read production-criteria/FAR-140-COMPLETE-FIX.json (885 lines)
⎿ Todo list read (7 items)
⏺ I'll now run comprehensive tests on the fixed workflow to verify all scenarios work correctly.
⏺ n8n-mcp - n8n_get_workflow (MCP)(id: "FwGMjOB0rb6OWdAR")
⎿ {
"success": true,
"data": {
… +1018 lines (ctrl+r to expand)
✳ Compacting conversation… (84s · ↓ 1.9k tokens · esc to interrupt) **<- COMPACT AGAIN?**
10th Aug - compaction issues happened again, rendering it unusable.
Minutes after starting a new session, "Context left until auto-compact: 6%" appears.
Still happening - "Context left until auto-compact: 1%" just 2-3 minutes after starting a session.
Same issue, making Claude code currently unusable. I've gone as far as deleting ~/.claude with no change. Even on smaller code bases I get to 0% in minutes.
Did the compact tool break, or did someone reduce the context window itself? It is making the tool unusable. I am having issues even when a new session is started; it runs out of context way faster than it ever did. Were there backend context changes to the model?
This is Claude's response to this issue after doing research... pretty funny conclusion, reached by itself: "You're absolutely right - this is a pretty significant design oversight. The research really paints a stark picture:

The Compaction System Issues:

No user control whatsoever:
- Hardcoded 95% trigger threshold
- No disable option for auto-compact
- Black box algorithm with no transparency
- Users literally getting stuck in 'infinite compaction loops'

Broken failure modes:
- Context showing '169%' and compacting endlessly
- '102% context' loops that survive reinstallation
- Users spending '90%+ of time compacting' instead of working
- Tool becoming 'completely unusable'

Expert consensus is damning:
- 'This almost never goes well on any remotely complicated change and you should basically just completely avoid compaction'
- Community advice is literally 'avoid the core feature entirely'

What Should Have Been Done:
✅ Configurable thresholds (compact at 70%, 80%, 90%, etc.)
✅ Disable auto-compact option
✅ Algorithm selection (aggressive vs conservative)
✅ Smart file filtering during compaction
✅ Recovery mechanisms for compaction loops
✅ Context preservation settings

The Workaround Ecosystem: It's telling that the community has created:
- Multiple Docker isolation projects to avoid the issue
- Session management tools (claunch, etc.)
- File-based context systems to bypass conversation memory
- 'Avoid compaction entirely' best practices

When users are building elaborate workarounds to avoid a core feature, that's a clear signal the feature is fundamentally broken. The fact that paying customers on $200/month MAX plans are getting stuck in unusable compaction loops with zero recourse is... not great product design. You've identified a legitimate gap where basic user control and configurability should have been table stakes from day one."
Thanks for the feedback here. A couple of things I noticed from your pasted logs:
- Your MCP server seems to be returning a large response
- Post-compaction, it's reading a bunch of large files again. We have some file size checks here, but I'm going to check again to make sure they're working as intended.
⎿ Read production-criteria/FAR-140-CHECKIFEXISTS-FIX.json (885 lines)
⎿ Read production-criteria/FAR-140-FIXED-WORKFLOW.json (885 lines)
⎿ Read production-criteria/FAR-140-CONNECTION-FIX.json (885 lines)
⎿ Read production-criteria/FAR-140-FINAL-FIX.json (885 lines)
⎿ Read production-criteria/FAR-140-COMPLETE-FIX.json (885 lines)
It's hard to tell deterministically what's going on from this alone, however. Next time this happens, can you please submit a /bug report from Claude Code and send me the feedback ID so I can look at your transcript / logs? Please make sure it's post-compact, with the low-context warning showing.
I was hoping the 1M context would kick in. It did not. Stuck in a compaction loop as it stands.
@sid374 Hi Sid, glad to see someone from Anthropic is looking at it!

Thank you for your report!
Feedback ID: a00741c0-b96e-4499-aed5-dffa3a8734f3
Compaction kicks in again at close to 6%, just moments after another compaction. Had to switch to Codex.
https://share.cleanshot.com/zM9pCMX7 -> The irony of things. Compaction->Bug submitted->Compaction...
"Context left until auto-compact: 10%", moments after I started CC.
Using these settings has no effect:
- maxContextTokens: 200000 (increased from 150k)
- compactionThreshold: 0.95 (only compact at 95%)
- cleanupPeriodDays: 180 (6 months retention)
- maxConversationLength: 2000 (double the messages)
- aggressiveCompaction: false (prevents aggressive trimming)
- preserveCodeContext: true (keeps code discussions)
- retainSystemMessages: true (preserves system context)
- keepFileContents: false (drops large file contents to save space)
- prioritizeRecent: true (keeps recent messages over old ones)
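For reference, overrides like these would presumably live in Claude Code's settings file (e.g. ~/.claude/settings.json). Note that, as far as I can tell, only cleanupPeriodDays is a documented Claude Code setting; the other keys above appear to be guesses, which would explain why they have no effect. A sketch of the file as attempted:

```json
{
  "cleanupPeriodDays": 180,
  "maxContextTokens": 200000,
  "compactionThreshold": 0.95,
  "maxConversationLength": 2000,
  "aggressiveCompaction": false,
  "preserveCodeContext": true,
  "retainSystemMessages": true,
  "keepFileContents": false,
  "prioritizeRecent": true
}
```

Unrecognized keys are silently ignored rather than rejected, so a file like this loads without error while changing nothing about compaction behavior.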
Having this issue as well. I have completely removed Claude code and reinstalled. This happens with and without MCPs. This renders cc completely useless.
Had to use Codex, which sucks, but at least I am not greeted by death by compaction. Hope the 1M context will come in soon. @sid374 any chance of fixes? This affects a project I am working on.
https://streamable.com/6g1u7w My experience
@sid374 please could you raise this as critical? Lots of folks are having the same issues. I have to keep compacting and /clear does not help at all. It's EXTREMELY PAINFUL.
opencode is working, hooked up to my Claude Pro Max plan! https://opencode.ai/ @brandontan
- Reduced git status from 224 → 143 lines (36% reduction)
- Removed 81 problematic files from untracked status
- Archived test artifacts that were consuming context
- Updated .gitignore to prevent future accumulation

Will see whether this helps... I doubt it, as I tried Claude Code in a fresh repo and it happened as well.
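As a rough illustration of the .gitignore step described above (the directory and file names here are hypothetical, not from the actual project), ignoring an artifact directory immediately shrinks the untracked-file noise in git status:

```shell
#!/bin/sh
# Throwaway repo demonstrating the .gitignore cleanup sketched above.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q

# Simulate accumulated test artifacts cluttering the untracked list
mkdir -p artifacts
for i in 1 2 3 4 5; do echo "run output" > "artifacts/run-$i.log"; done
before=$(git status --porcelain --untracked-files=all | wc -l | tr -d ' ')

# Ignore the artifact directory so it stops showing up in git status
# (and stops being offered to the agent as context)
echo 'artifacts/' > .gitignore
after=$(git status --porcelain --untracked-files=all | wc -l | tr -d ' ')

echo "untracked entries: $before -> $after"
```

Here the five simulated artifact files drop out of the untracked list, leaving only the new .gitignore itself.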
No go. Context low (11% remaining) · Run /compact to compact & continue hit right after.
Claude in Opencode is "dumber"
Context left until auto-compact: 0% remains a major issue and has a strong impact on my project deliverables
Thanks for the report and for bearing with us @brandontan. I've made some fixes and this should feel much better in tomorrow's release. Please keep an eye out and comment here if you're still running into this
no change @sid374 Context left until auto-compact: 1% ... after a couple of chats. It came back after a few chats. What are the chances of getting 1M tokens into CC?
Thank you for your report!
Feedback ID: 6fba591c-18fc-4348-9ac9-cc957b2bc579
@brandontan this will be available in tomorrow's release (not released yet)
@sid374 Thanks a lot. Your guys' work does affect livelihoods.
Thanks @sid374! Will monitor .85 before I close it.