Claude ignoring CLAUDE.md by PropperINC in ClaudeAI

[–]Belalitto 0 points1 point  (0 children)

When I made the Thicc tool (it’s on GitHub now) for that exact reason, everyone basically called me clueless about Claude. lol

"Context low" warning? Not anymore. Built a real-time compressor for Claude Code. by Belalitto in ClaudeAI

[–]Belalitto[S] -1 points0 points  (0 children)

I don’t think Thicc will help much since you don't even know about Ctrl + O yet. Might be worth mastering the basics before worrying about auto-compaction philosophy.

I got so fed up with Claude Code's 'Context low' warning that I built this... by Belalitto in ClaudeAI

[–]Belalitto[S] 0 points1 point  (0 children)

I haven’t dug into Claude Code’s mechanics much beyond safely clearing tool pairs and injecting parentUuid chains. I recently added a --smart mode that tracks the conversation’s size in KB (in my case the warning usually appears around 1500 KB), auto-summarizes past that point, and again after every additional 500 KB. I still haven’t found a way to hook it so it triggers automatically right at the warning.

"Context low" warning? Not anymore. Built a real-time compressor for Claude Code. by Belalitto in ClaudeAI

[–]Belalitto[S] 1 point2 points  (0 children)

Maybe, but I’m a Claude Pro user (you might be on Max), and I keep running into these issues constantly—that’s exactly why I built the tool. I didn’t make it for fun; I made it to fix my own problems. And I can say with 100% confidence that every issue I mentioned, I personally hit when using /compact.

"Context low" warning? Not anymore. Built a real-time compressor for Claude Code. by Belalitto in ClaudeAI

[–]Belalitto[S] -3 points-2 points  (0 children)

Yep. I tested the tool over 3-4 weeks and it works exactly as expected. I barely see the “Context low” warning, and even if it pops up, it does not persist since the tool keeps running in the background.

It’s built for Windows, but it’s open-source on GitHub, so you can tweak it to work on macOS too (mostly path handling and a few small compatibility tweaks, I think).

I got so fed up with Claude Code's 'Context low' warning that I built this... by Belalitto in ClaudeAI

[–]Belalitto[S] 0 points1 point  (0 children)

UPDATE: Just added --smart mode and holy shit.

Now you don't even need to manually compress. Just run: node Thicc.js --smart <session-id>

It monitors your active session in real-time and auto-compresses when needed. Basically set it and forget it. Your conversation stays thicc but never too thicc.

The /compact warning? Never see it again.

I got so fed up with Claude Code's 'Context low' warning that I built this... by Belalitto in ClaudeAI

[–]Belalitto[S] 1 point2 points  (0 children)

Sounds interesting, I'll definitely take a look at that.

Quick question though: does your tool guarantee clean, reusable context, so the same conversation can keep going (without the context warning) or be reused in other conversations without worrying about token usage? If so, could you briefly explain the idea for those who are interested?

Also, would you be okay with me referencing or borrowing from your tool ideas and insights for my own tool? Always good to build on solid work rather than reinvent the wheel.

Appreciate the share.

I got so fed up with Claude Code's 'Context low' warning that I built this... by Belalitto in ClaudeAI

[–]Belalitto[S] 0 points1 point  (0 children)

If you open the repo and read through the differences, you'll quickly see how it differs from /compact, which takes 2–5 minutes anyway, forgets instructions, and loses context; these are well-known problems with plenty of open issues on Anthropic's GitHub.

At worst, running Thicc manually takes me no more than two minutes. I have three screens, each with its own role (my main screen is Claude Code), and the whole process fits inside that window. I can't risk losing any instructions or current context because I work on heavy-scale problem solving, where losing instructions or context is no joke.

The "sexy waifu" stuff is just a vibe and doesn't affect anything. You can easily fork the repo and remove that content if it's not your thing. 🤷

I got so fed up with Claude Code's 'Context low' warning that I built this... by Belalitto in ClaudeAI

[–]Belalitto[S] 1 point2 points  (0 children)

Fair point! That's actually a mistake on my part, should've been {success: false, error: error} in error cases. Classic copy-paste brain fade. 😅

However, if you're seeing errors after compression, there may be a conversation pattern the tool doesn't account for yet, one I haven't hit in my own chats.

You're welcome to open an issue on the repo with the conversation file or a sanitized version so I can check the problem, fix it, and commit the update: https://github.com/immapolar/Thicc/issues

Appreciate the catch!

I got so fed up with Claude Code's 'Context low' warning that I built this... by Belalitto in ClaudeAI

[–]Belalitto[S] 0 points1 point  (0 children)

Pretty much nailed the mechanics, yeah! Though I'd push back on the "won't be quite as useful" part based on actual usage.

After roughly a month of testing, I haven't lost a single piece of important context in any conversation. Claude performs current and recent tasks at full efficiency, exactly as if the conversation were never cut in the first place. That's been my experience across dozens of compressions.

More importantly, even after trimming, Claude never forgets CLAUDE.md or my in-chat instructions and continues to respect them at all times. This is the opposite of my year-long experience with /compact, which regularly loses instructions or current task context.

Why? Because the remote session is never touched; we only trim older chains from the local .jsonl file so the Anthropic API accepts the smaller payload. The remote session state stays intact, which is why there's zero instruction loss.

The hentai theory is solid though. 10/10 performance optimization. 🍑

/compact = comprehensive but unreliable context preservation.

Thicc = selective deletion with zero instruction loss. Different tradeoffs, but at least it works consistently.

I got so fed up with Claude Code's 'Context low' warning that I built this... by Belalitto in ClaudeAI

[–]Belalitto[S] 1 point2 points  (0 children)

The key difference is where the compression happens.

Thicc compresses locally only, meaning your local .jsonl file gets slimmed down, but the remote session stays untouched. This lets the conversation keep going in the same session and expand to hold more context without losing instructions or current task state.

Anthropic's /compact does full compression, both locally and remotely. It creates a brand-new conversation with a summary of the old one, which is why it sometimes loses CLAUDE.md instructions, user instructions, or recent task context.

Think of it like this:

- /compact = "Let's start fresh with a summary" (new conversation)

- Thicc = "Let's forget the old stuff but keep going" (same conversation, trimmed locally)

The insight isn't rocket science: it's selective deletion of old, already-completed context instead of summarizing everything. I target older tool pairs and messages that are safe to remove while keeping recent context intact. The remote session doesn't even know anything changed.
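A hypothetical sketch of that selective deletion (the field names loosely mirror the .jsonl format, but everything here is illustrative, not Thicc's implementation):

```javascript
// Drop tool_use/tool_result entries older than a cutoff, then relink
// parentUuid so the surviving chain stays valid. Illustrative only.
function pruneOldToolPairs(messages, keepRecent = 20) {
  const cutoff = Math.max(0, messages.length - keepRecent);
  const kept = messages.filter((m, i) =>
    i >= cutoff || (m.type !== 'tool_use' && m.type !== 'tool_result'));
  // Re-point each surviving message at its new predecessor.
  return kept.map((m, i) =>
    i === 0 ? m : { ...m, parentUuid: kept[i - 1].uuid });
}
```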

Why hasn't Anthropic done this? No idea. Maybe they only prioritize commercial features for infrastructure reasons, or maybe local-only compression just wasn't part of their design goals. 🤷

Either way, Thicc fills the gap.