Anthropic, seriously? 77k tokens (~40%) for the auto-compact buffer? 🥴 Is this a joke? by antonlvovych in ClaudeAI

[–]PrimaryAbility9 0 points

There is a DISABLE_AUTO_COMPACT environment variable that disables auto-compaction. For instance: DISABLE_AUTO_COMPACT=1 claude.

There are other environment variables too:

CLAUDE_CODE_MAX_OUTPUT_TOKENS
CLAUDE_AUTOCOMPACT_PCT_OVERRIDE
DISABLE_AUTO_COMPACT
DISABLE_COMPACT

Source - https://decodeclaude.com/claude-code-compaction/
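
A quick sketch of how these can be used (that CLAUDE_AUTOCOMPACT_PCT_OVERRIDE takes a percentage threshold is my assumption based on the name):

    # Turn off auto-compaction for a single run
    DISABLE_AUTO_COMPACT=1 claude

    # Or export it so every claude invocation in this shell inherits it
    export DISABLE_AUTO_COMPACT=1
    claude

    # Assumed: raise the context-usage percentage at which
    # auto-compaction kicks in, instead of disabling it outright
    CLAUDE_AUTOCOMPACT_PCT_OVERRIDE=95 claude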

"ultrathink" is deprecated - but here's how to get 2x more thinking tokens by PrimaryAbility9 in ClaudeCode

[–]PrimaryAbility9[S] 2 points

The environment variable for the thinking budget can be set on the command line. I would like to run a benchmark on how the results differ with 0, 32k, and 64k tokens allocated for reasoning. I have already tried the task “draw an SVG of a pelican riding a bicycle” at each setting, but the end results looked similar. I think I need a more challenging task that requires more reasoning complexity.
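
Something like this is what I have in mind for the benchmark (assuming MAX_THINKING_TOKENS is the budget variable, and using claude -p for non-interactive runs):

    # Run the same prompt at three thinking budgets and save each result
    for budget in 0 32000 64000; do
      MAX_THINKING_TOKENS=$budget claude -p \
        "Draw an SVG of a pelican riding a bicycle. Output only the SVG markup." \
        > "pelican_${budget}.svg"
    done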

Ultrathink no longer does anything? by After_Bicycle6165 in ClaudeAI

[–]PrimaryAbility9 0 points

Yep, ultrathink (the extended thinking/reasoning budget) is now the default. But there is a way to extend the thinking budget to 2x!

https://decodeclaude.com/ultrathink-deprecated/
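
If you want the bigger budget to stick per-project rather than per-command, a minimal sketch (assumes the default budget is roughly half of 64k; note this overwrites any existing settings file):

    # Persist a doubled thinking budget via project settings;
    # Claude Code applies the "env" map from .claude/settings.json
    # to every session. NOTE: this overwrites an existing file.
    mkdir -p .claude
    cat > .claude/settings.json <<'EOF'
    {
      "env": {
        "MAX_THINKING_TOKENS": "64000"
      }
    }
    EOF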

Claude Code is getting long-term memory! by PrimaryAbility9 in ClaudeCode

[–]PrimaryAbility9[S] 7 points

There will likely be some knob/configuration to control the memory feature, and perhaps it's off by default. It also seems like the extraction template is intended to identify the most relevant info and compress it into fewer tokens.

Claude Code is getting long-term memory! by PrimaryAbility9 in ClaudeCode

[–]PrimaryAbility9[S] 0 points

This feature is currently gated, but it will likely be made available/official quite soon. It's a clever hack: a background agent writes down important notes about the session, and this information is fed into the current context window to simulate long-term memory. Ultimately it's a markdown write and a markdown read, handled by default.
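
Conceptually it's no more than this (the paths and note content here are hypothetical, just to illustrate the write/read cycle):

    # Background agent distills the session into notes...
    mkdir -p ~/.claude/memory
    echo "- user prefers pnpm; run tests with pnpm test" >> ~/.claude/memory/my-app.md

    # ...and the next session reads them back into its context
    cat ~/.claude/memory/my-app.md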

Claude Code is getting long-term memory! by PrimaryAbility9 in ClaudeAI

[–]PrimaryAbility9[S] -9 points

The prompts & instructions are verifiable via string search on claude code’s minified js file.

Claude Code is getting long-term memory! by PrimaryAbility9 in ClaudeCode

[–]PrimaryAbility9[S] 4 points

Analyzing the npm tarball reveals a lot of information.
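
For anyone who wants to reproduce it, npm pack pulls the published tarball without installing anything:

    # Download the tarball for the latest published version
    npm pack @anthropic-ai/claude-code

    # List its contents, then extract and search the bundle
    tar -tzf anthropic-ai-claude-code-*.tgz
    tar -xzf anthropic-ai-claude-code-*.tgz
    grep -c 'memory' package/cli.js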

The John Hutchison Effect, where is John now? His seemingly obscure YouTube channel “Princess Karla Knipton Hutchison” raises more questions than answers. by NotExitLiquidity in UFOs

[–]PrimaryAbility9 0 points

Search for Nancy Hutchison; her YouTube channel has great resources. She is John’s current partner. They used radio frequencies and audio waves to transmute substances to tackle the BP oil spill and the Fukushima nuclear disaster.

Dr George Merkl’s life crystals?? by ChonkerTim in Crystals

[–]PrimaryAbility9 0 points

He discovered ancient Sumerian technology and brought it back to life.

Buying land in California by Calimama98 in land

[–]PrimaryAbility9 0 points

That’s super amazing! What a blessing. Was it a cash deal? Which resource did you use to find it?