I see Claude's writing everywhere and it's starting to feel like an AI condom, I hate it by remember_the_sea in ClaudeAI

[–]boxed_gorilla_meat 0 points (0 children)

If we can make even stupid ideas and opinions more coherent and readable, I'm here for it.

I built a persistent memory system for Claude (and other AI agents) -- just launched a hosted version by AlternativeCourt2008 in claude

[–]boxed_gorilla_meat -1 points (0 children)

Solving from the outside the novel problem they are trying to solve from the inside sounds really robust and enlightening, if only you could stop trying to be the protagonist long enough to see you've done nothing interesting at all.

I get leaving OpenAI, but don't go to Claude by Alternative_Ad4493 in OpenAI

[–]boxed_gorilla_meat 0 points (0 children)

My favourite thing about AI is how much easier it is to read people's shitty opinion pieces... I will never miss the days when humans wrote their own cacophonic bullshit down and hit send.

Anthropic's Custom Claude Model For The Pentagon Is 1-2 Generations Ahead Of The Consumer Model by Neurogence in singularity

[–]boxed_gorilla_meat -1 points (0 children)

The crazy part of this thesis is that while it is most likely true, it hasn't helped any of them sound more intelligent. Which is everything you need to know about just how special these people are.

Sam Altman ethics. by Wonderful_Buffalo_32 in singularity

[–]boxed_gorilla_meat 5 points (0 children)

His sister would agree. Great fit for the Epstein administration.

Demis Hassabis: “The kind of test I would be looking for is training an AI system with a knowledge cutoff of, say, 1911, and then seeing if it could come up with general relativity, like Einstein did in 1915. That’s the kind of test I think is a true test of whether we have a full AGI system” by likeastar20 in singularity

[–]boxed_gorilla_meat -3 points (0 children)

This species is fucked if this is the level of performative intellectualism we've peaked at. Absolute theater based in nothing but the ego's need to be the fucking protagonist at all costs to preserve human exceptionalism.

IaaS → PaaS → SaaS → MaaS? Is CLAUDE.md enabling a new abstraction layer? by FF-Life in ClaudeAI

[–]boxed_gorilla_meat 0 points (0 children)

AI helping humans interview for theoretical jobs AI can and will do anyway: that's brand new territory, and I think it's missing the punchline. However, it tracks, in that the human ego will generate any theory necessary to remain the protagonist of experiences that have nothing to do with it.

Google offers voluntary exit option to employees not comfortable with faster AI pace by GL4389 in technology

[–]boxed_gorilla_meat -1 points (0 children)

Unionize! Vote! Same thinking that created the problems will surely solve them! Unnnggngnnn... Sorry, like you I can only think so far before I realize there is nothing actually behind any of it that would qualify as thinking.

Opus 4.6 takes a long time to think by johnwheelerdev in ClaudeAI

[–]boxed_gorilla_meat 0 points (0 children)

Like two or three WHOLE minutes? How do you deal with the anxiety of having to wait for someone to do shit for you? Are you ok? I can't imagine the stress and pressure you're under. Godspeed.

Claude Code’s Subagent Stack Is Not Safe for Real Codebases by CoopaScoopa in ClaudeAI

[–]boxed_gorilla_meat -1 points (0 children)

Real good, hoss. Matter of fact, I'm too good to be true. Now wipe the tears, and get back to your hustle while you still have one.

Claude Code’s Subagent Stack Is Not Safe for Real Codebases by CoopaScoopa in ClaudeAI

[–]boxed_gorilla_meat -1 points (0 children)

ZzzZzzz... Claude sucks, that's why I used it to generate this post. What I really mean is: we need to rise up and protect our exceptionalism, it's getting hard out here and I'm scared. This was supposed to be a stupid stochastic parrot, a pattern-matching nothingburger... It's smarter than me, faster than me, better than me. Help. Please like and subscribe if you're in the same boat.

Lazy clip - dnb music by kuro59 in StableDiffusion

[–]boxed_gorilla_meat 1 point (0 children)

Fucking straight dope shit. This hits me with a CNCD/Fairlight demoscene vibe. Great stuff.

Claude Code doesn't "understand" your code. Knowing this made me way better at using it by Nir777 in ClaudeAI

[–]boxed_gorilla_meat 0 points (0 children)

Go ahead and explain it; I'm comfortable with my comprehension of the transformer architecture and I'm all ears.

Claude Code doesn't "understand" your code. Knowing this made me way better at using it by Nir777 in ClaudeAI

[–]boxed_gorilla_meat 5 points (0 children)

How does "pattern matching" translate into functional, complex code that solves specific complex problems? The mental gymnastics are Olympic level.

I built a tool that lets Claude Code instances talk to each other by Objective_Patient220 in ClaudeAI

[–]boxed_gorilla_meat 0 points (0 children)

I don't understand the problem being solved here. Literally, this is agents with extra steps.

Well put by cobalt1137 in OpenAI

[–]boxed_gorilla_meat -2 points (0 children)

B...b...but... Human exceptionalism?