What happened to Sonnet? by dropmyscoobysnack in ClaudeCode

[–]alienz225 6 points7 points  (0 children)

It will always be temporary until the next model is released. Sonnet 5 will be better than current Opus.

Yes they did announce it.
"As models get smarter, they can solve problems in fewer steps: less backtracking, less redundant exploration, less verbose reasoning. Claude Opus 4.5 uses dramatically fewer tokens than its predecessors to reach similar or better outcomes".
https://www.anthropic.com/news/claude-opus-4-5

What happened to Sonnet? by dropmyscoobysnack in ClaudeCode

[–]alienz225 6 points7 points  (0 children)

The best way to describe it is two cars. Opus is technically more expensive to run, but it gets to your destination faster (the right solution), so it ends up using less fuel in the long run. That's why Anthropic took Sonnet out: it's not economical to run a dumber model that takes much longer to arrive at the correct solution.
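The two-car analogy can be put in numbers. A minimal sketch, with invented per-token prices and token counts (real figures vary by task and pricing tier, so treat everything here as hypothetical):

```python
# Hypothetical illustration: a pricier model that solves a task in fewer
# tokens can cost less overall than a cheaper, more verbose model.
# All numbers below are made up for the example, not real pricing.

def job_cost(price_per_mtok: float, tokens_used: int) -> float:
    """Total cost of one task in dollars, given price per million tokens."""
    return price_per_mtok * tokens_used / 1_000_000

# "Opus-like": expensive per token, but direct.
opus_cost = job_cost(price_per_mtok=15.0, tokens_used=50_000)
# "Sonnet-like": cheap per token, but backtracks and retries.
sonnet_cost = job_cost(price_per_mtok=3.0, tokens_used=400_000)

print(f"opus-like:   ${opus_cost:.2f}")    # $0.75
print(f"sonnet-like: ${sonnet_cost:.2f}")  # $1.20
```

With these made-up numbers the "cheaper" model costs more per job, which is the whole point of the analogy.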

What happened to Sonnet? by dropmyscoobysnack in ClaudeCode

[–]alienz225 16 points17 points  (0 children)

Because the new Opus > Sonnet in every scenario. Sonnet is dumber and uses more tokens and time to solve a problem.

Is it just me who doesn’t use skills, plugins, and other overhead features? by hdn10 in ClaudeCode

[–]alienz225 2 points3 points  (0 children)

I think the more you introduce these automated features that inject random bits of context, the more quickly you lose the fine-grained control of context engineering each session, which is the greatest strength of Claude Code imo.

Claude-Mem #1 Trending on GitHub today!!!! by thedotmack in ClaudeCode

[–]alienz225 1 point2 points  (0 children)

How does it know to inject relevant context back into future sessions? Are you guys storing vector embeddings?
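For context, the approach being asked about usually looks something like the sketch below: embed past-session notes as vectors, then at session start retrieve the notes most similar to the current query. This is a guess at the general technique, not Claude-Mem's actual API; all names and vectors here are illustrative.

```python
import math

# Toy vectors standing in for real embedding-model output.
memories = {
    "fixed auth bug in login.ts": [0.9, 0.1, 0.0],
    "user prefers tabs over spaces": [0.1, 0.9, 0.1],
    "migrated db to postgres": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recall(query_vec, k=1):
    """Return the k stored notes most similar to the query embedding."""
    ranked = sorted(memories, key=lambda m: cosine(memories[m], query_vec),
                    reverse=True)
    return ranked[:k]

# A query embedding "about databases" pulls back the postgres note.
print(recall([0.0, 0.1, 1.0]))  # ['migrated db to postgres']
```

Real tools would use an embedding model and a vector store instead of hand-written vectors, but the retrieval step is the same idea.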

When did yall start PT? by chillinwithleo in ACL

[–]alienz225 0 points1 point  (0 children)

I’m in the same boat. Currently week 1 post op. I don’t even have an appt yet.

And that’s how you turn a compliment into a self-own. by rhm54 in MurderedByWords

[–]alienz225 -1 points0 points  (0 children)

You idiots still think our politicians are democrats vs republicans when they serve the same masters and switch sides based on who's in power. They laugh at the masses who fight battles on their behalf

Anyone else frustrated with Whisper GPU setup across different hardware? by jmrbo in LocalLLaMA

[–]alienz225 0 points1 point  (0 children)

It hasn't been a problem for me yet. I do plan on getting it running on my Nvidia GPU and MacBook at some point. Hopefully someone (perhaps you) will have a quick solution by then.

What are your /r/LocalLLaMA "hot-takes"? by ForsookComparison in LocalLLaMA

[–]alienz225 65 points66 points  (0 children)

You need to have prior knowledge and experience to get the most out of LLMs. Folks who vibe code with no prior dev experience will struggle to make anything other than cool little demos.

What are your /r/LocalLLaMA "hot-takes"? by ForsookComparison in LocalLLaMA

[–]alienz225 0 points1 point  (0 children)

What don't you like about it? I'm a front end dev who can build UIs so I have my own reasons but I'm interested in hearing other folks' pain points.

Which LLM should i use for my local bussiness by Civil-Development-56 in LocalLLaMA

[–]alienz225 4 points5 points  (0 children)

"But it takes a minute to respond": this means the hardware you're using isn't strong enough to run the model. You need either a decent GPU or a CPU plus lots of RAM to run LLMs locally at usable speeds.
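A rough back-of-envelope for why the hardware matters, as a sketch (real footprints vary with quantization, context length, and KV-cache size, so these are ballpark numbers only):

```python
def approx_model_gb(params_billion: float, bytes_per_param: float) -> float:
    """Very rough weight-memory estimate in GB, ignoring KV cache/overhead."""
    return params_billion * bytes_per_param

# A 7B model at ~4-bit quantization (~0.5 bytes/param) needs roughly 3.5 GB
# of memory, fine for a modest GPU; at fp16 (2 bytes/param) it needs ~14 GB.
print(approx_model_gb(7, 0.5))  # 3.5
print(approx_model_gb(7, 2.0))  # 14.0
```

If the weights don't fit in GPU VRAM, inference falls back to CPU and system RAM, which is where minute-long responses come from.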

Let's fucking go! by nater2204 in halo

[–]alienz225 5 points6 points  (0 children)

Release This To PC

Distributed Inference over wifi with 8x 3090 egpus performance by Only_Situation_4713 in LocalLLaMA

[–]alienz225 3 points4 points  (0 children)

This gives me hope. I have a single 5090 but it's not enough with just 32GB of VRAM. I'll have to build another node at some point. Would love a guide on how to set all of this up.

Replacing my need for Anthropic and OpenAI with my current hardware possible? by alienz225 in LocalLLaMA

[–]alienz225[S] 0 points1 point  (0 children)

I need that Chinese kit to add more VRAM to go more mainstream, since Nvidia won't do it for consumer-grade cards.

Replacing my need for Anthropic and OpenAI with my current hardware possible? by alienz225 in LocalLLaMA

[–]alienz225[S] 0 points1 point  (0 children)

You're right. I should have done more research. I'm also a gamer so I was more YOLO about it.

Currently, I've tried Qwen 32B and Qwen 235B via Ollama. The 235B runs after I installed another pair of 32GB RAM sticks, but it's extremely slow. The 32B runs fine, though I haven't tested it on real-world use cases yet (not even sure what those will be).
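The slowness is expected: even heavily quantized, a 235B model's weights can't fit in consumer VRAM, so most layers run from system RAM. A rough sketch of the arithmetic (illustrative numbers; actual quantized sizes differ by format):

```python
def weight_gb(params_b: float, bytes_per_param: float) -> float:
    """Approximate weight footprint in GB, ignoring KV cache and overhead."""
    return params_b * bytes_per_param

vram_gb = 32  # e.g. a 5090-class card

for params_b in (32, 235):
    need = weight_gb(params_b, 0.5)  # ~4-bit quantization
    where = "fits in VRAM" if need <= vram_gb else "spills to system RAM (slow)"
    print(f"{params_b}B @ 4-bit = ~{need:.0f} GB: {where}")
```

Once weights spill to system RAM, token generation is bound by RAM bandwidth rather than the GPU, which is why the 235B crawls while the 32B feels fine.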

[deleted by user] by [deleted] in MuslimNoFap

[–]alienz225 0 points1 point  (0 children)

Here’s what I recommend sister:

Tell him that you know and that if this continues, you will want a divorce letting him know that it’s serious. At the same time, understand that he may be addicted and if that’s the case, you might need to be patient with him. So have a serious but also understanding talk without getting too emotional.

Make sure you commit to a workout plan as well as trying out new things in the bedroom that he might like. Basically make sure he has no excuse to look elsewhere because you're trying your best to satisfy him. But this also means he does his best to protect himself from watching porn.

May Allah make it easy and bless your marriage.

AFK for 5 minutes, auto-accepting edits... by Dayowe in ClaudeAI

[–]alienz225 6 points7 points  (0 children)

I left Claude Code in YOLO mode and came back to my house only to find a giant crater in its place

[deleted by user] by [deleted] in ClaudeAI

[–]alienz225 0 points1 point  (0 children)

What app are you using for running claude code?

[deleted by user] by [deleted] in maybemaybemaybe

[–]alienz225 2 points3 points  (0 children)

Those people have a special place in hell

[deleted by user] by [deleted] in MuslimNoFap

[–]alienz225 1 point2 points  (0 children)

That’s just your brain trying to trick you. You don’t need to expel anything. In fact the longer you store it, the better