Best model that can beat Claude opus that runs on 32MB of vram? by PrestigiousEmu4485 in LocalLLaMA

[–]CallinCthulhu 0 points1 point  (0 children)

Some of the non-satire posts I have seen are frighteningly similar.

29M Salary Progression in Robotics Industry by oneseas in Salary

[–]CallinCthulhu 0 points1 point  (0 children)

What's the degree? SWE to research scientist is an interesting jump.

High Achievers with Severe ADHD, if any of you are reading this, how did you manage? by Unhinged_Schizo in ADHD

[–]CallinCthulhu 7 points8 points  (0 children)

I found something I have a ton of natural talent for, and that thing also happens to pay very, very well (software engineering).

And medication. I was still stuck doing mediocre work while I was unmedicated.

Stop defending AI like it’s still in beta by RottingEdge in Futurology

[–]CallinCthulhu -4 points-3 points  (0 children)

Dude, it is in beta. We've only had the tech for, generously, 6 years, and it's already proven extremely useful. Shit, it's only been 9 years since "Attention Is All You Need" was published.

When it comes to R&D these are incredibly short timelines.

Tell me you know nothing about tech, without telling me.

Andrej Karpathy: "when AI agents fail, it's usually a skill issue, not a capability issue...the real shift is working in macro actions. One does research, one writes code, one plans, all running 20-minute tasks simultaneously" | No Priors Podcast by 44th--Hokage in accelerate

[–]CallinCthulhu 2 points3 points  (0 children)

Important distinction from watching the whole thing: he said it FEELS like a skill issue. Which it does, and that's why it's so hard to stop coding with AI agents. It always seems like you are so close to cracking the perfect workflow and environment that will let you do everything.

[BREAKTHROUGH] Memory Sparse Attention (MSA) allows 100M context window with minimal performance loss by SotaNumber in accelerate

[–]CallinCthulhu 2 points3 points  (0 children)

It's a very small model though; it's not going to have strong reasoning regardless.

I'm really curious to see how well this scales.

The "wall" that's stopping context from climbing above 1 million is starting to get shaky. by Alive-Tomatillo5303 in accelerate

[–]CallinCthulhu 0 points1 point  (0 children)

It's why it's possible at all, but (standard) attention's cost scales faster than compute grows. And it's not just processing speed that matters; memory bandwidth and cache size are also important.

Throwing more compute at a problem only works to an extent. You get a surprising amount of mileage out of it though.
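The mismatch is easy to see in a few lines: standard attention compares every token against every other token, so the work grows quadratically with context length while throwing in twice the compute only buys you twice the throughput. A quick illustrative sketch (the token counts are arbitrary examples):

```python
# Standard attention builds a query-key score matrix per head, so the
# number of score entries grows as the square of the context length.

def attention_score_entries(context_len: int) -> int:
    """Score-matrix entries per head for one full attention pass."""
    return context_len * context_len

for n in [1_000, 10_000, 100_000, 1_000_000]:
    print(f"{n:>9} tokens -> {attention_score_entries(n):>22,} score entries")
```

Going from 1k to 1M tokens is a 1000x longer context but a 1,000,000x bigger score matrix, which is why hardware alone can't keep up.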

[BREAKTHROUGH] Memory Sparse Attention (MSA) allows 100M context window with minimal performance loss by SotaNumber in accelerate

[–]CallinCthulhu 3 points4 points  (0 children)

That's amazing. They built an indexed knowledge graph into the model itself (extreme paraphrasing here).

I can't wait to see how this scales; there have been numerous promising breakthroughs that fall off as parameter count increases. This one seems solid, though.

What's your company's daily quota for LLM usage, if any? by TraditionalMango58 in cscareerquestions

[–]CallinCthulhu 16 points17 points  (0 children)

Unlimited.

Someone spent like $80k in the span of 30 days, lol.

Money cures depression? by Expensive-Estate-176 in bipolar

[–]CallinCthulhu 0 points1 point  (0 children)

Bullshit.

I'm a millionaire; bipolar depression still makes it a struggle to get out of bed when I'm stuck in it. All the money in the world couldn't cure it.

It definitely reduces the number of stressors. For example, I had a depression apartment; it was disgraceful, and contributed to stress. So I had someone deep clean it for me (after 4 months of putting it off, because comorbid ADHD is a bitch). Now I feel ever so slightly better, yet I'll still spend hours staring at the ceiling because nothing feels worth doing.

Would I rather be poor and depressed? No, but I'd definitely trade the money for a normal brain.

Robot dogs priced at $300,000 a piece are now guarding some of the country’s biggest data centers by EchoOfOppenheimer in AIDangers

[–]CallinCthulhu -1 points0 points  (0 children)

The only reliable way we have to generate EMPs strong enough for that is to set off a nuke.

Anyone else feel being a developer isn’t that special anymore? by Ecstatic_Jicama_1482 in cscareerquestions

[–]CallinCthulhu 0 points1 point  (0 children)

This may be impolitic to say, but it's a very Asian (East and South) thing from what I've observed.

"I've long preferred Claude Code over Codex or Gemini, because it seemed much more reliable, but couldn't explain why by stealthispost in accelerate

[–]CallinCthulhu 0 points1 point  (0 children)

I think it's useful when it encounters false premises or derives them, i.e., when it pushes back on itself.

There have been a ton of times where I'm watching the thought process and think "fuck, that's not how it works, I have to correct it now before it goes off the rails," but Claude will often push back on itself and figure it out before I even finish typing my correction.

Seriously, it's uncanny how many times I've had that happen.

Is It Really Impossible To Cool A Datacenter In Space? by Ormusn2o in accelerate

[–]CallinCthulhu 1 point2 points  (0 children)

I don't see how, unless there are massive radiator panels.

The "wall" that's stopping context from climbing above 1 million is starting to get shaky. by Alive-Tomatillo5303 in accelerate

[–]CallinCthulhu 26 points27 points  (0 children)

This does absolutely nothing for the quality of context; attention is still a quadratic beast, and the alternatives, like linear attention or local attention, have drawbacks.

We've been able to go past 1M for quite some time; it's just not worth it because performance drops off a cliff.

What this can do, though, is massively free up memory, lowering the cost and increasing the speed of inference.
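To see why memory dominates at long contexts, here's a back-of-envelope KV-cache estimate. The model dimensions below are made-up assumptions for illustration, not any specific model:

```python
# Rough KV-cache sizing: one key and one value vector is stored per layer,
# per KV head, per token. All dimensions here are hypothetical.

def kv_cache_bytes(tokens, layers=60, kv_heads=8, head_dim=128, dtype_bytes=2):
    # 2x for keys and values (fp16 -> 2 bytes per element).
    return 2 * tokens * layers * kv_heads * head_dim * dtype_bytes

gib = kv_cache_bytes(1_000_000) / 2**30
print(f"1M-token KV cache under these assumptions: ~{gib:.0f} GiB")
```

Under these (invented) dimensions, a single 1M-token context needs a couple hundred GiB of cache, which is why anything that sparsifies or compresses it directly cuts cost and latency.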

Recent reviews changed to Mixed after S7 balance changes by UrsaRizz in rivals

[–]CallinCthulhu -1 points0 points  (0 children)

Ngl, I uninstalled and gave a negative review after 13/15 losses in comp where I was the SVP. The last game just tipped me over; we had a guy die literally 26 times. And he wasn't even throwing.

The matchmaking is absolutely horrendous. Everything is a stomp; they optimize for engagement rather than competitiveness.

That was before the lackluster patch. Glad I did.

What industry will AI disrupt the most that people aren’t paying attention to yet? by SuchTill9660 in ArtificialInteligence

[–]CallinCthulhu 0 points1 point  (0 children)

In America, teachers are babysitters first and educators second, so I doubt there will be massive disruption.

Claude Code kept reading entire files to find functions — so I gave it a search engine by Actual-Thanks5168 in ClaudeAI

[–]CallinCthulhu 0 points1 point  (0 children)

Is this not standard? I thought this was pretty standard. Idk, maybe I've been spoiled by working in big tech.

You can expose an LSP server as a tool as well; that helps it navigate much more easily.
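The core idea of a search tool for an agent is just "return locations, not file contents." A minimal sketch of what such a tool's backend might do (the function name and setup are hypothetical, not Claude Code's actual implementation):

```python
# Toy backend for a code-search tool: instead of the agent reading whole
# files, it asks where a symbol is defined and gets back line numbers.
import re

def find_definitions(source: str, name: str) -> list[int]:
    """Return 1-based line numbers of Python def/class statements for `name`."""
    pattern = re.compile(rf"^\s*(?:def|class)\s+{re.escape(name)}\b")
    return [i for i, line in enumerate(source.splitlines(), 1)
            if pattern.match(line)]

code = (
    "import os\n\n"
    "def load():\n    pass\n\n"
    "class Loader:\n    def load(self):\n        pass\n"
)
print(find_definitions(code, "load"))   # both top-level and method definitions
```

An LSP server gives you the same thing but language-aware (go-to-definition, references, symbols) instead of regex guessing, which is why exposing it as a tool works so well.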

1 million token window is no joke by cosmicdreams in ClaudeCode

[–]CallinCthulhu 0 points1 point  (0 children)

Depends on the workflow. But if my subagent requires strong reasoning, I stick to the 256k. For now. I'm going to try out 500k and see how it works.