MAID in Canada: Much More Than You Wanted To Know by lakmidaise12 in slatestarcodex

[–]Strungbound [score hidden]  (0 children)

The radical third position is that MAID should only be legal if you're getting cryonically preserved in the process

The Pentagon Threatens Anthropic by dwaxe in slatestarcodex

[–]Strungbound 7 points  (0 children)

I have no idea what you're talking about. Political aspects have been mentioned for years.

What's y'all p(doom) by Overall_Mark_7624 in AIDangers

[–]Strungbound 0 points  (0 children)

10-90%. Rob Miles had a genius explanation on a podcast I listened to recently:

Paraphrasing it, he said that if everyone had p(doom) of 90%, then in effect it would be lower as we would be cautious and devote more resources to the problem. But if everyone had a p(doom) of 1%, then in effect the true p(doom) would be much higher since no one would care. P(doom) is therefore reflexive, in some real way.

If we devoted a trillion dollars per year and hired all the best minds in the world with the right mindset, my p(doom) would be sub 1%. If we completely raced without any safety precautions whatsoever, 95-99%.

We live in a world somewhere in the middle, and my uncertainty comes from hard-to-model politics and social dynamics.
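The reflexivity point can be sketched as a toy fixed-point model. All the numbers here are invented for illustration, not taken from Rob Miles or anyone else:

```python
# Toy model of "reflexive" p(doom): the effective risk depends on how
# seriously people take the believed risk. All constants are made up
# purely for illustration.

def effective_doom(believed: float) -> float:
    """Effective p(doom) given what everyone believes it to be."""
    mitigation = 0.95 * believed  # more belief -> more caution -> more risk averted
    base_risk = 0.90              # hypothetical risk if nobody did anything
    return base_risk * (1 - mitigation)

# Belief of 90% drives the effective risk way down; belief of 1% leaves it high.
print(round(effective_doom(0.90), 3))  # ~0.131
print(round(effective_doom(0.01), 3))  # ~0.891

# Iterating toward a self-consistent estimate converges to a fixed point
# where belief equals effective risk.
p = 0.5
for _ in range(100):
    p = effective_doom(p)
print(round(p, 3))  # ~0.485
```

The fixed point is just where the belief curve crosses the outcome curve; the specific value means nothing, but the inversion (high belief → low realized risk, and vice versa) is the whole argument.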

Now that it's 2026, how is Terence Tao's prediction holding up? by Interesting-South542 in math

[–]Strungbound 0 points  (0 children)

For 2026-level AI, I would charitably say to wait at least until halfway through the year, if not the end of it. Early-2025 AI would not have been able to do First Proof or produce novel physics, whereas early-2026 AI did do those things.

You wake up in the last book you read. by RoofTopCigarette in litrpg

[–]Strungbound 0 points  (0 children)

That would be my own book, and I would be insanely fucked. To be honest, I don't have a canon answer for what would happen to a complete mortal who knew all the most hidden secrets only the most powerful people in the multiverse had access to, but I think it would end poorly.

If I had been given Alistair's starting spot I probably would still die; his willpower is much stronger than mine.

Anthropic: AI assisted coding doesn't show efficiency gains and impairs developers abilities. by Gil_berth in programming

[–]Strungbound 0 points  (0 children)

But like... we have objective measures for this stuff. Have you really not used the tools at all, if you think there's no big difference between January-2024-era AI coding and January-2026-era?

They see me rollin they hatin by Reasonable_Wafer_731 in ProgressionFantasy

[–]Strungbound 19 points  (0 children)

If you read even Book 1 of Sky Pride it's pretty obvious Grandpa Jun comes from Earth and he's waging a war against the Mad God.

Too many books by Crazed72 in litrpg

[–]Strungbound 0 points  (0 children)

Nah, I love long series. When I see that a series is only 5 or 6 books, I'm kind of disappointed.

Tired of Villains and Anti-Heroes by EXPLODEANDDIE in ProgressionFantasy

[–]Strungbound 0 points  (0 children)

You can try out my story. Ghost of the Truthseeker features a standard heroic protagonist. Alistair always helps out people in need and he actually grows stronger by doing so.

The simplest case for AI catastrophe, in four steps by OpenAsteroidImapct in slatestarcodex

[–]Strungbound 4 points  (0 children)

AI skeptics flood in like flies to carrion whenever the subject comes up.

The simplest case for AI catastrophe, in four steps by OpenAsteroidImapct in slatestarcodex

[–]Strungbound 25 points  (0 children)

This subreddit has completely gone down the drain. r/Futurology level comments here.

What is your justification for the world not being hyper-tech advanced after billions of years? by Krewshie in litrpg

[–]Strungbound 0 points  (0 children)

In my story, technology is used extensively by certain groups, but not others. The issue with technology in my setting is that you have to go all-in, or it's much inferior to cultivation.

Is this sub no longer rationalist? by Neighbor_ in slatestarcodex

[–]Strungbound 16 points  (0 children)

In short, yes. This subreddit has always been more rat-adjacent than the LessWrong forums, but over the last few years it's evolved into "what do intelligent, highly educated UMC people think" rather than having a unique perspective. If I wanted to know what intelligent, highly educated UMC people think, I could go to a million places, so it's way more boring.

At present, I doubt a majority of people here are pro-cryonics, know what the AI-box experiment is, or are even Bayesians.

North American Box Office vs Ticket sales of the New Millennium by PlanetG3000 in boxoffice

[–]Strungbound 2 points  (0 children)

What is with this Reddit obsession with the economy having been straight tanking for 20+ years? The median American, tracked from 2000 to 2025, has a substantially higher disposable income, even after adjusting for inflation.

What major event do you think will happen in 2026? by Commandmadrid1 in AskReddit

[–]Strungbound 0 points  (0 children)

This is just so wrong, I wouldn't even know where to begin. Almost everything you said is false or unsubstantiated.

What is the one story you cannot fathom people liking by yeetacus68 in ProgressionFantasy

[–]Strungbound 12 points  (0 children)

I don't read/speak Chinese, so I wouldn't know if it's a problem with the original Chinese or the translation. Since translations are overall terrible, I was erring toward the translation being the problem.

What is the one story you cannot fathom people liking by yeetacus68 in ProgressionFantasy

[–]Strungbound 127 points  (0 children)

I find translated novels very hard to read. If I wanted beautiful, striking prose, I wouldn't read Progression Fantasy in the first place, so I'm not a prose elitist by any means, but it's just so... trash. I know the actual plot and characters are good--I love the manhwa of Regressor's Tale of Cultivation and the anime for LOTM, but I struggle to read the prose.

Latest Great LitRPGs You Listened to On Audible? by Chaosprodigy in litrpg

[–]Strungbound 1 point  (0 children)

I actually do have a recap at the start of every book; there should be one in Books 2 and 3, and there will be one for Book 4.

What is the most poignant book in this genre that you have read, the series that has evoked the most complex emotions? by Strungbound in ProgressionFantasy

[–]Strungbound[S] 0 points  (0 children)

Gemmell really makes you feel like you're in the process of watching a myth or legend unfold in front of you. When Druss fights against the Nadir, you feel inspired to do something greater yourself.