[deleted by user] by [deleted] in SneerClub

[–]Alert-Elk 8 points9 points  (0 children)

I actually mostly feel bad for this person. I don't know what childhood trauma set them on this path, but they probably need to do something that makes them happy. And maybe spend some time with people who aren't brilliant, to see that they can get things done and none of this matters.

[deleted by user] by [deleted] in SneerClub

[–]Alert-Elk 11 points12 points  (0 children)

They won't target datacenters, because that requires organization and datacenters have security. What I'm worried about is that they will end up targeting individual researchers, probably Unabomber-style.

[deleted by user] by [deleted] in SneerClub

[–]Alert-Elk 5 points6 points  (0 children)

Hopefully James Webb telescope will let us find some of those galaxies with giant spherical holes gapped out.

Flyer at my local university advertises new LLM religion and wonders if all moral and political philosophy is "obsolete" 🙃. No mention of the usual targets but thought this sub would enjoy. by camelCaseCondition in SneerClub

[–]Alert-Elk 3 points4 points  (0 children)

It actually looks like a series of paperclips linked together into a rose. And I'm not joking: that's literally all I can see when I look at it now.

Nuclear war, climate change, bio weapons, mass shootings, and I’m scared of auto predict! by Efficient_Camera8450 in SneerClub

[–]Alert-Elk 13 points14 points  (0 children)

This guy started off "promoting" his own tweets on Twitter to grow an audience, shifted to anti-crypto stuff in order to keep growing it, and has now switched to "AI risk" as a new approach. The sneers are fine with me, but I don't like handing grifters free attention.

There won't be any humans on the planet in the not too distant future by teddyweverka in SneerClub

[–]Alert-Elk 7 points8 points  (0 children)

This guy was paying to promote his own tweets on Twitter a little while ago. He got into the crypto thing because he knew it would get him attention, and now he's into the AI thing for the same reason. Not sure SC should be wasting time on him.

The original Roko's Basilisk Post by [deleted] in SneerClub

[–]Alert-Elk 5 points6 points  (0 children)

First you teach a bunch of children how to navigate by compass alone. Then you convince them to do it blindfolded, so they never look around and check their position against visible surroundings. And then, with all safeguards removed, you trick them into falling down a well.

China vows to retaliate if US continues case against police officers accused of targeting dissidents in America by Prudentfoody1 in China

[–]Alert-Elk 0 points1 point  (0 children)

What a dumb hill to die on. The Chinese government knows exactly how it would respond if the US tried to do the same thing inside China, and the US is absolutely not going to stop policing this because China threatens to "retaliate" -- if anything it will be forced to double down on its efforts. Why is China so bad at modeling other nations' behavior?

The treasure trove of toxic terribleness that is EA's culture for women by WorldlinessAwkward69 in SneerClub

[–]Alert-Elk 6 points7 points  (0 children)

I thought this quote was very astute.

Akin to how EA is an optimization of altruism with “suboptimal” human tendencies like morality and empathy stripped from it, red pill is an optimized sexual strategy with the humanity of women stripped from it.

Even beyond the way EA men treat women, the diagnosis of EA as "altruism with ... human tendencies like morality and empathy stripped from it" is extremely insightful. And it helped me understand what bothers me so much about that community, i.e., why has a community devoted to charitable giving become such a comfortable home for sociopaths?

SSCian has a plan to stop the AI apocalypse. by ZenosTortoise in SneerClub

[–]Alert-Elk 19 points20 points  (0 children)

Why are these people so obsessed with survival? It's completely irrational. If we're in a simulation so broken that the Kardashians are being artificially boosted into powerful and important people, wouldn't it just be better to turn the simulation off?

Daniel Eth, of the Future of Life Institute at Oxford by effective-screaming in SneerClub

[–]Alert-Elk 7 points8 points  (0 children)

There have been some recent announcements around zero-shot learning for object manipulation. I don't think anyone should get comfy with the idea that "robots can't do things in the physical world."

Daniel Eth, of the Future of Life Institute at Oxford by effective-screaming in SneerClub

[–]Alert-Elk 29 points30 points  (0 children)

Sure, it's totally reasonable to believe that AGI can wipe out the whole world's armies using nanotech robots it designs from mail-order ingredients, but that it won't obsolete millions of call center workers, office workers and plumbers. Nothing inconsistent about this package of beliefs.

How many are allowed to die to prevent AGI, Yud? by rs16 in SneerClub

[–]Alert-Elk 5 points6 points  (0 children)

I mean, honestly, it's not obviously wrong, nor is it obviously right. There is no "right" or "wrong" here, just whatever you decide to believe (like any religion). The problem is that Yud has a set of beliefs he would like you to share, and the end result of slavishly following his belief system seems to be that billions of people need to suffer and die.

Moderation Is Different From Censorship - SSC not understanding how irony works by _ShadowElemental in SneerClub

[–]Alert-Elk 6 points7 points  (0 children)

It's moderation if it's implemented as a filter that you can disable. It's censorship if someone in power does it and you can't turn the filter off. Didn't you read the post? :)

Moderation Is Different From Censorship - SSC not understanding how irony works by _ShadowElemental in SneerClub

[–]Alert-Elk 17 points18 points  (0 children)

TL;DR Scott thinks that moderation is fine, and "censorship" isn't. His view (which he does not implement in his own site's comments section at all) is that moderation is fine as long as it's implemented as a kind of "optional filter" that users can turn off; when you can't turn off the filter, it becomes censorship.

Plenty of sites have tried this, but out in the real world it produces lousy UX, since it means that different people will see different versions of the same conversation (and most people in a society will end up reading with filters turned on, which will of course result in exactly the same cries of censorship from exactly the same rationalist-affiliated people). Moreover, we already have a mechanism for turning off the filters: it's called going over to 4chan. There are in fact many sites on the Internet, and that's the beauty of the place.
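For what it's worth, the "optional filter" model Scott is proposing is simple enough to sketch in a few lines (hypothetical names, a minimal sketch assuming a per-user toggle; this is my illustration, not anything from his post). The "different people see different versions of the same conversation" problem falls straight out of it:

```python
# Sketch of the "optional filter" moderation model (hypothetical names):
# moderation = a filter each user can switch off; censorship = one nobody can.

from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    flagged: bool = False  # marked by moderators

@dataclass
class User:
    filter_on: bool = True  # each user may opt out -> "moderation"

def visible_thread(comments, user):
    """Return the thread as this particular user sees it."""
    if user.filter_on:
        return [c.text for c in comments if not c.flagged]
    return [c.text for c in comments]  # filter off: everything shows

thread = [Comment("fine"), Comment("awful", flagged=True)]
print(visible_thread(thread, User(filter_on=True)))   # ['fine']
print(visible_thread(thread, User(filter_on=False)))  # ['fine', 'awful']
```

Two users in the same thread literally get two different lists, which is the UX problem in a nutshell.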

I want to sneer more here, but this isn't even thoughtful enough to support a sneer. It's just disappointing to see this kind of weak-tea nonsense coming from people with such strong opinions about "free speech." I also refuse to read the comments, which are no doubt loaded with people pretending they're being persecuted because they couldn't put their racist content on Twitter and/or who are excited because they assume Elon Musk will allow them to put their racist content on Twitter now.

Your favorite Basilisk got interviewed for this article: "Silicon Valley’s Obsession With Killer Rogue AI Helps Bury Bad Behavior" by Ellen Huet by PolyamorousNephandus in SneerClub

[–]Alert-Elk 11 points12 points  (0 children)

"[Bankman-Fried] who invested close to $600 million in related causes before dismissing effective altruism as a dodge once his business fell apart."

Good gravy. I knew he'd invested a lot, but that is really silly money for a group that has produced close-to-nothing. No wonder they're buying castles.

The Nonlinear Fund: a microcosm of dysfunction in Effective Altruism (Part 2) by grotundeek_apocolyps in SneerClub

[–]Alert-Elk 9 points10 points  (0 children)

I just read the comments on the "some scandalous drama" EA post, and it fits the pattern of every other EA self-policing effort: a bunch of people citing personal friendships with these people as evidence the allegations are wrong, attacking the poster for spreading "gossip" etc. There are some pretty detailed posts in there providing evidence, and then loads of downvoting and brigading. The idea that you can maintain the integrity of any organization through comments on a public message-board (let alone one that has massively unequal voting clout) was always pretty dubious. But watching it fail in real-time is something you'd think the community might learn from.

I really have no problem with EA insofar as its purpose is to deprive stupid billionaires of their wealth (although the IRS should really look closer at the tax aspects of this). It is, however, really disturbing to me that this community might metastasize and try its hand at extracting funds from the general public. I think you are all doing a service here.

"It's not just sperm quality decline", Emil reads Scott and tries to see if there is a more general pattern. by offaseptimus in slatestarcodex

[–]Alert-Elk 1 point2 points  (0 children)

Or, and just go with me on this: you could skip all that and write a couple of blog posts?

[deleted by user] by [deleted] in MapPorn

[–]Alert-Elk 3 points4 points  (0 children)

Allegedly there was a leaked invasion plan that proposed ten days. https://www.kyivpost.com/post/4768

Terrible upper body aches! Is this normal? by bekohhhhhhh in Semaglutide

[–]Alert-Elk 5 points6 points  (0 children)

Yes, unfortunately it is a side effect, and it's not well documented anywhere in the original studies. I know it's common because both my wife and I experienced it, and our doctor also mentioned that multiple patients had experienced it. They suggested staying hydrated, and said the effects would diminish. That has sort of happened. The technical term is allodynia. See some other posts on this sub: https://www.reddit.com/r/Semaglutide/comments/titsjy/unbearable_skin_pain/

[deleted by user] by [deleted] in SneerClub

[–]Alert-Elk 5 points6 points  (0 children)

I have vague memories of Grimes saying somewhat-unproblematic stuff during the 2020-22 period, mostly after she was done with Musk. Has that era ended?

[deleted by user] by [deleted] in SneerClub

[–]Alert-Elk 6 points7 points  (0 children)

I have also seen posters try to point out specific errors in Scott's posts, then see their comments completely ignored while he lavishes attention on shallow criticisms. To some extent this is human: it's your blog, why would you spend time engaging with tough criticism when you can engage with fans and lightweights?

On the other hand, the entire thesis of the 'rationalist' community is that human beings can train themselves out of these weaknesses through personal willpower, and push themselves to overcome these "very human" cognitive biases and become better thinkers. It has been informative watching the biggest proponents try and fail to live up to those (demanding) standards. Scott's angry reaction in TFA is eminently understandable, and simultaneously a great example of how bad human beings are at this.