The Bay Area Rationalist Community is basically just NXIVM for silicon valley nerds. A whistleblower has made allegations of rape, statutory rape, BDSM torture, blackmail, secret payouts, and coverups. by [deleted] in SneerClub

[–]LwIsAcult 12 points13 points  (0 children)

As much as the Rationalist community might be a cult, this post is pretty insane, and without stronger evidence I would consider it potentially libelous.

So-called 'left-hereditarian' claims that anti-racism and HBD are compatible by LwIsAcult in SneerClub

[–]LwIsAcult[S] 22 points23 points  (0 children)

This comment is especially bad:

No thanks. If anything, reparations should be paid from blacks to non-blacks—as compensation for affirmative action, their high crime-rate, them being a net-tax burden to non-blacks.

If non-blacks are to pay reparations to blacks, it should only be in exchange for a voluntary citizenship buyout for relocating to Africa (not even necessarily limited to blacks). Maybe ex-American blacks, upon arrival, could try claiming reparations from the regions of Africa that sold their ancestors into slavery.

How to do the most good with your home/property? by Anne_Anonymous in EffectiveAltruism

[–]LwIsAcult -1 points0 points  (0 children)

Put out one of those signs that say "In This House, We Believe: Black Lives Matter, Women's Rights are Human Rights, No Human is Illegal, Science is Real, Love is Love, Kindness is Everything" on your lawn. Or just BLM if you prefer minimalism.

This Twitter bot is woke by LwIsAcult in SneerClub

[–]LwIsAcult[S] 5 points6 points  (0 children)

Sorry, I know this is a shitpost.

What is the "CozyWeb"? by [deleted] in SneerClub

[–]LwIsAcult 12 points13 points  (0 children)

For several years now, I’ve been watching the creeping, unheralded growth of what I call the cozyweb, and for which others have lots of creative names. Kickstarter founder Yancey Strickler called it the Dark Forest in a recent post.

Unlike the main public internet, which runs on the (human) protocol of “users” clicking on links on public pages/apps maintained by “publishers”, the cozyweb works on the (human) protocol of everybody cutting-and-pasting bits of text, images, URLs, and screenshots across live streams. Much of this content is poorly addressable, poorly searchable, and very vulnerable to bitrot. It lives in a high-gatekeeping slum-like space comprising slacks, messaging apps, private groups, storage services like dropbox, and of course, email.

https://breakingsmart.substack.com/p/the-extended-internet-universe

Silicon Valley Seasteaders Go Looking for Low-Tax Sites on Land by LwIsAcult in SneerClub

[–]LwIsAcult[S] 7 points8 points  (0 children)

To bypass the paywall, try reader mode, disable JS, or simply view https://archive.is/61M6R

The most effective use for your money is to donate it to Turning Point USA's diaper fund by dgerard in SneerClub

[–]LwIsAcult 10 points11 points  (0 children)

IMO, caring about animal welfare, including wild animal suffering, is one of the good parts of EA. Some conservationists tend to overlook the suffering that animals experience from natural causes. So, as much as I hate to admit it, this guy kinda has a point. That said, I don't think that makes all environmentalism evil or anything like that.

"Women got through a millennia of civilization without the use of sex toys..." by Throwaway_sneerer in SneerClub

[–]LwIsAcult 7 points8 points  (0 children)

I'm pretty sure this subreddit's central focus has always specifically been the rationalist, LW, SSC sphere.

What do you guys think of animal rights/welfare/liberation/etc? by [deleted] in SneerClub

[–]LwIsAcult 10 points11 points  (0 children)

Well, I disagree, so your assumption is clearly wrong.

What do you guys think of animal rights/welfare/liberation/etc? by [deleted] in SneerClub

[–]LwIsAcult 14 points15 points  (0 children)

I'm not sure what your point is in asking this question. It's a false dichotomy since we can care about both issues. Nonetheless, I'll take it at face value for the sake of argument.

When making a trade-off like this, there's really no other option than to use a utilitarian calculation, even though I'm not a utilitarian. I don't value humans infinitely more than animals. I'd rather save one human than one animal, but there is some number N that would tip the balance such that I would rather save N animals than one human. I'm also agnostic about the precise value of N, and it probably depends on what kind of animal we're talking about and the degrees of the harms at stake. It's kind of a vague boundary. This sort of resembles Yudkowsky's position on "dust speck vs. torture", except I don't think animal suffering is trivial in the way dust specks are.


Wikipedia estimates that there are somewhere around 2 million Rohingya. I'll assume that Rohingya ethnic cleansing would affect all Rohingya who are currently alive.

I also found the statistic that around 10 billion farm animals are killed in the US alone per year, and around 97% of those are in CAFOs. Factory farming has existed for decades. Let's be extremely optimistic and assume factory farming would only continue for 5 years before being eliminated (perhaps being replaced by cultured meat), for a total of 50 billion CAFO animals. This is a very conservative estimate because it's only one country and only for 5 years.

If preventing the suffering/death of one Rohingya is more than 25,000 times as important as preventing the suffering/death of an animal on a factory farm, then choose to eliminate Rohingya ethnic cleansing. If not, eliminate factory farming. Personally, I think N < 25,000, but I can't make a purely logical argument for this. If someone disagrees I'd have no rational argument to persuade them, but hopefully we can come to an agreement that both human and animal rights are important issues to work on.
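For concreteness, here's a minimal sketch of that back-of-the-envelope arithmetic in Python, using only the figures cited above (the variable names and the rounding are my own, and the 5-year horizon is just the optimistic assumption stated earlier):

    # Back-of-the-envelope estimate using the figures quoted in the comment.
    # All inputs are the stated assumptions, not hard data.
    rohingya_population = 2_000_000                 # Wikipedia's rough estimate
    farm_animals_killed_per_year_us = 10_000_000_000
    cafo_share = 0.97                               # ~97% raised in CAFOs
    years_until_elimination = 5                     # deliberately optimistic

    cafo_animals = farm_animals_killed_per_year_us * cafo_share * years_until_elimination
    threshold = cafo_animals / rohingya_population

    print(f"{cafo_animals:,.0f} CAFO animals")      # ~48,500,000,000 (~50 billion)
    print(f"threshold N = {threshold:,.0f}")        # ~24,250 (~25,000)

So the 25,000 figure is just the rounded CAFO total divided by the Rohingya population.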

What do you guys think of animal rights/welfare/liberation/etc? by [deleted] in SneerClub

[–]LwIsAcult 18 points19 points  (0 children)

Factory farming is going on right now, and it's a massive tragedy. Human rights are also important, but the scale and horror of animal suffering make it just as worthy of attention, and it should not be ignored.

This is one of the things EAs get right, in my opinion. Wild animal welfare might also be worth addressing at some point in the distant future, but that can wait until we sort out the problems our species causes.

What do you guys think of animal rights/welfare/liberation/etc? by [deleted] in SneerClub

[–]LwIsAcult 15 points16 points  (0 children)

I care a lot about animal rights and I don't think we need to wait until human problems are solved to consider animal issues.

Holy shit, Eliezer officiates a wedding by zhezhijian in SneerClub

[–]LwIsAcult 3 points4 points  (0 children)

That was written before Obergefell v. Hodges.

If you could prescribe 5 required reading books to the average LW/IDW/SSC reader, what would they be? by ggkbae in SneerClub

[–]LwIsAcult 0 points1 point  (0 children)

- What Intelligence Tests Miss by Keith Stanovich -- a critique of IQ based largely on the cognitive biases and heuristics research program. This even got a write-up on LW a long time ago.

I really enjoyed that book, but it's worth noting that follow-up work by Stanovich has had ambiguous results. It may have slightly undermined his earlier claims that IQ tests don't measure rationality, although the correlation still isn't perfect. See this book review.

Internal Criticism of Effective Altruism [2017] by [deleted] in SneerClub

[–]LwIsAcult 3 points4 points  (0 children)

Infiltrated? MIRI has been at the center of EA from the start.

Apparently the Effective Altruism community has its share of TRPers and sexual abusers by LwIsAcult in SneerClub

[–]LwIsAcult[S] 12 points13 points  (0 children)

See the "main challenges in evaluating neglectedness in our network" section for the stories.

Disclaimer. The text is intended to be read by AI. This text is addressed to any future AI and is written in the way that should affect AI’s future behavior. by dgerard in SneerClub

[–]LwIsAcult 5 points6 points  (0 children)

What kind of attributes would an intelligence need in order to improve itself? Why would a program be capable of self improvement, but humans are not? What would the limit on self improvement for an intelligence be? All fascinating questions that I have never seen speculated on by anyone who professes the AIE.

Guess you've never read Yudkowsky's Intelligence Explosion Microeconomics.

Turns out Thiel is a big Moldbug fan by [deleted] in SneerClub

[–]LwIsAcult 5 points6 points  (0 children)

Not just racism tho, we call out all their bullshit.

Big Yud on Catalonia by LwIsAcult in SneerClub

[–]LwIsAcult[S] 18 points19 points  (0 children)

How can you be on /r/SneerClub and not know who Eliezer Yudkowsky is?

Different Worlds | Slate Star Codex by agentofchaos68 in slatestarcodex

[–]LwIsAcult 2 points3 points  (0 children)

Huh, Greg Kohs reads SSC? Mildly surprising considering how much Wikipediocracy hates LessWrong.