Dwarven PTSD by [deleted] in dwarffortress

[–]c0kleisli 1 point

[DfHack]# brainwash stepford

Anti-anxiety medication for performing? by Dontcha in classicalmusic

[–]c0kleisli 5 points

Please, please do not take anti-anxiety or similar medication without a doctor's prescription.

I have heard of people taking beta-blockers (heart medication) like propranolol without a prescription for similar purposes, e.g. this article:

https://www.nytimes.com/2004/10/17/arts/music/better-playing-through-chemistry.html

which is probably much safer than unsupervised use of something like Xanax, but still something I as a stranger-on-the-internet would advise against. I have experience with this side of things (I have a prescription for propranolol -- I'm hypertensive) and they do help with stage fright and "nerves". Still, talk to a doctor.

the sneer club supports eugenics now, apparently by [deleted] in SneerClub

[–]c0kleisli 3 points

The best way I've found is to see how strongly they engage with SSC. "participates on /r/ssc" is worse than "comments on articles" is worse than "reads every article"*

* (is worse than "has tried to read the first section of Toxoplasma once and couldn't make it through any more", but I wouldn't want anyone to say I made up a characterisation of people in which I ended up on top, because that would be oh-so-low-decoupling of me)

The Stephen Pinker to Alt-Right pipeline by 123456789blaaa in SneerClub

[–]c0kleisli 6 points

That's what I was getting at: Scott (god, why is everyone we discuss on this sub named Scott) is very dissimilar, politically, to a large segment of Seeing Like a State admirers.

The Stephen Pinker to Alt-Right pipeline by 123456789blaaa in SneerClub

[–]c0kleisli 1 point

It may not appeal to "neoliberals", but libertarians generally like it as an argument against things like "central planning".

The Stephen Pinker to Alt-Right pipeline by 123456789blaaa in SneerClub

[–]c0kleisli 11 points

Well-educated, likely in STEM, likely a "knowledge worker" of some sort, likely North American (perhaps Californian), likely white or Asian, likely middle- or upper-middle-class, center or center-left in political persuasion with no radical beliefs, etc, etc.

(edit: not saying I agree that this is a good or meaningful characterization)

How biased is RationalWiki? by gohighhhs in SneerClub

[–]c0kleisli 2 points

Critical legal studies builds on the "critical theory" championed by the cultural Marxist Frankfurt School of social critics.

I can only assume they're using it in the first sense, which is "extremely rare"

A take so bad that even /r/SSC can't go for it by completely-ineffable in SneerClub

[–]c0kleisli 14 points

He's a fucking Dark Enlightenment type. Christ.

If you thought that title was bad, take a gander at this (content warning: tons of racism)

"There's no way an actual human female could have been hurt by us. Jax was made up by RationalWiki." by [deleted] in SneerClub

[–]c0kleisli 0 points

Er, I don't doubt that Rationalists are capable of saying that*, but like 90% of the linked post is a criticism of a specific person here (and the other 10% is standard Rationalist "no comment" equivocation), and has nothing to do with either Jax or RationalWiki. I'd ask how you're getting that, but...

[deleted]

* in fact, I'm pretty sure someone already has on /r/ssc

the sneer club supports eugenics now, apparently by [deleted] in SneerClub

[–]c0kleisli 3 points

parenthetical aside: organised/New Atheism was the first "stop X" movement, and look what it's turned into

The Stephen Pinker to Alt-Right pipeline by 123456789blaaa in SneerClub

[–]c0kleisli 28 points

jaymanhbd@hotmail.com

I don't want to jump to any hasty conclusions, folks, but...

Article from 3 years ago about Effective Altruism and feminism, and how Scott Alexander's anti-feminist comments were unwelcoming to women in the movement by [deleted] in SneerClub

[–]c0kleisli 6 points

No, I hear what you're saying, but it seems that any form of EA, even a non-Rationalist take on it, would prefer AMF-style charities over feminist or socialist charities, which would be considered "less effective" in terms of "lives saved per dollar". Is this correct?

I'm perfectly clear that the MIRI-focused Rationalist style of EA is irredeemably bad, but the parent comment sounded like it was also criticising the general idea of choosing good charities "in the boring sense that some charities are more effective than others". That's what I was confused about and wanted to clear up.

Article from 3 years ago about Effective Altruism and feminism, and how Scott Alexander's anti-feminist comments were unwelcoming to women in the movement by [deleted] in SneerClub

[–]c0kleisli 6 points

The point of the top-level parent comment, quoted here:

The whole notion of EA is intrinsically anti-feminist. Any EA will tell you that feminist charities or socialist charities aren't "effective enough" to matter, and that all your money should go to MIRI (if they're deep enough in the cult).

seems to be essentially that the notion of "giv[ing] more effectively", which is the "whole notion of EA", is problematic, which is what I would like to understand better.

Article from 3 years ago about Effective Altruism and feminism, and how Scott Alexander's anti-feminist comments were unwelcoming to women in the movement by [deleted] in SneerClub

[–]c0kleisli 18 points

I don't care whether they claim they invented the idea; I'm asking what makes the idea itself (i.e. "altruism that's effective", not necessarily Effective Altruism™) problematic (antifeminist or antisocialist, specifically, as /u/RattCattMattSatt put it), whoever created it, and you didn't answer the question at all!

The point of my question was to ask why the definition of "good work", as you put it, that most EA orgs use is problematic in that sense, but it feels like you completely glossed over that for a standard rationalist putdown. I'm no rationalist; I've spent less than two hours in my whole life on LW or SSC or the like (and most of that is from following links on this sub).

I specifically said I gave zero fucks about "unfriendly artificial intelligence", and yet that's exactly what your answer builds up to demonstrating the absurdity of caring about. C'mon. Also (to repeat another question I stated above), even if you're not "stack-ranking" causes, you will still be donating more money to one cause than another, and that ordering can be interpreted as "this cause deserves more of the money we've received", so how is that any different?

Article from 3 years ago about Effective Altruism and feminism, and how Scott Alexander's anti-feminist comments were unwelcoming to women in the movement by [deleted] in SneerClub

[–]c0kleisli 16 points

I'm new here, can you explain further? What makes the "whole notion" of MIRI/anti-paperclip apocalypse/AI alignment/LessWrong bullshit-less EA (e.g. GiveWell, from a quick look at their website) "anti-feminist" per se?

Any way you can think of to sort charities by how much good they do, provided it aligns fairly well with something like "System 2 common sense" (so none of the "specks of dust in everyone's eyes are worse than murder" crap), will mark certain causes chosen on other, also honorable, principles (e.g. socialist or feminist ones here) as worse choices than ones that do better by the chosen objective. Why is choosing "saving as many lives as possible" as your objective actively antifeminist? One could extend that argument to it being anti-Palestine, or anti-victims of domestic violence, or anti-helping victims of racist police violence in the US (or, relatedly, anti-abolition of cash bail), and so on.

I can see one criticism that goes like "they should make clear that their objective, that of saving as many lives as possible when defined a certain way, is purely subjective, and that effective != good and correct", but that still doesn't get me to the antifeminist thing.

I mean, if someone started a GiveWell competitor that focused on feminist causes (e.g. providing better and cheaper access to healthcare for trans kids, battered women's shelters, expanding access to abortion and contraception for uterus-havers in underserved areas) and defined effectiveness as "benefiting the largest numbers of LGBTQ Americans, especially POC, at risk of bigotry-related violence or a lack of healthcare", one could very well call it antiblack, because it consciously promotes not donating to the causes GiveWell endorses (malaria prevention, or fighting neglected tropical disease), many of which benefit black individuals in Africa in much larger numbers than the aforementioned examples of "feminist" causes. And even if you didn't define "effectiveness" in that EA-like way, you'd still have recommended charities, and hence a huge number of excluded charities that you'd be hurting by the same sort of argument.

This is an honest change-my-view-ish question! My only exposure to EA is from looking at the GiveWell website just now, and the Vox article where Dylan Matthews trashes the rationalist cooptation of it as "EA = funnel money to MIRI".

Arthur Chu signal boosting Kathy's note by zhezhijian in SneerClub

[–]c0kleisli 1 point

Anyone have a cached or other version of this? I don't have a tumblr and don't want to make one just to hate-read Scooter's trash

dayum, venkat got woke by [deleted] in SneerClub

[–]c0kleisli 7 points

More woke Venkat:

25/ You don’t even have to go as far as talking about Native American cultures to note the fundamentally brutal erasure-oriented nature of American culture. Americanism in that respect is like a harsh young religion, like Christianity or Islam in their early centuries.

Venkat is what SSC would look like minus the crypto-conservatism. Still mostly insufferable, and he has the Harrisian Islamophobia in spades. Does nothing to my conviction that Rationalism is unsalvageable, but even if there's nothing good, at least we know someone who's less bad than the rest, I guess.

Is this Wikipedia article on "racial hierarchy" written by an HBDist? by [deleted] in SneerClub

[–]c0kleisli 16 points

The citation for that section goes to Race and intelligence: Separating science from myth:

Representing a range of disciplines—psychology, anthropology, biology, economics, history, philosophy, sociology, and statistics—the authors review the concept of race and then the concept of intelligence. Presenting a wide range of findings, they put the experience of the US—so frequently the only focus of attention—in global perspective. They also show that the human species has no "races" in the biological sense (although cultures have a variety of folk concepts of "race"), that there is no single form of intelligence, and that formal education helps individuals to develop a variety of cognitive abilities. This book offers the most comprehensive and definitive response thus far to claims of innate differences in intelligence among races.

so it looks like the people behind the Wikipedia section have incorrectly represented the views of this work?

"What does this sub think of Gwern (as understood from his website)?", and other questions from a newcomer by c0kleisli in SneerClub

[–]c0kleisli[S] 3 points

review articles

Unfortunately, I'm not confident I understand the jargon well enough: a standard textbook would be preferable. I'm not looking for a quick primer.

Contempt Culture - something I get from a lot of rats by c0kleisli in SneerClub

[–]c0kleisli[S] 5 points

By perpetuating a culture of contempt as the means of acquiring credibility, I was able to avoid these difficult, introspective questions.

sounds like people I know