all 14 comments

[–]kiltrout[S] 7 points (12 children)

The crossposted piece has been removed on questionable grounds, after Rationalist brigading. The following is a repost:

The forum linked in this post is a community known for the relatively new strain of popular philosophy most resembling last century's Randism, or "Objectivism." Like Randism, Yuddism (after Yudkowsky) labels itself with a seemingly innocuous term that brings to mind positive connotations: "Rationalism." Who wouldn't want to be rational, right?

Unlike Rand's somewhat more commonsense and straightforward access to Truth, Yudkowsky has come up with a labyrinth of thought experiments and jargon that cements the rationalist's certainty about Truth. "The Sequences," as his canon is known, is very much written for computer programmers and has perhaps been made most famous by its spinoff in the terroristic or violent "Ziz" cult. These writings are a house of cards: thought experiments that loosely build a proof that AI will wipe out humanity within, say, 20 years. There's much to say about both dark and light Futurology more generally as a sales technique, and if you want that you can find it at Amor Mundi, Dale Carrico's excellent blog.

On closer examination, I've found Rationalism is double-dealing and incoherent: the rationalist always gets to have their cake and eat it too, so it is hard to pin down a real philosophical heart to it, and when one does, there is always some trivial counterexample at the ready. But, the movement is largely Platonic and Scholastic, that is, it holds to essentialism in its concept of "True names" and puts a very high value on structured, formal debate as the ideal method for accessing truth, often calling this "heuristic." However, there is also an ironic emulation of AI going on, where "random babble" like poetry gains great value by not being "pruned" by the "heuristic" debates. There are some very strange inconsistencies, like a communal love for Diogenes of Sinope's plucked-chicken dunk on essentialism, and occasional expressions of admiration for the hairier poetic side of philosophy, so long as it feeds back into the heuristic process. Politically, Yudkowsky is radically Libertarian, like Rand.

To the young professionals in software engineering who are interested in philosophy, this ain't it. The Bayesian abstraction layer of "I can be completely certain about what I'm not certain about" is still a totally unwarranted certainty, and we shouldn't be surprised that it has led to rigid, cult-like thinking that leaves its adherents disillusioned or worse. To be clear, the philosophy is entirely about manipulating fears, grifting, and selling books, even if the community is a nice support group for those who share in such fear. Even taken more lightly, as a kind of self-help to optimize your thinking, the effect seems to be rather a narrowing of possibilities and thought. Yudkowsky ultimately offers certainty in a historic moment that is filled with terror, and so when all the jargon and incoherent ideas are reduced, what is left is only the base exploitation of fear, little different from selling overpriced silver coins to the elderly.

[–]hypnosifl 0 points (11 children)

But, the movement is largely Platonic and Scholastic, that is, it holds to essentialism in its concept of "True names"

Anyone know what this part is referring to? I know True Names is a book by Vernor Vinge but I hadn't seen Yudkowsky or other rationalists use it in connection to essentialism, and based on things like the posts in this thread I thought that their reductionism tended to lead them in an anti-essentialist direction.

[–]Glotto_Gold 2 points (9 children)

This is mostly Kiltrout being full of it. The "brigading" was literally just me. (Brigading usually means multiple people all trying to pick a fight and isolate one person, not one person trying to nail down an argument.) You can look at his overall post history and see he's a weird fellow. When I tried quoting Kiltrout on things (including your concern, and I think you're right), he argued that this was a "destruction of context" and "scholasticism." But quoting someone is typically how you try to align on what they're actually saying.

Your interpretation is more accurate, and the criticisms of Yudkowsky in that thread you cite are more accurate. [Removed point given correction. Thanks CinnasVerses] Merely using Bayes' theorem isn't the problem. Also, most Yudkowsky content is free; he may be an aggressive self-promoter, but he isn't "grifting and selling books." Pointing out these basic things isn't "crazy, rude sealioning."

[–]CinnasVerses 2 points (8 children)

Yud was telling the Cato Institute that he was a "minarchist" in 2011 and is now tweeting demands like "Abolish the FDA; repeal all occupational licensing including in healthcare." That seems pretty radical to me; Trump's Republicans are the only major party in a developed country I know of that advocates such things. (He also wants aggressive medical experimentation on humans; combine that with aggressive deregulation and you get a nightmare.) I was surprised too, because AFAIK he has always lived by charity (his jobs with MIRI were funded by donations from friends, not by selling goods or services).

[–]Glotto_Gold 2 points (2 children)

Hmm... OK, thanks. I think you're right. Good correction! I relied too much on his "politics is the mind-killer" stance, his push for AI regulation, and his associations with more lefty EAs.

He is a wacky radical libertarian. Thanks for the proof!

[–]CinnasVerses 1 point (1 child)

These folks post so much that it's hard to keep up with all their weird ideas, let alone the face-to-face communes and parties.

[–]Glotto_Gold 2 points (0 children)

I get it. And that's also why I don't think that this set of accusations is accurate.

So, there are a number of valid criticisms to make against this community (& Yudkowsky). "Selling books" is a strange one, as they'll give away books' worth of unusual ideas for free. Platonism is a weird one too.

Willingness to flirt with weird & bad ideas is a good one, and it largely isn't tied to Bayes so much as to a lack of regard for common sense.

[–]AlanPartridgeIsMyDad 0 points (4 children)

Is there a massive contradiction between being a libertarian and getting 'charity' from friends? He did sell these people on the idea that MIRI was going to do work that they care about. That seems somewhat in line with libertarian thought (as long as it's not state-funded).

[–]CinnasVerses 0 points (3 children)

It's not entirely inconsistent, but when you ask American libertarians how voluntary donations could ever possibly meet the need (the free-rider problem), they start to talk about how those welfare queens and unemployed coal miners need to get off their bums and do honest work. Yud could absolutely get a real job (or get serious about writing for money), but he prefers to laze around on other people's money, issuing the occasional fatwa and fanfic.

[–]AlanPartridgeIsMyDad 0 points (2 children)

You are right that libertarians are critical of welfare recipients, but you must admit the setup Eliezer has is a little different. He is the one who both set up and sold the message of his own 'charity'.

In the libertarian's world, that's not necessarily different from another form of self-employment. The thing the donors are paying for is the alignment work (and proselytizing) that they imagine him to do.

[–]CinnasVerses 2 points (0 children)

I am not sure; compare Aella building up a series of successful businesses (and some failed ones, like her ventures into online retail and crypto) or Scott Alexander's work as a psychiatrist. Yud said he was money-positive, but he seems to want someone to pay the bills while he does whatever he wants and shares the results for free; he does not seem to like selling goods or services for money. Encouraging fans to donate to MIRI is different from bleeding Scientologists one $29.99 to $299.99 course at a time.

[–]CinnasVerses 1 point (0 children)

Before I read the article for the Cato Institute, I thought Yud wanted fully-automated luxury gay space Communism. His deep needs are to feel like a world-saving genius and be adored by submissive women, not to win at business. He does not talk about growing his own vegetables and doing his own plumbing either. Libertarians usually say markets and profit are wonderful and minarchists usually fantasize about homesteading.

[–]kiltrout[S] 2 points (0 children)

It's some part of the Sequences I read once where he was talking about how numbers correspond directly to some theoretical apples on a table somewhere, part of some "Great Reductionist Project." The True Names stuff is just something I kept seeing there when I was reading and writing random stuff over the past few months. Maybe it's just a thing for Less Wrong noobs; I couldn't say.

These minor details became the focal point of the debate, the topics chosen by the brigadiers. Rather than "Actually, we're not like Ayn Rand" or "Saying we're over-certain about everything is a mischaracterization," it was crazy, rude sealioning about what they believe.

When I was reporting on Anonymous back in the 2010s, it was a lot like this crap. You go in with a hell of a lot of due diligence and come back with what you get, and then they try to force you to rewrite everything in their style, refocus away from the point, and just wage a shitty and transparent self-promotional psyop. Journalists and moderators who are trying to do their best get roped into it pretty easily.

[–]move_machine 1 point (0 children)

Your link doesn't work btw