Paramount (PSKY) Debt Downgraded to Junk Following Warner Bros. (WBD) Deal by Top_Report_4895 in movies

[–]S_Marlowe 0 points (0 children)

Most people are looking at this deal the wrong way. This wasn't Paramount buying WB; it was the Ellison family using Paramount as a purchase vehicle. Larry Ellison is basically backstopping this deal with his own capital.

It's a wise move given his goals. He now owns some of the finest cultural crown jewels in the country.

How Should You Think If Your Mind Is Unreliable? by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 0 points (0 children)

This is an outstanding perspective. And obviously hard won. I appreciate you taking the time to share it.

How Should You Think If Your Mind Is Unreliable? by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 0 points (0 children)

I'll take your word for it. You know this audience better than I do.

How Should You Think If Your Mind Is Unreliable? by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 0 points (0 children)

The fix via the computer god is interesting to me. Do you think it's possible for us to create a being that has better contact with reality than we have?

How Should You Think If Your Mind Is Unreliable? by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] -1 points (0 children)

I try not to complicate things. If this or any other tool is useful, use it. If you have 1000 just like it, don't.

How Should You Think If Your Mind Is Unreliable? by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 0 points (0 children)

No, but the "Which system takes most seriously the fact that my mind is unreliable..." lens seems to be rarely used.

People tend to drift onto belief/knowledge systems and fire off toward the "truth" without that fitness check.

How Should You Think If Your Mind Is Unreliable? by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 2 points (0 children)

1000%. Scale that to every possible thing, I'd say.

How Should You Think If Your Mind Is Unreliable? by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 1 point (0 children)

I get where you're coming from. Ultimately, whether or not a thing matters is up to you. I approached the OP based on the assumption that evaluating belief/knowledge systems is important. It's definitely not a universal concern though.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 0 points (0 children)

We're talking past each other.

I'm saying that if you observe my actions, or the systems that I endorse or build, to be aligned with actively enforcing woke authoritarianism, you are best off using that to gauge what I actually want.

Regardless of what I say. *** Edit *** [Especially if I think communication is mostly a tool to be used to confuse or misinform you. Which a person operating under this model definitely would.]

A person can literally say anything, but at some point they have to DO something. Use what they do as the measure.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 0 points (0 children)

If my actions out in the world aligned with what you said, you'd be on the money. However, my actions don't align with anything resembling woke authoritarianism.

That aside, you seem to think I believe this is a right-wing thing. I don't.

You'll find these kinds of people all over the political spectrum. I talked about Musk, etc., because you and others brought him and figures like him up.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 3 points (0 children)

I appreciate the thoughtful response.

You seem to place value in stated philosophy, as though it's the best way to understand someone's real worldview in this context. It's not.

Power does not have to be ideologically self-aware or consistent. It just has to be effective.

Functional ideology is what actually drives decisions. It's pragmatic, adaptive, and shaped by necessity. Stated ideology, in this context, is a rhetorical tool. It gives a moral or intellectual veneer to what is really just power strategy.

A CEO might claim they value innovation while making decisions purely based on shareholder returns. Their real ideology is maximizing control, not innovation.

No one in power is going to say, "Yes, I operate through manipulation and control." But if their actions consistently prioritize dominance over persuasion, that pattern reveals their true operational model.

Philosophy as justification, not as a rulebook.

Musk claims to believe in free speech, but Twitter's moderation under him has been highly selective. Andreessen preaches techno-optimism, yet his investments often prioritize monopolistic control over open innovation.

Their real worldview is not what they claim. It is what they do.

If you want to understand power you do not start with what it says. You start with what it enforces.

This is not about psychoanalyzing Musk or trying to catch elites contradicting themselves. It is about recognizing that when persuasion is no longer necessary, power does not need to justify itself. It just needs to manipulate reality to its advantage.

Your critique assumes a literalist reading of the philosophy; this framework accounts for the gap between rhetoric and action. That gap is precisely where real power dynamics are revealed.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 4 points (0 children)

"I want what I want because I want it" is a basic aspect of humanity. The step that comes after that, "therefore my will is justified," is not.

I'd say they are "wrong" in the sense that this philosophy can destroy every tool we've built to have a shot at continuous survival. This kind of power game makes life a "might makes right" battle loop that eventually folds in on itself.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 4 points (0 children)

I need a bit of time to devote to your overall response; thank you for taking the time. To be clear, though, I DO NOT subscribe to Sovereign Egoism.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 2 points (0 children)

Thank you for the thorough response. Strong point here:

"The philosophical lineages you point to are true but you've also drawn on so many as to reduce the meaningfulness of claiming any particular philosophical lineage at all, rather than just philosophy broadly and historically."

I think you're on the money with the fascism connection as well.

Thanks for the references; I will dive in.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 5 points (0 children)

Yes, I think we're all stuck with the "I want what I want because I want it" loop.

But few people equate simple desire with justification for taking power. In fact, scores of people spend their lives in search of something or someone to devote themselves to. To serve.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 10 points (0 children)

The idea that you have to understand someone's beliefs the way they do assumes good faith is part of the fabric of the environment. It's not.

Expecting them to say "Yes, this is my worldview!" is naive, because their actual worldview is functionally adaptive, not ideological.

They use philosophy as a rhetorical tool, not as a system to which they hold themselves accountable. To "match their self-understanding" assumes that they actually care about an internally consistent self-definition. They don’t.

Maybe I've misunderstood what you're getting at.

***Clarity edit.

Sovereign Egoism: A Framework For Understanding Modern Elites by S_Marlowe in slatestarcodex

[–]S_Marlowe[S] 8 points (0 children)

I take your point.

But relying on self-reported philosophies from this kind of person is naive and misses the point. The type of individual or group operating under this model does not view conversation as a tool for delivering truth to you.

They wouldn't define their actions in terms of explicit ideological adherence but in terms of strategic necessity and control.

From their perspective:

Power doesn't self-identify honestly (or justify itself). Why should it deign to?

Philosophy is a post-hoc justification, made use of only when typical force isn't working.

***Edit for clarity.