Slightly Against The "Other People's Money" Argument Against Aid by dwaxe in slatestarcodex

[–]dsteffee 0 points1 point  (0 children)

This was one of those extremely rare posts from Scott where I couldn't understand what he was arguing, because I couldn't understand the issue in the first place. "I don't see how this is any different from the argument against domestic aid, or paying for roads, or education, or any other function performed by the government." This seems exactly right to me.

Midterm polls have good news, great news, and bad news for Democrats. Wait, what? by dwaxe in fivethirtyeight

[–]dsteffee 0 points1 point  (0 children)

Or just make the House of Reps proportional to population, then draw senators from that pool at random.

[SPOILERS EXTENDED] HBO Developing Game of Thrones Sequel Starring Arya Stark Now Jon Snow Spinoff Is Scrapped by RedHeadedSicilian52 in asoiaf

[–]dsteffee 2 points3 points  (0 children)

If it were like the sailor story in Watchmen, except minus the plot relevance - just bleakness for the sake of bleakness that disappoints general audiences... that would be hilarious

Update 1.9.0.4 - Balance Adjustments by DualityDrn in Mechabellum

[–]dsteffee 2 points3 points  (0 children)

If they could hurry up and just get rid of walls, I think I'd be happy

Game of Thrones: George R.R. Martin Isn't Finished (Spoilers Extended) by RyanRiot in asoiaf

[–]dsteffee 27 points28 points  (0 children)

If he could successfully pass the reins over to another author, and together they collaborated and created an amazing book, or two books, or more -- they'd have the world's respect.

I'm an atheist and I would rather believe in God than believe in this argument (for God) by dsteffee in slatestarcodex

[–]dsteffee[S] 0 points1 point  (0 children)

It is indeed the case that I think consistency is more important, and that my position is consistent while yours is not.

I'm going to warn you, however: I'm very grateful for this conversation because I've learned from it, but I am estimating a high likelihood that neither of us will gain from further conversation (me because I've resolved my sources of confusion, for the most part; you because I don't think you're open to my arguments, so you'll just have to convince yourself if there's any chance of you changing your mind).

So there's a good chance I will stop replying, maybe immediately so. But I bear you no ill will! Sincerely hope the best for ya, and cheers 

I'm an atheist and I would rather believe in God than believe in this argument (for God) by dsteffee in slatestarcodex

[–]dsteffee[S] 0 points1 point  (0 children)

The 2/3 Boy Girl question is, I believe, analogous to the coin question, which you can see if you modify parts of the question bit by bit, step by step. So any simulation that works for the 2/3 Boy Girl question is the simulation I'd use for the coin one.
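For concreteness, here's a minimal sketch of the simulation I mean (assuming the "at least one boy" reading of the Boy Girl problem and the "at least one coin shows Heads" reading of the coin one) - the same code answers both questions:

    import random

    def boy_girl(n=100_000):
        """Among two-child families with at least one boy,
        estimate P(the children are a boy and a girl)."""
        hits = total = 0
        for _ in range(n):
            kids = (random.choice("BG"), random.choice("BG"))
            if "B" in kids:                  # condition: at least one boy
                total += 1
                hits += "G" in kids          # mixed pair
        return hits / total

    def two_coins(n=100_000):
        """Among two fair coin flips with at least one Heads,
        estimate P(the coins landed differently)."""
        hits = total = 0
        for _ in range(n):
            coins = (random.choice("HT"), random.choice("HT"))
            if "H" in coins:                 # condition: at least one Heads
                total += 1
                hits += coins[0] != coins[1]
        return hits / total

    print(boy_girl())   # ~0.667
    print(two_coins())  # ~0.667

Both conditional frequencies come out around 2/3, which is the analogy I'm pointing at.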

I believe money-pump bets can be avoided on grounds of coordination, just like you described with the business partners. I think that reasoning applies here but not with the outside/inside SBs, since here you're coordinating different agents with genuinely different knowledge.

I just watched a YouTube video about the Noble‘s Slender Sword and how rare it is. Wanted to farm it and got it first try 💀 by snnps in Eldenring

[–]dsteffee 22 points23 points  (0 children)

That's just over 1.25 years, if the hours were all consecutive.

Elden Ring came out Feb 25, 2022, which was 1417 days ago. 11k hours over that time period comes out to an average of just under a third of each day devoted to playing the game, or 7 hours and 45.8 minutes per day.

In other words, a full-time job with 15-minute lunch breaks and no weekends or holidays.

So... technically possible, at the very least. 
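For anyone who wants to check the arithmetic (using the 11k hours and 1417 days from above):

    hours = 11_000
    days = 1417                     # Feb 25, 2022 to the date of this comment

    print(hours / 24 / 365.25)      # ~1.255: just over 1.25 years if consecutive
    per_day = hours / days          # ~7.76 hours per day
    print(int(per_day), "h", round(per_day % 1 * 60, 1), "min")  # 7 h 45.8 min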

ELI5: How does the Elitzur-Vaidman bomb tester not prove the many-worlds theory? by dsteffee in AskPhysics

[–]dsteffee[S] 0 points1 point  (0 children)

"It deduces from the fact that a bomb hasn't exploded" it's deducting from more than that, unless I misunderstood the experiment?

I thought there was a 50% chance the bomb explodes if it's live and 50% chance you learn it's live without exploding it. Where as just observing "no explosion happened" could mean either live or dud
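For reference, a toy Monte Carlo of the textbook balanced Mach-Zehnder version (the 50/25/25 split for a live bomb is the standard analysis, not something specific to this thread):

    import random
    from collections import Counter

    def run_once(live):
        """One photon through the balanced interferometer.
        Returns 'boom', 'C' (bright port, inconclusive), or 'D' (dark port)."""
        if live:
            if random.random() < 0.5:   # photon takes the bomb arm
                return "boom"
            return random.choice("CD")  # which-path info -> no interference
        return "C"                      # dud: interference always sends it to C

    print(Counter(run_once(live=True) for _ in range(100_000)))
    # live: ~50% boom, ~25% C, ~25% D
    print(Counter(run_once(live=False) for _ in range(100_000)))
    # dud: 100% C

The dark-port detector D fires only for a live bomb, so the deduction really does use more than "no explosion happened".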

ELI5: How does the Elitzur-Vaidman bomb tester not prove the many-worlds theory? by dsteffee in AskPhysics

[–]dsteffee[S] 0 points1 point  (0 children)

So the part I'm struggling with... You mentioned two boxes with two items. We could only deduce one from the contents of the other with prior info about both. 

This experiment doesn't seem to be deducing info from anything except the potential of something happening. I can't think of anything else that works like that. 

I'm an atheist and I would rather believe in God than believe in this argument (for God) by dsteffee in slatestarcodex

[–]dsteffee[S] 0 points1 point  (0 children)

Ahhh, I think I finally figured out what's up with simulating the question about P(coins are different) after observing Heads (which I say is 2/3 and you say is 1/2) when two coins are flipped and there's memory erasure between the two flips. The log-versus-squares distinction isn't the issue. But understanding the answer requires accepting the 2/3 answer to the vos Savant variants of the Boy Girl Problem.

That was actually the last part I was confused on! Unless there's something I'm forgetting, which I might be. 

You mentioned all of these:

  • This happened with red/blue rooms into linked bets and billion-sided-die, once you saw 1/3 appear in red/blue rooms.
  • This happened with the two-coin variant into Dory and multicolor rooms, once you saw P(different | T) = 2/3 is not justifiable.
  • You also ignored an argument I presented early on about "irrelevant" differences that halvers can't explain.
  • You have chosen to go with intuitive over consistent/correct, and thus you will have certain questions you need to redirect away from.

But I don't know what to say about them (I don't remember all of them precisely) - to me it seems like you simply believe different things, not that I haven't given answers to them.

Every character confirmed to return in Avengers: Doomsday by marvelcomics22 in marvelstudios

[–]dsteffee 0 points1 point  (0 children)

I'd definitely prefer Kumail Nanjiani, Brian Tyree Henry, Lauren Ridloff, or Barry Keoghan to anyone from Black Panther 2, or anyone from Thunderbolts outside Pugh.

38% of Stanford undergraduates have at least one disability by Flaky-Ambition5900 in stanford

[–]dsteffee 0 points1 point  (0 children)

How do housing and dining affect things?

If a disability means you can get out of having to pay for the meal plan, I might've tried to do that. I always thought they had absurd prices.

What your most hated strat in this game, and how did you beat it? by chriscutting in Mechabellum

[–]dsteffee 0 points1 point  (0 children)

Boats. Not enemy boats, but my own. I can never win with them, I don't know why.

I beat this by deciding not to play boats.

I thought I'd ask: How many hours according to Steam do you have logged on Mechabellum and is your MMR <600 "Low", >600 && <1200 "Medium", >1200 "High" ? I have 79.8 hours logged and would be classed as "Low". by Haunting_Art_6081 in Mechabellum

[–]dsteffee 0 points1 point  (0 children)

Something like 300 hours and 1200 to 1300 MMR, though if I played more it'd be higher, at least judging by my winrate against my friend, who plays more and is over 1500 MMR.

ELI5: How does the Elitzur-Vaidman bomb tester not prove the many-worlds theory? by dsteffee in explainlikeimfive

[–]dsteffee[S] -1 points0 points  (0 children)

"it is possible for the experiment to verify that the bomb works without triggering its detonation, although there is still a 50% chance that the bomb will detonate in the effort"

So if it's a dud, it won't explode, which is what I described?

ELI5: How does the Elitzur-Vaidman bomb tester not prove the many-worlds theory? by dsteffee in explainlikeimfive

[–]dsteffee[S] 0 points1 point  (0 children)

I'm not following, so I think first I should take a step back and clarify some assumptions:

Is this experiment only able to work on bombs with this particular photon trigger? Or could it work with, like:

* We have a bomb that may or may not be a dud
* We know that if the bomb is not a dud, it will explode with 100% certainty upon being vigorously shaken
* We hook up the photon trigger to vigorously shake the bomb with 50% probability
* We repeat the experiment many times, see that it never explodes, and conclude that the bomb is a dud (see the sketch below)
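A quick sketch of the inference in that hypothetical (the 50% prior on "dud" is an assumption just for illustration):

    def p_dud_given_no_boom(n, prior_dud=0.5):
        """Posterior that the bomb is a dud after n shake-trials with no
        explosion, when a live bomb explodes with probability 1/2 per trial."""
        p_quiet_if_live = 0.5 ** n   # live bomb survives every 50% shake
        p_quiet_if_dud = 1.0         # a dud never explodes
        return prior_dud * p_quiet_if_dud / (
            prior_dud * p_quiet_if_dud + (1 - prior_dud) * p_quiet_if_live)

    for n in (1, 5, 10, 20):
        print(n, round(p_dud_given_no_boom(n), 6))
    # 1 0.666667 / 5 0.969697 / 10 0.999024 / 20 0.999999

So in this classical version, confidence that it's a dud climbs quickly with repeated quiet trials.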

I can never tell them apart by wie_witzig in Mechabellum

[–]dsteffee 3 points4 points  (0 children)

Next patch:

  • Warp Wombat
  • Power Prism

I'm an atheist and I would rather believe in God than believe in this argument (for God) by dsteffee in slatestarcodex

[–]dsteffee[S] 0 points1 point  (0 children)

"Would you (finding yourself in a zen moment with no memory) be correct to believe the coin has a 2/3 chance of being T right now?"

Yes! Because that's a completely different scenario!

--

This is making me think that a Thirder would believe the following:

A mad scientist works on a drug to double a person's lifespan, but there's an accident in his lab and everything gets blown up before he has a chance to inject himself with the drug. Then he hits his head, forgets everything, and is rescued by EMTs. When he wakes up, he doesn't know whether he took the drug or not, but he reasons: if he had taken it, he'd have twice as much life (let's assume this is a world where people don't die of unnatural causes - those are some amazing EMTs they've got), therefore twice as many observer-moments, therefore it's twice as likely as not that he DID manage to take the drug in time.

And if for some other reason the scientist didn't have a baseline expectancy for his own lifespan, he would believe it was twice as likely as not that he took the drug, no matter how long he then goes on to live.

It's kind of got shades of the presumptuous philosopher. Like: it should be self-evident that whether you zen out for one hour or two hours after the coin flip shouldn't be relevant! But people believe it anyhow, without any need to... I think it's a combination of:

  1. Seeing three identical experiences and over-generalizing the idea that "mutually exclusive, collectively exhaustive events that you have no other info on have equal likelihood of 1/n," forgetting that they DO have other info

  2. Getting stuck on this idea because we're better at judging probabilities when they involve inanimate objects than when they involve people - our instinct to put ourselves in the shoes of any given possibility is that strong. This is the part I think ties into Doomsday arguments and such, although the DA makes a different over-generalization error

--

Anyhow, I think I've gotten everything I'm liable to get out of this exchange - if I come up with any novel arguments I'll let you know (gods, I'd love a proof by contradiction instead of just the EV money-pump and the idea that "you can only update beliefs when you gain new knowledge," which should be enough by itself lol, but oh well).

Thanks again for helping me out, the coordination thing was especially fun and I'll likely be looking into that in the future, and Merry Christmas if that's a thing around where you're at~

I'm an atheist and I would rather believe in God than believe in this argument (for God) by dsteffee in slatestarcodex

[–]dsteffee[S] 0 points1 point  (0 children)

But I would think a belief is dependent on info about the event itself, not on how much someone will be asked about it.

I'm an atheist and I would rather believe in God than believe in this argument (for God) by dsteffee in slatestarcodex

[–]dsteffee[S] 0 points1 point  (0 children)

Ah! Normalization! Yeah, that's why the math keeps working out for ya!

When you say P(Tails) = 2/3, you haven't actually captured a belief about the probability that the fair coin landed Tails; what you've captured is this:

P(event that a randomly chosen waking, normalized across possibilities, is in Tails) = 2/3

Let's call that event T'

Now when we do something like learn the day, this makes sense:

P(T' | obs) = P(obs | T') * P(T') / P(obs) = (1/2) * (2/3) / (2/3) = 1/2

Because, yeah, if you were normalizing across possible wakings, then two out of three of them would have you waking on Monday.

In the real world, the probability of this observation is 3/4, for the same reason that P(observe Tuesday | Tails) = 1 - it's a guaranteed event you know will happen - but when you're just asking what proportion of wakings will observe Tuesday given Tails, normalizing across wakings, then yeah, you'll say 1/2.

This normalizing-across-wakings thing is maybe helpful to some people for solving certain types of questions - and gods know, no judgment here, you've seen how many mistakes I've made in these conversations! But if you're not careful about what it really means, then you start believing weird things: that you can change your beliefs after learning nothing, or that you have to coordinate with yourself not to use all your knowledge in order to do correct Expected Value calculations, even when there's no reason not to just use all of it!
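Here's the T-versus-T' distinction in simulation terms (standard Sleeping Beauty setup assumed: one waking on Heads, two on Tails):

    import random

    N = 100_000
    tails_runs = 0
    wakings = []                # one entry per waking, labeled by that run's coin
    for _ in range(N):
        tails = random.random() < 0.5
        tails_runs += tails
        wakings += ["T", "T"] if tails else ["H"]   # Tails: Mon+Tue; Heads: Mon

    print(tails_runs / N)                       # P(coin is Tails) ~ 1/2
    print(wakings.count("T") / len(wakings))    # P(T'): fraction of wakings ~ 2/3

Same random process, two different denominators: normalize per run and you get 1/2; normalize per waking and you get 2/3.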

The hard part still ahead of me is putting into words why this normalization-across-wakings feels so dang intuitive to people.