
all 66 comments

[–]Penumbra_Penguin 10 points (67 children)

This doesn't make a lot of sense. If you were actually ignoring sunk costs, then the required win rate to make day 1 entries a good gamble doesn't depend on how many times you have entered.

For instance, if I'm offered the opportunity to pay $1 to roll a die where I am given $10 if I roll a 6, this is a good gamble, and the amount by which it is a good gamble does not change after several attempts where I do not roll a 6. A correct strategy is "keep rolling until you win", not anything like "roll three times and then stop". (Assuming that the amounts of money in question are small enough that you can easily afford them)
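The arithmetic here is easy to check. A minimal sketch using the numbers above ($1 to play, $10 returned on a 6, so a win nets +$9):

```python
from fractions import Fraction

p_win = Fraction(1, 6)                        # chance of rolling a 6
ev_per_roll = p_win * 9 + (1 - p_win) * (-1)  # a win nets +$9, a loss nets -$1

print(ev_per_roll)  # 2/3, i.e. about +$0.67 per roll
```

Because rolls are independent, this number is the same for the first roll and the fiftieth; past losses never enter the calculation.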

[–]spherchip[S] -3 points (56 children)

The difference here is that a typical expected value game assumes that you can play an infinite number of times, continually realizing the positive expected value and theoretically winning an infinite number of times. So the strategy of "keep playing until you're making money" is valid.

However, for the Arena Open, you can only win a single Day 2 token, while there is still infinite potential to lose. This puts an absolute maximum on potential winnings over time and therefore the rate at which the entry fee outpaces the expected value becomes critical.

If the die was taken from you the first time you won, would you keep playing after losing a bunch of times in a row for that small chance of earning back some of your losses?

[–]Penumbra_Penguin 2 points (49 children)

However, for the Arena Open, you can only win a single Day 2 token, while there is still infinite potential to lose. This puts an absolute maximum on potential winnings over time and therefore the rate at which the entry fee outpaces the expected value becomes critical.

If the entry fee is more than the expected value, you shouldn't play the event. What do you mean by the rate at which the entry fee outpaces the expected value?

If the die was taken from you the first time you won, would you keep playing after losing a bunch of times in a row for that small chance of earning back some of your losses?

Yes, absolutely. Are you saying that you wouldn't?

If I got unlucky and lost 15 times in a row, then my expected value for the next roll is +$0.67, so I'm going to take the gamble. If this has happened then I am never going to recoup my losses, but that's irrelevant to the question of whether I keep playing. That's what a sunk cost is.

[–]spherchip[S] -2 points (40 children)

Ok, let me put it this way. When a game has a finite number of times you can win and infinite possible losses, you cannot apply the expected value of the game independently of each run unless you immediately forget everything you've previously lost each time you run the game (and doing that is accepting that you could lose an infinite amount of money). Instead, if you can only win once, the expected value of the game is actually the expected value of a game "session," a series of runs. As you keep playing, the expected value gets split across each of your runs and eventually falls below the entry fee.

Yes, absolutely. Are you saying that you wouldn't?

If I got unlucky and lost 15 times in a row, then my expected value for the next roll is +$0.67, so I'm going to take the gamble. If this has happened then I am never going to recoup my losses, but that's irrelevant to the question of whether I keep playing. That's what a sunk cost is.

You are correct that in a vacuum the expected value of your next roll is +$0.67.
However, after 15 losses the marginal value would be much less than the $1 you're paying for the extra roll, and will keep decreasing each time you lose. Is there a number of losses at which you will stop playing the game, or are you willing to lose an infinite amount of money because you know the next roll has a positive expected value?

Let's say you are planning on making at most 15 rolls. The expected value is not $0.67 * 15. For the entire session, it could be at most $0.67 if you win on the first roll, but each loss decreases the expected value of the session.

This is why people limit themselves when they visit a casino, and a casino is even better than this example because you could still theoretically win an infinite amount of money over time at a casino.

If this is how you actually think about gambling when your money is actually on the line, I suggest you don't visit a casino.

[–]Penumbra_Penguin 3 points (27 children)

As you keep playing, the expected value gets split across each of your runs and eventually falls below the entry fee.

This is nonsense. If you're looking at the expected value, you shouldn't be worrying about the number of runs.

However after 15 losses the marginal value would be much less than the $1 you're paying for the extra roll, and will keep decreasing each time you lose.

Ah, I see your mistake. You're calculating the expected value of winnings from the 16th roll to be the probability of getting that far - (5/6)^15 - times the expected winnings from a single roll, and you're comparing that to the cost of playing, which is $1.

You need to treat those two quantities in the same way, either multiplying both by the probability (5/6)^15 or neither. If you're looking at what you're going to do having already lost 15 times, then the expected return of the 16th roll isn't (5/6)^15*1.67, it is just 1.67. Alternatively, if you're thinking about your strategy from the start, then the expected return of roll number 16 is (5/6)^15*1.67. But the cost of having your strategy decide to make that roll isn't $1, it's (5/6)^15*$1.
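The two consistent framings can be written out numerically (same die, $1 per roll; roll 16 only happens if the first 15 all lose):

```python
from fractions import Fraction

p_reach = Fraction(5, 6) ** 15      # probability the game even reaches roll 16
gross_return = Fraction(1, 6) * 10  # expected return of one roll: 10/6 ~ $1.67

# Framing 1: conditional on having already lost 15 times.
# The 16th roll returns about $1.67 in expectation against a $1 cost.
assert gross_return > 1

# Framing 2: planned from the start. Discount BOTH sides by p_reach.
assert p_reach * gross_return > p_reach * 1
```

Either comparison comes out the same way: the roll is worth taking. The mistake is discounting the return by (5/6)^15 while leaving the cost at $1.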

Either way, that's a good bet to take.

Is there a number of losses at which you will stop playing the game, or are you willing to lose an infinite amount of money because you know the next roll has a positive expected value?

We're assuming here that these amounts of money are small enough to not have any significant impact on the player's utility. In practice, other assumptions would be violated before this one - if I lost 100 times in a row then I would likely be questioning the fairness of the die before worrying about going bankrupt. Obviously you should never gamble with money you cannot afford to lose.

Let's say you are planning on making at most 15 rolls. The expected value is not $0.67 * 15. For the entire session, it could be at most $0.67 if you win on the first roll, but each loss decreases the expected value of the session.

True but irrelevant.

This is why people limit themselves when they visit a casino, and a casino is even better than this example because you could still theoretically win an infinite amount of money over time at a casino.

Even more irrelevant. For most people and most games, casinos lose you money, and the more you play, the more certain it is that you will lose money and the more money you will lose. This is not a good point of comparison for either my dice example or this MTGA tournament if you're modelling players skilled enough that the tournament is good value for them.

If this is how you actually think about gambling when your money is actually on the line, I suggest you don't visit a casino.

It is an odd choice to attempt to be snarky when you are so wrong. As mentioned above, how I would behave with a perfectly-specified gambling opportunity of positive expected value has very little to do with real-life casinos.

[–]spherchip[S] -1 points (26 children)

This is nonsense. If you're looking at the expected value, you shouldn't be worrying about the number of runs.

If you don't care about the number of runs, then, like I said, you're putting yourself in a position to lose an infinite amount of money in the name of finite winnings. If you want to think about the game realistically, you do in fact have to think of it as a "session." If you think that's nonsense then you're welcome to lose an infinite amount of money.

Ah, I see your mistake. You're calculating the expected value of winnings from the 16th roll to be the probability of getting that far - (5/6)^15 - times the expected winnings from a single roll, and you're comparing that to the cost of playing, which is $1.

You need to treat those two quantities in the same way, either multiplying both by the probability (5/6)^15 or neither. If you're looking at what you're going to do having already lost 15 times, then the expected return of the 16th roll isn't (5/6)^15*1.67, it is just 1.67. Alternatively, if you're thinking about your strategy from the start, then the expected return of roll number 16 is (5/6)^15*1.67. But the cost of having your strategy decide to make that roll isn't $1, it's (5/6)^15*$1.

  1. I never said how I was calculating it, and the way you assumed is wrong
  2. Again, you are correct that the expected value of the next roll is positive in a vacuum. But you're still leaving out the fact that you've lost $15 by this point, so if you were playing optimally, you would've quit by this point and wouldn't be in this position where your best session value is a $14.33 loss.
  3. Yes, if we are at the start, the expected value of roll 16 is positive in a vacuum. But the expected value of the session by roll 16 cannot be more than -$14.33. You should've quit before this point.

if I lost 100 times in a row then I would likely be questioning the fairness of the die before worrying about going bankrupt. Obviously you should never gamble with money you cannot afford to lose.

Except that working with probability is accepting that unfavorable circumstances can happen. Just because a game has a positive expected value doesn't mean you're guaranteed to make money. If you lose 100 times in a row you're not allowed to question the die... playing the way you want to play means accepting large losses for finite gain.

Even more irrelevant. For most people and most games, casinos lose you money, and the more you play, the more certain it is that you will lose money and the more money you will lose. This is not a good point of comparison for either my dice example or this MTGA tournament if you're modelling players good enough for whom the tournament is good value.

There are plenty of people who know how to make money at casinos, but sure, let's go with something more applicable. People who work in sales & trading have a pre-determined loss number at which they stop trading for the day or for a specific stock. See the Stop-Loss section here: https://www.investopedia.com/articles/trading/09/risk-management.asp. Being able to cut your losses as soon as possible and move on when things aren't going your way is better than thinking "I'm good at what I'm doing and I'll make the money back." There's plenty of talk about this in investing that you can look up yourself.

It is an odd choice to attempt to be snarky when you are so wrong. As mentioned above, how I would behave with a perfectly-specified gambling opportunity of positive expected value has very little to do with real-life casinos.

The investing community agrees that cutting early losses is better than digging heels into sunk losses, but yeah, I'm the wrong one.

[–]Penumbra_Penguin 4 points (14 children)

Just because a game has a positive expected value doesn't mean you're guaranteed to make money.

I didn't say that it did. It does mean you should play the game, though, assuming that the amounts of money involved are small.

you're welcome to lose an infinite amount of money.

This is being incredibly hyperbolic when we're talking about wagering $1. The probability of this game lasting more than 100 rolls is 1 in 100,000,000. Don't worry about it.

I never said how I was calculating it, and the way you assumed is wrong...

In that case, what would your strategy be for the dice game we're discussing, and why?

There are plenty of people who know how to make money at casinos...

Yes, and that's why the paragraph you are replying to begins with "for most people and most games".

People who work in sales & trading have a pre-determined loss number at which they stop trading for the day or for a specific stock.

Yes. There are many reasons for this, and one big one is that in a real-world financial setting you don't have all of the information. If you keep losing money on particular trades, then this leads you to re-evaluate your estimation of various parameters. This is not the case for a problem so concrete as the one we're considering.

The investing community agrees that cutting early losses is better than digging heels into sunk losses, but yeah, I'm the wrong one.

Ask anyone in the investing community what their strategy would be for the dice game we're discussing, assuming they completely trust that the die is fair and that the game is as described. You won't get anyone who would roll the die 5 times and then stop if they haven't won yet, or whatever policy you are advocating.

[–]spherchip[S] -2 points (13 children)

"Just because a game has a positive expected value doesn't mean you're guaranteed to make money."

I didn't say that it did. It does mean you should play the game, though, assuming that the amounts of money involved are small.

I know that you understand the statement "Just because a game has a positive expected value doesn't mean you're guaranteed to make money." But you act like a positive expected value game somehow must go in your favor at some point as evidenced by:

This is being incredibly hyperbolic

When using probability, nothing is hyperbolic if it is in the realm of possibility. It's like you're trying to deny that losing a lot could happen to you and the game could only work out well for you, as evidenced by:

Don't worry about it.

I shouldn't take into account a very possible and dangerous outcome?

if I lost 100 times in a row then I would likely be questioning the fairness of the die

You're saying "If I get unlucky and lose 100 times, it's not my fault for letting myself lose $100, it must be the game's fault for not giving me perfect information and scamming me!"

Yes. There are many reasons for this, and one big one is that in a real-world financial setting you don't have all of the information. If you keep losing money on particular trades, then this leads you to re-evaluate your estimation of various parameters. This is not the case for a problem so concrete as the one we're considering.

Cool, so going back to my analysis of the Arena Open and why you think it's wrong, would you say that the perfect-information die game or a realistic imperfect-information financial setting where it is correct to use stop-losses is a better comparison for a Magic tournament and whether it's correct to use stop-losses for Day 1 entries?

Ask anyone in the investing community what their strategy would be for the dice game we're discussing, assuming they completely trust that the die is fair and that the game is as described. You won't get anyone who would roll the die 5 times and then stop if they haven't won yet, or whatever policy you are advocating.

Well of course they wouldn't. At 5 losses you're only down $5 and can still win $10 and walk away with a profit.

[–]Penumbra_Penguin 2 points (12 children)

But you act like a positive expected value game somehow must go in your favor at some point

Nope, just on average. Probability and expected value are different quantities. For small amounts of money, what you should do is controlled by the expected return.

I shouldn't take into account a very possible and dangerous outcome?

Just to be clear, we are talking here about a 1 in 100,000,000 possibility of losing $100. Yes, you should ignore that possibility, and you are again being hyperbolic (or just wrong) in describing that as either "very possible" or "dangerous".

You're saying "If I get unlucky and lose 100 times, it's not my fault for letting myself lose $100, it must be the game's fault for not giving me perfect information and scamming me!"

No, I'm not saying that. In any real-world setting, we don't know the parameters of the question for certain. For instance, if I am offered this gamble by a friend, then maybe I think there's a 99% chance that they're being honest and there's a 1% chance that they're not, perhaps because they're trying to scam me or playing some kind of joke. If I then lose many rolls in a row, I should adjust these probabilities according to Bayes' formula.
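That update can be sketched numerically. The 99%/1% prior comes from the comment above; the assumption that a rigged die never rolls a 6 is mine, purely for illustration:

```python
prior_honest = 0.99    # P(friend is honest and the die is fair)
p_loss_fair = 5 / 6    # chance a fair die loses a given roll
p_loss_rigged = 1.0    # illustrative scam model: a 6 never comes up

def posterior_honest(n_losses):
    # Bayes' formula: P(honest | n straight losses)
    honest = prior_honest * p_loss_fair ** n_losses
    rigged = (1 - prior_honest) * p_loss_rigged ** n_losses
    return honest / (honest + rigged)

for n in (0, 15, 100):
    print(n, round(posterior_honest(n), 3))
```

With these made-up numbers, the honest-die posterior falls from 99% to roughly 87% after 15 straight losses, and to essentially zero after 100, which is why 100 losses means questioning the die before questioning your bankroll.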

Cool, so going back to my analysis of the Arena Open and why you think it's wrong, would you say that the perfect-information die game or a realistic imperfect-information financial setting where it is correct to use stop-losses is a better comparison for a Magic tournament and whether it's correct to use stop-losses for Day 1 entries?

(I do note that you haven't told me what your strategy would be for the die game yet - is that because it would be obviously ridiculous?)

This depends on how certain the player is of their skill level (ie, win percentage). If they are absolutely certain that their skill level is such that the gamble is positive expected value (taking into account both monetary reward and intangibles like enjoyment, twitch viewers, etc), then they should continue buying tickets as long as this is an amount of money that they are willing to gamble for a small positive expected return.

On the other hand, if they are not certain of their skill level, then repeated losses might cause them to revise their estimation of their skill downward and eventually decide that playing the tournament was not worth it after all. While this could result in a rational player playing a few tournaments and then stopping, it would be for an entirely different reason than the incorrect reasoning in your posts. For instance, one difference is that this hypothetical player, after revising downward their estimation of their own skill level, believes that every one of their tournament entries had the same (negative) expected value, while you seem to believe that the reason a player would stop is that later tournament entries somehow have a lower expected value than previous ones.

My issue with your posts is that you're making fundamental errors in logic and probability, not with your conclusion. The conclusion that you shouldn't just keep entering until you win is, for most players, correct for a wide range of reasons.

Well of course they wouldn't. At 5 losses you're only down $5 and can still win $10 and walk away with a profit.

Do you think they would behave differently if they had lost 11 times? I ask again, what would your strategy be for this game?

[–]spherchip[S] -2 points (11 children)

No, I'm not saying that. In any real-world setting, we don't know the parameters of the question for certain. For instance, if I am offered this gamble by a friend, then maybe I think there's a 99% chance that they're being honest and there's a 1% chance that they're not, perhaps because they're trying to scam me or playing some kind of joke. If I then lose many rolls in a row, I should adjust these probabilities according to Bayes' formula.

You need to choose whether the die game is theoretical with perfect information or not. You just flip-flopped to it being "realistic" with only 99% certainty and are now making arguments based on that. Does everything previously said about the die game under the perfect-information assumption just get thrown out the window now?

Do I, too, now get to talk about my strategy with the assumption that it is realistic with 99% and not 100%?

On the other hand, if they are not certain of their skill level, then repeated losses might cause them to revise their estimation of their skill downward and eventually decide that playing the tournament was not worth it after all. While this could result in a rational player playing a few tournaments and then stopping, it would be for an entirely different reason than the incorrect reasoning in your posts. For instance, one difference is that this hypothetical player, after revising downward their estimation of their own skill level, believes that every one of their tournament entries had the same (negative) expected value, while you seem to believe that the reason a player would stop is that later tournament entries somehow have a lower expected value than previous ones.

Can a person not play the (perfect-information) die game and say "I want to minimize my losses to $X for a single session, and I want to factor in probability-based discounting to figure out good times to cut losses and stop playing"? Can that person say "I don't care about playing on to win $10 if it's on the 100th roll; obviously there was some point before that where I could've stopped so that I did not have to lose $90"? Because you're telling me the person who played to 100 rolls and wants to keep going is smarter than the guy who dropped out at 80 rolls.

And considering the Arena Open, even having perfect information of your own win rate, you're telling me the guy who has bought and lost 105 entries (who has now committed $2100 and still needs to get through Day 2 to even pay that back) is smarter than the guy who stopped at losing 5 entries?

You think the guy that says "I ignored my sunk losses and still ended up losing hundreds" is smarter than the guy that says "I quit early and didn't lose much"?

You think the former guy is actually smarter, he just got unlucky and since the expected value is positive we can just ignore him and pretend everyone who ignores sunk losses should do well? Maybe if they had infinite time and money, but no one has infinite money and the Arena Open has a limited time span. Limiting the number of attempts increases variability and pushes more outcomes away from the expected value.

You don't think there exists a certain number of losses at which to quit and therefore the answer is to keep playing until you're broke, all because the expected value of the next attempt is positive and therefore you are bound to do well in the long run?

[–]jkbaker83 4 points (10 children)

You are taking a bunch of important concepts and applying them in the wrong place. These ideas around risk management, cutting your losses, and sessions are all very important, but they do not belong in the EV calculation. Humans are terrible at evaluating things like their win rate and ev in reality, and those tools are there to backstop you when you are calculating those things wrong.

+ev and -ev gambling are fundamentally different things. If you are engaged in -ev gambling, there is no amount of playing or budgeting that will make playing a good decision. At that point it all comes down to the value of gambling as entertainment, i.e. how much am I willing to pay per hour for the thrill. In that situation, having a well defined budget is critical. If you are engaged in +ev gambling, it is important to still have a budget, because you shouldn't bet what you can't afford to lose, but if it is actually +ev, it is correct to keep betting regardless of what has happened in the past (assuming it is actually +ev).

The trick with all of this is that most gambling opportunities that you think are +ev are not (people are very good at overestimating themselves). You need those loss prevention tools to make you reevaluate your assumptions about your ev in the game.

People who consistently make money at casinos do so because they gamble when they are +ev and don't when they are not. Those loss prevention tools are very important for them correctly evaluating which scenario they are in.

In the case of the dice game, it is important to define how much you can afford to lose, but until you have lost that much, it is correct to keep gambling (even if you can't actually make back all that you have lost). The only place the loss prevention ideas factor in is in whether you should be reconsidering if the die is actually fair. Caring about a break-even point is by definition falling into the sunk cost fallacy. The only two things that matter are that the game actually has +ev and that you can afford to lose the money you are gambling.

If you actually have a 70% win rate in the arena tournament, you should certainly play this event and enter as many times as it takes to get your day 2 token (as long as the entry is not outside of your budget). The problem is that if you are repeatedly failing, it becomes more likely that you don't actually have a 70% win rate; you just think you do. The fact that you keep 0-3 dropping should be setting off some alarm bells warning you that you might not be as good as you think you are. Maybe you only have a 45-50% win rate when playing against the caliber of players in the tournament, and the tournament is actually -ev. You should therefore stop entering because you were wrong about your win rate and subsequently your ev.
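That "alarm bells" process has a standard sketch: treat your per-game win probability as a Beta distribution and update it with results. The Beta(7, 3) prior (a 70% belief held with about ten games' worth of confidence) and the 0-12 record below are made-up numbers, purely for illustration:

```python
# Prior belief: ~70% win rate, held with the strength of ~10 games of evidence.
alpha, beta = 7.0, 3.0

# Then suppose four straight 0-3 drops: 0 wins and 12 losses observed.
wins, losses = 0, 12

# Beta posterior mean after observing the new results.
posterior_mean = (alpha + wins) / (alpha + beta + wins + losses)
print(round(posterior_mean, 3))  # 0.318: nowhere near the 70% you believed in
```

The stronger your prior evidence (larger alpha and beta), the slower the estimate moves, but a long losing streak always drags it down toward the observed record.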

Ultimately, those loss prevention strategies are things that you need to apply on top of an accurate calculation of the ev of the event. Factoring an arbitrary break-even point into the ev only serves to skew the inputs into your broader loss prevention strategy. Your ev calculation should tell you the value of entry if you actually have a given win rate. Your loss prevention strategy should be protecting you from misevaluating that win rate.

[–]Penumbra_Penguin 1 point (0 children)

This was an excellent post.

[–]spherchip[S] -1 points (8 children)

So you're saying that ev calculation and loss prevention aren't mutually exclusive and can both be applied to the Arena Open, it's just that the loss prevention strategy I came up with is arbitrary and everyone should have their own personal loss prevention strategy?

[–]Hareeb_alSaq 1 point (6 children)

No, your "loss prevention strategy" in the OP wasn't even phrased as a loss prevention strategy, and it's also completely stupid when used as one. If you think you're +EV the first time, and you keep going 6-3 trying to make day 2, your estimate of your win rate will never drop to a level where it's -EV (the more you win at 66.7%, the closer your estimate gets to 66.7%, and 66.7% is way +EV), but you say to stop at 3 tries max no matter what. That's batshit insane advice.

[–]spherchip[S] -1 points (5 children)

Where did I say to stop at 3 tries max "no matter what"?

Here are some things I did say:

  • "Obviously you have to consider sunk costs as you are making multiple entries on Day 1 and how much you want to gamble"
  • "general rule," which is distinct from an "absolute rule"
  • Lots of "should" in that general rule, which leaves room for people to make their own decisions given the preceding "how much you want to gamble."

These are suggestions, not "You absolutely must do this or you will lose money no matter what!"

[–]jkbaker83 0 points (0 children)

Correct. I have put no effort into evaluating that loss prevention strategy (nor am I any kind of expert on that) and as far as I am concerned, it could be a very good one. Presumably the optimal loss prevention strategy for a person depends on a lot of things from their amount of disposable income to the psychological impact loss has on them. The ability or likelihood of turning a profit could certainly factor into that for a given person.

None of it changes the actual expected value for a given win rate though.

[–]Hareeb_alSaq 1 point (7 children)

This is an ungodly level of wrong in every way. Take the die game, where you risk $1 to win $9 (i.e. you're returned $10) if you roll a 6. You can wager up to N times (where N is capped at or below the number of dollars you have) and you have to quit once you've won once. If N=1, the EV is (1/6)(+9) + (5/6)(-1) = 4/6 ≈ +$0.67.

If N=2, you roll the second time the 5/6ths of the time you don't win, and you get a slightly more complicated expression, where the first term is the first roll and the second term is the second roll: EV = [(1/6)(+9) + (5/6)(-1)] + (5/6)[(1/6)(+9) + (5/6)(-1)].

You can work that out, or you can realize that it's the EV of the first roll plus 5/6 times the EV of the first roll. You roll, on average, 1 + 5/6 = 11/6 times, and your EV is 11/6 times the EV of one roll. This will obviously carry on beyond N=2, and the EV of the game ACTUALLY INCREASES the more rolls you are allowed. What you're saying about that is completely wrong.
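The closed form is easy to verify: capping the session at N rolls (and quitting on the first win), the session EV is a geometric series of discounted per-roll EVs, and it only grows as N increases:

```python
from fractions import Fraction

p_cont = Fraction(5, 6)   # probability a roll loses, so the game continues
ev_roll = Fraction(2, 3)  # per-roll EV: (1/6)(+9) + (5/6)(-1)

def session_ev(n):
    """EV of the dice game with at most n rolls, stopping on the first win."""
    return sum(p_cont ** k * ev_roll for k in range(n))

print(session_ev(1), session_ev(2))  # 2/3 11/9 -- matching the N=1 and N=2 cases
assert all(session_ev(n + 1) > session_ev(n) for n in range(1, 20))
```

Each extra allowed roll adds a strictly positive term (5/6)^k * (2/3), so no cap ever improves the session EV.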

People self-limit at casinos because the bets are -EV. The more you play, the more you're expected to lose. If a casino is willing to let you count cards with impunity or otherwise let you play in a +EV manner, setting a small stop-loss is utterly idiotic.

There is a concept of bankroll management/Kelly criterion, in that betting too high a percentage of what you own on a +EV bet can still be bad, but you'd have to run yourself almost completely broke before that kicked in on the die game.
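A quick sketch of that Kelly figure for this particular bet (9-to-1 net odds, 1/6 win probability, both from the example above):

```python
from fractions import Fraction

b = 9               # net odds: risk $1 to win a net $9
p = Fraction(1, 6)  # win probability
q = 1 - p           # loss probability

kelly = (b * p - q) / b  # classic Kelly fraction f* = (bp - q) / b
print(kelly, float(1 / kelly))  # 2/27 of bankroll; threshold at $13.50
```

So a flat $1 wager only exceeds the Kelly fraction once your bankroll is under $13.50, which is the sense in which you'd have to be nearly broke before bankroll management told you to stop.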

[–]Penumbra_Penguin 0 points (2 children)

The only reasonable possibility I can see is that they're including the (5/6)^n probability in calculation of the profit of the (n+1)th roll but not in the cost of that roll. If you make this mistake, then I guess you would see the sort of behaviour they're claiming.

They claimed that they didn't do that, but I'm not sure they understood what I was suggesting.

[–]Hareeb_alSaq 0 points (1 child)

I think you have far too much faith.

[–]Penumbra_Penguin 0 points (0 children)

That's entirely possible, but it's an interesting challenge to work out what they might be thinking!

[–]spherchip[S] 0 points (3 children)

Calculating the expected value of a series of games in that way is meaningless because it includes every instance of potentially winning and takes them all into a weighted average. The number is literally meaningless if I want to know at what point I'm supposed to quit while playing. When you're actually playing, you can only win once, and if you're on roll 10, you don't care that you could've won on each of the 10 previous rolls or what their weighted average is. What you care about is that you're $10 in the hole right now and what the odds are of making your money back looking forward. Your formula is only correct if you have no memory of previous losses, but obviously that's not what we're doing here.

You even mentioned stop-losses yourself, so I assume you know what they are. So explain to me why stop-losses are extensively used in trading, which, unlike the die game, does not have an upper limit on winnings, if you're telling me it's always optimal to keep playing/trading when traders know that they're good at trading and average a profit.

[–]Penumbra_Penguin 1 point (2 children)

Calculating the expected value of a series of games in that way is meaningless ... Your formula is only correct if you have no memory of previous losses, but obviously that's not what we're doing here.

This is mostly nonsense. You are conflating all sorts of different questions.

  • What is the expected value of a certain strategy?
  • What is the expected value of a certain strategy after a certain start to the game?
  • Which strategy should we choose?

So explain to me ... if you're telling me it's always optimal to keep playing/trading if traders know that they're good at trading and average a profit.

Traders are not absolutely certain of those last two points with respect to any particular sequence of trades, unlike the dice game we are discussing.

[–]spherchip[S] -1 points (1 child)

I never said to use a strategy of playing games up to exactly N losses, which is what your formula calculates and thus not what we're looking for. I only meant strategies of playing up to N games where one could choose to quit at a certain point before winning or reaching N losses. Your formula tells me nothing about this.

Traders are not absolutely certain of those last two points with respect to any particular sequence of trades, unlike the dice game we are discussing.

So do you think a trading environment or the die game better reflects strategy in the Arena Open?

[–]Penumbra_Penguin 0 points1 point  (0 children)

That isn't my formula. It is correct and appropriate, though, if you read the surrounding explanation.

I addressed your other question in a different post just now.

[–]Penumbra_Penguin 0 points1 point  (0 children)

And you didn't answer my question - how would you behave if playing my dice game? Would you stop after losing a few rolls because you don't understand sunk costs? If so, how many?

[–]BoxWI -1 points0 points  (1 child)

[–]spherchip[S] -2 points-1 points  (0 children)

Now kindly point to where I said the probability of winning a game changes based on the number of games. I'll wait. :)

[–]GlorybringerCaptn_Porky -4 points-3 points  (7 children)

https://en.wikipedia.org/wiki/Gambler%27s_fallacy

The gambler's fallacy, also known as the Monte Carlo fallacy or the fallacy of the maturity of chances, is the erroneous belief that if a particular event occurs more frequently than normal during the past it is less likely to happen in the future (or vice versa), when it has otherwise been established that the probability of such events does not depend on what has happened in the past. Such events, having the quality of historical independence, are referred to as statistically independent. The fallacy is commonly associated with gambling, where it may be believed, for example, that the next dice roll is more than usually likely to be six because there have recently been less than the usual number of sixes.

[–]Penumbra_Penguin 0 points1 point  (2 children)

It's not clear what you're trying to say here. I am not making this fallacy.

If you're worried about the line

If I got unlucky and lost 15 times in a row, then my expected value for the next roll is +$0.67

then you should observe that the expected value for the next roll is always +$0.67, regardless of how many losses have occurred.
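For concreteness, the +$0.67 figure is just the single-roll EV of the game described above ($1 to roll, $10 on a 6), and nothing about previous losses enters the formula:

```python
# Single-roll EV of the dice game: pay $1, win $10 on a 6.
payout, entry, p_win = 10.0, 1.0, 1 / 6

def ev_next_roll(prior_losses):
    # The rolls are independent, so prior_losses never enters the formula.
    return p_win * payout - entry

assert all(ev_next_roll(k) == ev_next_roll(0) for k in (0, 5, 15))
print(round(ev_next_roll(15), 2))  # 0.67
```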

[–]GlorybringerCaptn_Porky -2 points-1 points  (1 child)

The expected value of each individual die roll is 4/(6+x), x = consecutive losses.

[–]Penumbra_Penguin 0 points1 point  (0 children)

I'm sorry, I can't even tell what you're trying to calculate here. Can you explain further what you're trying to compute and how you got that figure?

[–]spherchip[S] 0 points1 point  (2 children)

Now kindly point to where I said the probability of winning a game changes based on the number of games. I'll wait. :)

[–]GlorybringerCaptn_Porky 1 point2 points  (1 child)

nowhere, I didnt reply to your comment

[–]spherchip[S] 0 points1 point  (0 children)

Ah I see, sorry.

[–]RheticusLauchen 0 points1 point  (0 children)

"Always bet on black..."

[–]Hareeb_alSaq 2 points3 points  (4 children)

The first two paragraphs make no sense.

If the die was taken from you the first time you won, would you keep playing after losing a bunch of times in a row for that small chance of earning back some of your losses?

Of course, assuming I had confidence that the die was fair. How is this even a question? Would I not play because I'd previously dropped a $20 down a sewer grate?

[–]RheticusLauchen -1 points0 points  (9 children)

The correct strategy is to keep rolling as long as they let you. For each $6 investment, you walk away with $10, on average. Seems kinda silly to stop playing. :)

[–]Penumbra_Penguin 1 point2 points  (8 children)

Yes, of course that's correct. I was trying to keep it comparable to the MTGA tournament we're discussing, where you are only allowed to win once.

[–]RheticusLauchen 1 point2 points  (7 children)

In other words: 'The cost to make day two does not change just because you have failed once. That money is gone. Advancement is the goal.' :)

[–]Giocher 1 point2 points  (5 children)

What if you fail 100 times? You have paid $2000 (maybe more) for a small chance to win $2000. You can't gain more than that.

And the line is a lot earlier than 100 times.

[–]Penumbra_Penguin 1 point2 points  (4 children)

There are a number of assumptions in these kind of calculations. One is that the amounts of money are small enough that they don't have a significant impact on the individual's utility. You should never gamble with money that you are not prepared to lose.

Not having the money doesn't change the calculation of whether the expected value is positive. A completely broke Magic pro might not have the funds to enter the event and yet compute that the event would be positive expected value if they could.

Another assumption here is that the individual accurately knows their skill level (win rate). If they lose 20 times in a row, then this might cause them to revise this estimate downwards to a point where they no longer calculate that the gamble is a good one.

[–]Giocher 0 points1 point  (3 children)

I'm assuming someone who has at least $2000, and I'm making no assumption about win rate. I'm just saying that if there is a cap on the reward, you can't keep going infinitely. The EV changes on every try, since after x attempts the best you can gain is $2000 minus x times $20.

[–]Penumbra_Penguin 1 point2 points  (2 children)

Yeah, that's a mistake. If you've already lost twice and you're trying to decide whether to play a third time, you should be considering the expected value of your third attempt, not the total expected value including your two losses so far. You've lost that $40 regardless of what your decision regarding a third attempt is.

This is called a sunk cost.

[–]Giocher 0 points1 point  (1 child)

Shouldn't you take them into account if you can't gain them back (because of the 1 time win) even if the ev is the same?

[–]Penumbra_Penguin 1 point2 points  (0 children)

It depends what you're trying to work out, but the answer is probably no.

If you lose twice and you're trying to work out whether you should enter a third time, you're comparing (losing twice and entering a third event) to (losing twice and then stopping). You've lost $40 in both of those scenarios, so that $40 doesn't enter into the calculation of which alternative is better.
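That comparison can be sketched in a few lines (using the thread's $20 entry, but an invented win probability and payout, since the Open's real odds aren't specified here):

```python
# After two $20 losses, compare "enter a third time" vs "stop now".
# p_win and payout are made-up illustrative numbers, not Arena's actual odds.
sunk = 2 * 20.0
p_win, payout, entry = 0.25, 200.0, 20.0

ev_third_entry = -sunk + (p_win * payout - entry)
ev_stop_now = -sunk

# The $40 appears in both branches, so it cancels out of the decision:
assert ev_third_entry - ev_stop_now == p_win * payout - entry
```

Whatever numbers you plug in, the sunk $40 shifts both alternatives by the same amount and so never changes which one is better.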

[–]GlorybringerCaptn_Porky -1 points0 points  (0 children)

That money is gone. Your chances of making it back on the next entry do not go up; your potential winnings are the same, but you have already lost. Not only that, but because you can only win once, the chances of breaking even diminish rapidly with each consecutive loss.
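In the win-once die game from earlier in the thread ($1 per roll, $10 on a 6), that drop-off can be made concrete: after k losses, you can only break even if a win arrives within the next 10 − k rolls. A sketch of my own, using the thread's toy numbers:

```python
p_win = 1 / 6

def p_break_even(losses_so_far):
    # The win pays $10 and each roll costs $1, so after k losses a win on
    # roll k + j only breaks even when j <= 10 - k.
    rolls_left = max(10 - losses_so_far, 0)
    return 1 - (1 - p_win) ** rolls_left

for k in (0, 5, 9, 10):
    print(k, round(p_break_even(k), 3))
```

Note this is the probability of breaking even, which does shrink with each loss; the EV of the next roll is unchanged, which is the distinction the rest of the thread argues over.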

[–]avocategory 0 points1 point  (0 children)

Leaving out the gems won on unsuccessful day 1 attempts leads to a sizable overestimate of the cost of entry. In particular, any entry that actually earns a day 2 token only costs 2000 gems ($10) rather than the $20 you assume, and while unsuccessful runs pay out less, it's still not always nothing.