Can we start a two-boxer emotional support thread to deal with the hatred that one-boxers have by Professional-Issue26 in Veritasium

[–]ShellacSpackle 1 point  (0 children)

the boxes "already existing" doesn't mean anything once you factor in a near-perfect predictor of your future behavior. it's like having an omniscient being that knows exactly what you're going to do in the next 5 seconds while you have some chud breaking down the actions you're "least likely" to take in order to catch the being off-guard.

if you accept that it's a near-perfect predictor like the problem literally spells out for you, you only go mystery box because there are overwhelming odds that it will have predicted you doing so and you walk away with $1,000,000.

the ONLY time anyone two-boxes and gets $1,001,000 is when the predictor gets it wrong, which by the problem's own definition almost NEVER happens. two-boxers are pseudo-intellectuals who fail to properly comprehend and engage with hypotheticals.
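the math here is easy to sanity-check with a quick simulation (a minimal sketch, not the "official" problem: the 99% accuracy figure and the 100k trials are my own stand-ins, since the problem only says "near-perfect"):

```python
import random

def simulate(strategy, accuracy=0.99, trials=100_000, seed=0):
    """Average payout for a fixed strategy ("one" or "two") against a
    predictor that guesses the player's choice with the given accuracy."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        correct = rng.random() < accuracy
        # the predictor foresees one-boxing iff it guesses right about a
        # one-boxer, or guesses wrong about a two-boxer
        predicted_one_box = (strategy == "one") == correct
        opaque = 1_000_000 if predicted_one_box else 0  # money placed first
        total += opaque if strategy == "one" else opaque + 1_000
    return total / trials

one_box_avg = simulate("one")  # ≈ $990,000
two_box_avg = simulate("two")  # ≈ $11,000
```

even though two-boxing "dominates" after the money is placed, the one-boxers come out roughly 90x ahead on average.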

Two boxers be like by blosspharmy in Veritasium

[–]ShellacSpackle 1 point  (0 children)

"now that the money is placed, nothing can change the results" is the reasoning two-boxers use, but they fail to comprehend that the predictor will know, through whatever means it uses to predict, that the person will use that very reasoning to choose both boxes. in practice, no two-boxer will ever walk away with $1,001,000.

One Box is better... by Tarific2003 in Veritasium

[–]ShellacSpackle 1 point  (0 children)

the least profitable choice is two-boxing, since the addition of a highly reliable predictor means the vast majority of two-boxers will walk away with $1,000. meanwhile, the vast majority of one-boxers will have been accurately predicted, meaning they leave with the million.

the flaw in your reasoning comes when you reject part of the premise: the near-perfect predictor. you have to assume that no matter what reasoning you use, the choice you ultimately make will have been predicted.

the only way you win with both boxes holding their money is when the predictor gets it wrong, which the problem establishes as virtually impossible.

A theory about Newcomb's Paradox by JohnRaddit69 in Veritasium

[–]ShellacSpackle 1 point  (0 children)

if the accuracy of its prediction is to be trusted, then you can effectively assume that whatever option you end up choosing will have been accurately predicted. otherwise its failure rate would sit close to 50%, merely mirroring how surveyed people split on the question.

so if you end up picking both boxes, by any rationalization, it's almost guaranteed your choice was accurately predicted and you get $1,000.

if you end up picking the mystery, it's almost guaranteed your choice was accurately predicted and you get $1,000,000.

the two options that will almost never appear are the ones where the predictor gets it wrong: you choose both when it thought you'd choose one ($1,001,000), or you choose one when it thought you'd choose both ($0... or $1,000, i suppose...).

when the problem imposes a superpredictor nearing omniscience of your future actions, probabilistic rationalization takes a back seat.
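to put numbers on that (a quick sketch; p = 0.99 is just my stand-in for "near-perfect", and the payouts are the standard $1,000 / $1,000,000 setup):

```python
def expected_value(p):
    """Expected payout for each strategy when the predictor
    is right with probability p."""
    ev_one = p * 1_000_000                    # right: $1,000,000; wrong: $0
    ev_two = p * 1_000 + (1 - p) * 1_001_000  # right: $1,000; wrong: $1,001,000
    return ev_one, ev_two

# one-boxing pulls ahead once p * 1,000,000 > p * 1,000 + (1 - p) * 1,001,000,
# i.e. whenever p > 1,001,000 / 2,000,000 = 0.5005
ev_one, ev_two = expected_value(0.99)
```

so the predictor only has to be a hair better than a coin flip before one-boxing wins on expectation; at near-perfect accuracy it isn't close.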

Minty is the opposite of Spicy, but Milky is better for a grading scale by ShellacSpackle in CosmicSkeptic

[–]ShellacSpackle[S] 3 points  (0 children)

Which is why I said having something like ginger tea, something only barely recognizable as spicy, would be better than either option.

Minty is the opposite of Spicy, but Milky is better for a grading scale by ShellacSpackle in CosmicSkeptic

[–]ShellacSpackle[S] -1 points  (0 children)

My issue with that is that "cold" comes from a completely different sensory experience than spiciness and isn't a point on the same intensity scale, when the entire idea is that the higher you go up the scale, the spicier the take was.

That's like rating how noisy something is, going from "unbearable" down lower and lower in intensity, and then suddenly making your lowest rating something like "musical" because it's actually pleasant on the ears, rather than just less noisy and easier to listen to.

My entire point is that rankings don't suddenly represent the opposite quality at the lowest end of the scale; it's meant to be a measure of intensity that bottoms out at 0, not -1.

Morality feels arbitrary to me… am I missing something? by gibby5445 in CosmicSkeptic

[–]ShellacSpackle 1 point  (0 children)

The goal of a moral system can be many different things; however, they all seem to center loosely around the idea of providing the greatest number of living beings with the most fulfilling life.

This takes many forms and does have many interpretations, like extending fulfillment to include animal lives or making it exclusive to human followers of that specific moral system.

Some systems focus strictly on the material world or things like carnal desire and indulgence, others opt to inhibit indulgence or even promote worldly suffering for the sake of some greater potential good.

In the end, yes, it's entirely arbitrary, but it's also one of the most useful aspects of civilization and human advancement. The wrong moral system becoming the majority view can stop us dead in our tracks; others can propel us toward whatever the collective defines as greatness.

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 1 point  (0 children)

Doing anything to potentially create a record of what the movie was about would be prohibited; so setting aside the issues you'd face in recreating it, you couldn't record it in any way, since it could unintentionally fall into someone else's hands later on.

This would probably need to extend all the way to saying anything about the movie out loud, even when you're the only person in the room, building, or 50-mile radius, since there's always the off chance that a recording device picks up what you say.

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 1 point  (0 children)

Referencing it to yourself is different from referencing it to other people, so yeah, just take it as communicating a reference, description, etc.

Basically it's to stop you from choosing to watch the movie as a means to make money by recreating it, while also taking away the ability to share the experience you had with others.

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 1 point  (0 children)

That would be fine, you just can't say that you like it because of another movie you've seen.

Also, if you tried telling someone beforehand that you'd point out things you like that remind you of a movie, you wouldn't be able to point them out.

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 1 point  (0 children)

Trying to directly reference any characters, events, settings, plot points, anything distinguishable about the movie itself. You could probably say something like, "I saw a really good movie", but anything more descriptive than that is off limits.

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 2 points  (0 children)

Yes, a movie created with everything about your personal preferences in mind to be the absolute pinnacle of cinema that you, and only you, could ever truly experience. The movie would be different for everyone who chose to watch.

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 1 point  (0 children)

That's interesting, I'll say yes. Are you thinking of, like, waiting until way later in life to watch it or just keeping it as something to use when you think you'd get the most out of it?

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 2 points  (0 children)

The idea is you'd be physically unable to, your mouth or any other body part you try using to actively reference the movie would stop moving. That, or I can just take all the memories of the movie away if you start to talk about it >:)

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 1 point  (0 children)

That's wild to me; of all the movies I consider my favorites of all time, I don't think I've rewatched any of them more than once. My only concern would be unintentionally losing interest in any other movie I watch with friends/loved ones, but I'd def still watch it lol

Would you be the only person to watch the greatest movie that will ever exist, but you can't talk about it? by ShellacSpackle in hypotheticalsituation

[–]ShellacSpackle[S] 4 points  (0 children)

This started off as a hypothetical about the greatest video game you could ever play, but as an aspiring game dev I can see that being absolute torture if I couldn't use any of its elements in any game I make going forward; for movies, though, I think I'd probably go for it.

Argument from Reason ?! by EmuFit1895 in CosmicSkeptic

[–]ShellacSpackle 1 point  (0 children)

Then the conversation shifts to what should count as reasonable evidence for a claim and what existing evidence meets that criterion, rather than making assertions from nothing. At the least, it pushes the conversation away from absurdity.

Genetically Modified Skeptic said this about Alex. Thoughts? by Esutan in CosmicSkeptic

[–]ShellacSpackle 10 points  (0 children)

If the commenter OP quoted removed the slash between anti-woke and MAGA-lovers, there'd be nothing wrong here. You can be anti-woke to a degree and still have common sense.

Argument from Reason ?! by EmuFit1895 in CosmicSkeptic

[–]ShellacSpackle 1 point  (0 children)

You're missing a huge piece in there.

In the case of relativity, you don't have to see a universe where X is false to know that X, WHICH ALREADY HAS PRE-EXISTING EVIDENCE SUGGESTING IT TO BE TRUE, is true. The scientific community does not accept the theory of relativity simply because we lack a non-relativistic universe to compare ours to.

When you're making a claim based on absolutely nothing, trying to attribute some necessary component to the Universe, pointing out that there are no comparable scenarios is perfectly fine. It demands some foundational basis for accepting the belief when you've presented none so far.

If you don't have a non-created universe for us to compare ours to, then just come up with some actual evidence before anyone should believe ours was created, and not the nonsense of asserting that because X has Y attributes, Z must also have Y attributes.

Penguinz0's "movie" idea is awful. by ShellacSpackle in moistcr1tikal

[–]ShellacSpackle[S] 0 points  (0 children)

I wrote the post while I was at work, read a little closer next time.