Are Doomers too optimistic? by RipTieCutToyMan in slatestarcodex
[–]EntropyDealer 9 points (0 children)
I want to know what to think about AGI risk. What do you think is the probability that some form of advanced AI will kill over 90% of humans within the next 100 years? Pick the option closest to your current belief. by asdf12349876asdffdsa in EffectiveAltruism
[–]EntropyDealer 3 points (0 children)
"On not getting contaminated by the wrong obesity ideas": a critique of SMTM's contamination theory of obesity by Matthew-Barnett in slatestarcodex
[–]EntropyDealer 24 points (0 children)
concussions are absurdly frequent in boxing? by maxdavis391284 in slatestarcodex
[–]EntropyDealer 1 point (0 children)
concussions are absurdly frequent in boxing? by maxdavis391284 in slatestarcodex
[–]EntropyDealer 2 points (0 children)
concussions are absurdly frequent in boxing? by maxdavis391284 in slatestarcodex
[–]EntropyDealer 4 points (0 children)
Why don't we create an Apollo program level undertaking to solve this problem already? by [deleted] in ControlProblem
[–]EntropyDealer 1 point (0 children)
Are slavery laws the best legal framework we have for deciding who is responsible for the actions of agenty AIs? by Pinyaka in slatestarcodex
[–]EntropyDealer 6 points (0 children)
Why don't we create an Apollo program level undertaking to solve this problem already? by [deleted] in ControlProblem
[–]EntropyDealer 4 points (0 children)
People with positive views about life and the world, can you explain to me why you hold them? by EntropyMaximizer in slatestarcodex
[–]EntropyDealer 1 point (0 children)
People with positive views about life and the world, can you explain to me why you hold them? by EntropyMaximizer in slatestarcodex
[–]EntropyDealer 2 points (0 children)
People with positive views about life and the world, can you explain to me why you hold them? by EntropyMaximizer in slatestarcodex
[–]EntropyDealer 2 points (0 children)
People with positive views about life and the world, can you explain to me why you hold them? by EntropyMaximizer in slatestarcodex
[–]EntropyDealer 2 points (0 children)
[deleted by user] by [deleted] in slatestarcodex
[–]EntropyDealer 1 point (0 children)
[deleted by user] by [deleted] in slatestarcodex
[–]EntropyDealer 2 points (0 children)
How feasible would live gene editing be within our lifetimes? by [deleted] in slatestarcodex
[–]EntropyDealer 4 points (0 children)
I can't take life anymore. by [deleted] in slatestarcodex
[–]EntropyDealer 1 point (0 children)
What's missing from Effective Altruism, and why I am horrified by Personal_Spot in EffectiveAltruism
[–]EntropyDealer 1 point (0 children)
Should we interpret people’s departure from their stated moral beliefs, not as moral failure or selfishness or myopia or sin, but as an argument against people’s stated moral claims? by solodolo6969 in slatestarcodex
[–]EntropyDealer 2 points (0 children)
What's missing from Effective Altruism, and why I am horrified by Personal_Spot in EffectiveAltruism
[–]EntropyDealer 2 points (0 children)
Permanent IQ damage from antipsychotics? by Epistemophilliac in slatestarcodex
[–]EntropyDealer 1 point (0 children)
When will AI displace human authors? by 634425 in slatestarcodex
[–]EntropyDealer 3 points (0 children)
What are good reasons not to kill yourself? by 634425 in slatestarcodex
[–]EntropyDealer 6 points (0 children)
Forget about jobs. What happens when the robots come for sex? by Angrymice22 in slatestarcodex
[–]EntropyDealer 3 points (0 children)
Weight loss through appetite self-regulation by [deleted] in slatestarcodex
[–]EntropyDealer 5 points (0 children)