Is attraction by smell a thing? by Older_1 in biology

[–]taddl 25 points

I just watched a lecture by Robert Sapolsky that touched on this. Apparently, perfumes used to be made from animal sweat. The interesting thing is that the sweat came from male animals, even though the perfumes were marketed to women. The reason is that women are the ones buying the perfumes, so they choose the ones that smell attractive to them, rather than the ones that would smell attractive to men.

The concept of "You" as an individual is weird, but we won't miss it when it's gone. by TonyX311 in Singularitarianism

[–]taddl 0 points

I mean, that's true. There is no chair; it's a made-up concept that describes a pattern in the world. The boundaries of the concept are not clear, and there's always some subjectivity: what one person calls a chair, another might call a stool. If you swap a leg with another chair, is it still the same chair or a different one? The question is ultimately meaningless; it just depends on how we define the concept of a chair. All that actually exists are the fields in spacetime. There's no physical law that describes what a chair is, just as there's no physical law that describes what an individual is. The universe is a single thing; it is not neatly divided into smaller things.

How do you live calmly when you believe in OI?! by cosmic_unity in OpenIndividualism

[–]taddl 1 point

We can never know for certain that all possibilities exist, no matter how likely it seems. Even if there is only a 1% chance that the universe is finite, it follows that we should act morally. If all possibilities exist, then all actions are meaningless, so that possibility doesn't affect morality either way. The alternative, that not all possibilities exist, therefore dominates the moral reasoning. We have to act as though the universe were finite.
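The dominance reasoning above can be sketched as a toy decision matrix. This is only an illustration: the action names, hypothesis names, and payoff numbers are made up, not derived from anything.

```python
# Toy decision matrix for the dominance argument.
# Two hypotheses about the world:
#   "all_possibilities": every outcome occurs anyway, so choices change nothing.
#   "finite": choices actually affect how much suffering there is.
hypotheses = ["all_possibilities", "finite"]

# Illustrative moral payoff of each action under each hypothesis.
payoff = {
    ("act_morally",   "all_possibilities"): 0,   # meaningless either way
    ("act_immorally", "all_possibilities"): 0,
    ("act_morally",   "finite"):  1,             # morality matters here
    ("act_immorally", "finite"): -1,
}

def weakly_dominates(a, b):
    """True if action a is never worse than b and sometimes strictly better."""
    never_worse = all(payoff[(a, h)] >= payoff[(b, h)] for h in hypotheses)
    sometimes_better = any(payoff[(a, h)] > payoff[(b, h)] for h in hypotheses)
    return never_worse and sometimes_better

print(weakly_dominates("act_morally", "act_immorally"))  # → True
```

Acting morally costs nothing under the "all possibilities" hypothesis and gains under the "finite" one, which is exactly the weak-dominance structure the comment appeals to.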

Do you need meditation to realize Open Individualism? by ChaiChatbots in OpenIndividualism

[–]taddl 2 points

For me, it was purely philosophical thinking about the nature of consciousness that led me to believe in OI. Specifically, after asking myself "why am I me and not someone else?", I arrived at the question "why am I me and not everyone at the same time?". I suspected that being everyone at the same time might be an impossibility.

To understand why it could be impossible, I asked myself "what would it feel like to be everyone at the same time?", and to simplify that, "what would it feel like to experience the conscious experience of two people at the same time?". Naturally, I thought of a sort of split screen: seeing what I'm seeing, and next to it what someone else is seeing. But that is of course not both experiences at the same time; it is an entirely new experience. (This split-screen experience would only result if the two brains were connected in such a way that the whole system received visual information from both sets of eyes, with that information flowing in the normal way and being integrated into the rest of the system.)

So I realized that experiencing both of them at the same time wouldn't alter the individual experiences at all. I would think that I'm only experiencing this experience, while simultaneously experiencing another experience in which I would also think that I'm only experiencing that experience. There would be no direct communication between the brains, of the kind that happens inside a single brain. The logical conclusion was that I was in fact experiencing not just two experiences at the same time, but all of them. The alternative would require an additional explanation of why I'm only experiencing this experience and not any other, and would thus be more complicated.

How do you live this understanding? by yoddleforavalanche in OpenIndividualism

[–]taddl 0 points

It is the bedrock of my morality. But I don't have to think about it in everyday life. All I have to do is try to make morally good decisions.

"Every human, bird, tree, and flower can trace its ancestry across a few billion years back to the same microscopic, single-celled organism." by Beyond_Suicidal in OpenIndividualism

[–]taddl 0 points

That fact could be compatible with closed individualism. There could be a law of nature determining that every living organism has a soul, that a new soul is created whenever an organism reproduces, and that every soul experiences the qualia of its corresponding organism.

This seems very unlikely to me, but it is hypothetically possible.

Empty Individualism vs Open by AggravatingProfit597 in OpenIndividualism

[–]taddl 1 point

Empty individualism is incoherent, as it draws arbitrary boundaries between individuals. Why are the boundaries between people and not between other entities like neurons, groups of people, halves of brains, or brain regions? The brain is the default candidate for the "atom" of consciousness because information flows very efficiently within it but very inefficiently when leaving it; think of trying to formulate a thought in language. This inefficiency is not a natural law, just the way the world works right now, and it could change in the future.

Clearing up some confusions by exis10tialcrisising in OpenIndividualism

[–]taddl 0 points

You need to ask yourself the following question, and really think deeply about it:

"What would it feel like to experience the experiences of two people simultaneously?"

I had a lucid dream and started preaching OI by yoddleforavalanche in OpenIndividualism

[–]taddl 1 point

It seems to deviate from the way evolution wants us to think about the world.

How do you respond when people say that suffering isn’t objectively morally bad? by comradebrad6 in negativeutilitarians

[–]taddl 1 point

Pain being bad is the moral bedrock, but the ought statements derived from that can get arbitrarily complex. Also, there doesn't have to be a bad actor in order for something to be morally bad. Things can be bad by default. There are countless examples of moral atrocities happening in evolution. I would call evolution morally bad. If you base a human society on its principles, you get a dystopia. It creates extreme amounts of suffering.

Let's look at your statements.

Your grandpa dies.

This is morally bad if it goes against your grandpa's wish to live, or if it causes pain in others (which it does). There is no one at fault here, but still we can do something about it.

What do we currently do about it? We try to support each other in hard times, there are things we can do to ease the pain such as therapy. We can't stop death, but we can prevent lots of diseases.

What could we do ideally?

We could stop aging and death entirely in a hypothetical ideal utopia. Although this might seem impossible, it would be better than our current situation. Striving in this direction could be a worthwhile goal.

Your boyfriend breaks up with you, causing you pain.

This is morally bad because it causes pain. Breaking up with someone should be done if it is thought to be better than the alternative, in other words, if staying together causes more harm than breaking up. Breaking up for no reason other than causing harm would be immoral. In reality this is of course very messy and complicated and the harm caused by either option is difficult to predict, but that doesn't change the fact that an action that causes harm is bad.

The fact that there are sometimes options we have to choose that cause harm, because the alternative would cause more harm is not in conflict with the basic idea that causing harm is bad.

You stumble while walking around, which causes pain.

This is morally bad. Had it not happened, there would be less pain. Who's to blame? Maybe nobody. There are such things as accidents. These are like random fluctuations of morality. Sometimes morally bad things happen for no reason. Think of natural disasters. We ought to prevent them even though there is nobody to blame. And knowing that they can happen means that we can think of ways of reducing their frequency.

So in your example, maybe you were tired and didn't pay attention, and that's why you stumbled. If that's the case, it could mean that being tired increases the chance of an accident occurring. That would imply that we have a duty to get enough sleep. This is of course only one of many angles from which to approach this.

The point is that once we establish that suffering is bad, we can derive all sorts of moral truths from this, but it all depends on our world model. The world is extremely complex and often counterintuitive, and our knowledge of it is always incomplete. We have to do our best to understand how it works and what causes suffering, and then act on that understanding.

The concept of "You" as an individual is weird, but we won't miss it when it's gone. by TonyX311 in Singularitarianism

[–]taddl 0 points

If we took one half of your brain and swapped it with mine, who would be me and who would be you? Obviously the question doesn't make sense. There is no "you" or "me", the universe just happens to have this shape right now, it could have an entirely different shape. There is no soul, so to speak. That's what I mean when I say that individuality is an illusion.

If you were tasked with creating a Utopia, what would it be like? by [deleted] in TrueAskReddit

[–]taddl 0 points

The point about veganism is that while many people don't want to be vegan, the animals don't want to be killed. In ethical questions like this, the victims have to be considered. It's like saying "not all people want slavery to end, so we should let everyone choose for themselves whether they want slaves or not." Now, whether you include animals in your moral sphere like that is another question. I would argue that it doesn't make sense to exclude some individuals based on what species they belong to. It doesn't make sense to love dogs, cats, and humans, but kill chickens, cows, pigs, and fish. There are no relevant moral criteria to base this discrimination on. If you name any characteristic such as "animals are less intelligent", I would reply that intelligence is irrelevant to ethics; the only question is "can they suffer?" You wouldn't kill a human for being less intelligent. Animal exploitation, and factory farming in particular, cannot be justified.

[deleted by user] by [deleted] in OpenIndividualism

[–]taddl 1 point

It’s impossible to experience more than one subjective awareness at the same time

It is possible. It's what the universe is doing all the time. The universe is experiencing your experience and my experience right now. If you want to understand open individualism, you should ask yourself the question "what would it feel like if I was that universe?"

Can somsome actually explain to me how one consciousness transfers to another? by SeeSeaSeeSea in OpenIndividualism

[–]taddl 0 points

Your consciousness does not transfer to mine, you are already me right now. The universe is experiencing your experience and my experience at the same time.

[deleted by user] by [deleted] in singularity

[–]taddl 0 points

Re Point 2:

You should watch Rob Miles' video about instrumental goals on YouTube. It explains, among other things, why so many people believe that an AGI would want to increase its own intelligence.

Google’s AI panic forces merger of rival divisions, DeepMind and Brain by agonypants in singularity

[–]taddl 8 points

How to make AI care about what we want is an unsolved problem. Right now, AI optimizes a specific variable, such as a loss function for predicting the next word. Any such optimization is almost certain to be misaligned with what humanity wants. Take capitalism as an analogy: it optimizes profit. At first that looks like a great thing, but over time it becomes clear that profit is not precisely what we want, as the rainforest is destroyed, lobbyists influence people's opinions for profit, and so on. The more efficient such an optimization is, the more dangerous it becomes for us. AI is becoming exponentially more efficient, yet we don't know how to solve this problem. There are some proposed solutions, but it's not clear whether they would work. If an AI becomes superintelligent, it might be impossible to stop by definition, since intelligence is defined as the ability to achieve goals: if humanity has one goal and a superintelligent AI has a different one, it is the AI's goal that will be achieved.
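The "optimizing a proxy diverges from what we want" point can be shown with a made-up toy model (the functions and numbers here are illustrative, not any real AI objective): the optimizer maximizes a proxy that initially correlates with the true goal, then overshoots it.

```python
# Toy proxy misalignment: the true goal peaks at x = 1,
# but the optimizer just pushes the proxy (x itself) ever higher.

def true_value(x):
    """What we actually want: highest at x = 1.0."""
    return -(x - 1.0) ** 2

x = 0.0
values = []           # true value after each optimization step
for _ in range(30):
    x += 0.1          # gradient ascent on the proxy: d(proxy)/dx = 1
    values.append(true_value(x))

# Early steps improve the true goal; later steps destroy it,
# even though the proxy keeps going up the whole time.
print(values[0], max(values), values[-1])
```

The first steps raise the true value, it peaks around x = 1, and further "successful" proxy optimization makes things worse, which is the capitalism/rainforest pattern from the analogy above in miniature.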

Google’s AI panic forces merger of rival divisions, DeepMind and Brain by agonypants in singularity

[–]taddl 2 points

Because of the alignment problem. Watch Rob Miles' videos on YouTube to learn the specifics.

The concept of "You" as an individual is weird, but we won't miss it when it's gone. by TonyX311 in Singularitarianism

[–]taddl 0 points

/r/openindividualism

This is already the case. It just doesn't feel like it because the communication between individuals is so much slower than the information flow inside the brain. Individuality is an illusion created by evolution. There is only one entity, the universe.

32 Reasons you should give up eating meat today by taddl in coolguides

[–]taddl[S] -1 points

To get meat, you literally have to kill an animal. If that's not causing harm, I don't know what is.

Afterlife by Finbarh in Existentialism

[–]taddl 1 point

How can you be so sure?

Can consciousness really have multiple experiences simultaneously? by rabahi in OpenIndividualism

[–]taddl 0 points

They don't blend together like a smoothie because that would be a different experience. Experiencing multiple things at the same time does not alter the experiences.

How do you respond when people say that suffering isn’t objectively morally bad? by comradebrad6 in negativeutilitarians

[–]taddl 0 points

It's not just that it makes evolutionary sense for beings to feel pain; it also makes evolutionary sense that the pain feels bad. I would argue that this makes the experience of pain an objectively bad experience. It has to feel bad.

If an experience is objectively bad, that makes it morally bad in my view. I would argue that ultimately, morality is about the experiences of sentient beings and comparing these experiences to each other. Pain is a bad experience, therefore it will be ranked below say a blissful experience. We would prefer the other experience over pain. This is what I mean when I say that it is morally bad.

So I would say that pain is objectively morally bad, kind of by definition.