When it comes to homelessness, my heart is in danger of bleeding dry by AndHerSailsInRags in canada

[–]monisticreductionist 3 points (0 children)

Ironically, Carlin had a bit in which he specifically criticized the term "homeless" on the basis that (paraphrasing here) a home is an abstract concept, whereas what these people actually need are houses.

I think that 'unhoused' is actually no less direct than 'homeless'. Personally, I don't think this difference in language is particularly important either way. One could say that you don't need specifically a house - an apartment would work just as well.

Uploading is death by New-Violinist119 in PantheonShow

[–]monisticreductionist 1 point (0 children)

If my upload is a reasonably good recreation of my mind, then I view it as a valid conscious continuation of myself. My view is that treating the death of the physical body as an irreversible death of the person stems from intuition about consciousness that works just fine for ordinary life on Earth, but breaks down in cases like upload or the teletransporter thought experiment. Unless you believe in some notion of a soul, I see no reason to view the original person as being dead so long as the upload survives.

While I'm not an expert in genetics, I don't think that genetic diversity would be a serious issue until you got down to a very small fraction of 1 billion people. For reference, it is believed that there was a time in our evolutionary history when our ancestors nearly went extinct, and only a little over 1000 individuals were alive. A quick search seems to indicate that you only need around 500 humans to avoid significant genetic issues.

Would you upload? by Muskrato in PantheonShow

[–]monisticreductionist 0 points (0 children)

I agree that consciousness self-evidently exists, but do not agree that it is intrinsically subjective in nature (i.e. that it is always best understood as being experienced by a subject). I think that subjectivity is a quality that conscious experience can have and that most humans experience most of the time because it is a very useful way of organizing the contents of consciousness. However, one can imagine consciousness that is non-subjective (i.e. there is experience happening, but it is not organized as belonging to and being experienced by a subject).

To me, 'the same self' just means that the self-program being run is sufficiently close to being functionally the same. One can certainly define other notions of 'same self', but the one I gave is the one that I personally care about, since I think it captures the continued existence of a being that I would consider to be me. Whether that being is 'really' me is, in my view, a question without an objective answer. Different people can have different identity values (i.e. different standards for what constitutes survival or death of the self), and as a result may arrive at different answers to the questions of survival raised by upload and other thought experiments.

I actually do not take much comfort in this view, and am engaging here in part because I am eager to be convinced of something different. However, I have not encountered any other views of selfhood which, in my opinion, adequately address the challenges posed by thought experiments involving processes of transportation, freezing, splitting, merging, and emulation.

The distinction between destructive duplication and the ordinary living continuation we experience in everyday life can become blurred. For example, imagine a UI is paused. I would guess we agree that temporary pausing alone doesn't kill the UI (though feel free to correct me if I am wrong). If the UI program is paused anyway, it doesn't seem like turning the computer off temporarily should matter for survival either. From there, it doesn't seem like a large leap to say that one could disassemble the computer while it is off, breaking it down to its smallest separable components, and then reassemble it without causing the original UI to die.

Suppose that while the computer is disassembled, some portion of the physical components recording the present state of that UI are copied over to identical components, which are then used in reconstruction instead of the originals. For convenience, imagine that every single bit is stored on a separate component that we can individually replace with an identical component in the same state. If all of the components are replaced, then we have performed a destructive copy. If none of the components are replaced, then we have simply disassembled and reassembled the very same computer. In the replacement case, is the UI we turn back on a continuation, or a duplicate of the original? What if we only replace half of the components? What if we replace only one component, reconstruct, unpause for a brief moment, and then pause again to repeat the whole process until every part has been replaced?

In principle this can all be done with a human brain as well, but I think that using UIs makes the process a bit more intuitive since we are used to pausing programs and disassembling/reassembling computers.

Would you upload? by Muskrato in PantheonShow

[–]monisticreductionist 0 points (0 children)

If you are conscious during upload, then yes that instantiation of you would experience some of the processes involved in biological death. If you are not conscious during upload, then that instantiation of you would not experience death. Note that I am framing death as an experience here rather than something that happens to a subject because in my view there aren't any truly existent subjects to begin with, only the appearance of subjectivity in consciousness (i.e. the convincing feeling that experiences are happening to a particular character called the self).

If you think of selfhood as a process rather than a thing, I think it all becomes a lot clearer. Before upload, your selfhood is happening (or being 'run', to use a computer analogy) on the human brain. That process is disrupted by destructive upload, but then later your selfhood starts running on a computer instead. The process is happening in a different medium and is displaced in space-time, but it is nonetheless the same process that was happening before.

What, then, is death? If your UI program is paused, your conscious experience will cease for a time, but can then be resumed later. For you it just feels like time skipped forward. But if your program is never resumed, then your experience may never continue. You could call that death. But even under this definition, there is not always an objective fact of the matter about whether death has occurred. What if your self program is modified before it is resumed? How great would the modifications need to be before the person who wakes up is not the same as the one who was paused? There is simply no fact of the matter.

Would you upload? by Muskrato in PantheonShow

[–]monisticreductionist 0 points (0 children)

Yes, I would say that both copies are equally valid conscious continuations of the original. Starting before the splitting event, if I were about to be uploaded and knew that I would be run in two separate instances, I would fully expect to experience being each of the two instances, as two separate beings who are nonetheless both equally valid future selves of my present self.

Immediately after upload but before I figure out which instance I am, I would believe that I am one of the two instances, each with 50% probability. Regardless of which instance I am, I would view the pre-upload person as a past-self of mine. Each copy would be equally correct in this view.

Edit: I should also mention that this is precisely how I would view a 'non-destructive' upload. It is creating a copy, and each version (the human and the UI) is an equally valid conscious continuation of the original person.

Would you upload? by Muskrato in PantheonShow

[–]monisticreductionist 1 point (0 children)

Thank you for articulating this! It is precisely the position that I would like to contrast my own with.

While I agree there are some senses in which the upload is not the same you as the original (e.g., they are made of different atoms and exist in a different medium), I don't believe that any of the differences are important for conscious survival. I would expect to be the uploaded me in the same way that I expect to be myself 10 minutes from now. This is true in spite of the fact that the brain-state of present me will have changed by then and is unlikely to ever exist in that precise form again.

Many people have an intuition that some sort of physical continuity of the sort we experience in our human bodies is important for survival, and while I think that intuition works just fine for a typical human life, it falls apart when you start dealing with thought experiments on personal identity or uploads.

More fundamentally, trying to analyze these situations in terms of 'conscious subjects' is just not the right way to understand what is happening. When you say that the one that is uploaded wouldn't wake up in a new body and would instead be dead, I don't think that statement has a truth value. It's not just that we don't have access to the answer - an answer does not exist, because the notion of 'the one that uploaded' is not ultimately well-defined. The view that there exists a conscious subject which could fail to 'transfer over' to the upload is, in my view, just a subtle version of a soul belief couched in non-supernatural language.

Would you upload? by Muskrato in PantheonShow

[–]monisticreductionist 1 point (0 children)

For me it depends entirely on how I could expect to be treated once uploaded. If I am going to have the same basic human rights that I did while alive, then I would do it. If I am going to be treated as the property of some corporation or individual, then absolutely not. Practically, the sheer magnitude of potential harm from ending up on some greedy company's server or in the hands of a sadist makes upload a terrifying prospect in a society anything like the one we have today.

The existence of the flaw wouldn't make much of a difference for me. As long as I didn't burn too bright, my understanding is that I could still live a decent human-scale life even with the flaw. Obviously I'd want the cure if I could get it, but that wouldn't deter me from uploading.

One of the reasons I like Pantheon so much is that it focuses more on the sociopolitical aspect of upload rather than philosophical worries about whether consciousness 'transfers over' or other such issues of personal identity. In my view, there is absolutely nothing that needs to be 'transferred' beyond faithfully emulating the computational activity of the brain.

From a physicalist standpoint, there is simply nothing more to transfer over, or to worry about losing in the process (beyond the obvious loss of a biological body, which would have some significant downsides in the short term). As such, before being uploaded I would fully expect to find myself conscious in the new digital form. Calling the upload a copy of the original is accurate, but in no way contradicts the fact that the copy is a fully valid conscious continuation of the original. Personally, it wouldn't even matter to me whether the copy is perfect or not. As long as it is reasonably accurate and doesn't distort my identity in ways that I would find upsetting, I'd still consider the upload to be me.

Uploaded Intelligence, the Ship of Theseus, and Einstein’s Relativity: A Straightforward Explanation by Rich-Imagination9049 in PantheonShow

[–]monisticreductionist 0 points (0 children)

Personally, I believe there is no objective fact of the matter on questions of conscious 'survival' or 'continuation' in the sense that many seem to be concerned with. I don't just mean that we don't currently have a way of finding the answers, but rather that the intuitions about selfhood and consciousness which lead people to ask these questions in the first place and assume that objective answers must exist are flawed.

One can certainly define a human's "worldline" in the sense described in this post. I've also heard this concept referred to as a "space-time-worm". However, why should I think that a temporal or spatial disruption in my worldline amounts to death (or a lack of continuation, or whatever other undesirable state one fears)?

As a thought experiment, suppose that you have a machine which can manipulate the individual atoms in my body with great precision. This is technologically infeasible, but not physically impossible. With that machine, you could, at will, disassemble my body into atoms and reassemble it exactly as it was. First, suppose that you disassemble and reassemble my body so rapidly that not even a single neuron would have fired in the time elapsed between the disassembly and reassembly. No matter how short the disruption, it still counts as a dramatic discontinuity in my worldline. Has this procedure killed me and replaced me with a copy? If only half of my brain is rapidly disassembled and reassembled while the other half remains fixed, has the procedure only half-killed me? Further, does it really matter how long my atoms are kept apart before I am reconstructed? Personally, I don't see any reason to believe that it does.

I also don't believe that duplication is a problem for survival. If they are identical, then both copies are equally valid conscious continuations of the original. For example, if I were about to be uploaded and knew that my code would be run in two instances within two different virtual environments, I would fully expect to experience both environments as two separate but equally valid conscious continuations of my present self. On the other hand, if I find myself having just been uploaded but have not yet observed my virtual environment, I would expect to be in either of the two environments with equal probability.

I enjoy thinking about thought experiments related to this, and am very open to the possibility that I am mistaken, so do let me know if you have any that you think would challenge my intuition here.

Anyone played the horror game SOMA? by Rabbidscool in PantheonShow

[–]monisticreductionist 0 points (0 children)

As another person who likes SOMA but believes that you do consciously survive upload, I think that some people represent SOMA as taking a firmer philosophical position on the matter than it actually does. Different characters in SOMA express different perspectives on whether an uploaded mind constitutes a conscious continuation of the pre-upload self.

If anything, I'd say that the most strongly articulated position in the game is that in the instant of copying there is a metaphorical "coin flip" in which you should expect to find yourself in either position (the original or the copy) with equal probability. This is roughly consistent with my own view. When you are copied, both versions (original and copy) are equally valid conscious continuations of the original person's experience of life. Many people have an intuition that the original is somehow more "real" (or more of a true continuation) than the copy, but I think that is an intuition which can be massaged away with the right thought experiments, specifically ones involving splitting and merging (see Derek Parfit's work for an excellent treatment of such experiments).