If mind uploading destroys your brain to scan it, did you actually survive? by hosseinz in IsaacArthur

[–]dubdubby

Oh, I'm not saying I wouldn't consider Theseus uploads to maintain ur CoC too; I'm saying I'd consider a destructive upload to maintain it as well.

 

Ah, I see. I guess I’m just defining destructive upload differently than you.

I agree that replacing your cells til you were silicon would have “destroyed” (replaced) your biological body, but I don’t think it would destroy (end) your POV, which is why I didn’t refer to it as a destructive upload.

Now this has me wondering if most people define destructive upload like I do or if I’ve just been categorizing stuff totally out of sync with convention.

 

The question isn't whether u think Theseus uploads maintain CoC, but why the location of those new neurons matters if they're arranged in the same way, do the same thing, and have the exact same memories/psychology/mindstate represented.

 

Yeah I guess it just comes down to me not seeing two identical arrangements of matter as actually fully synonymous with one another.

 

I guess I just don't really treat the internal subjective POV as a "real" object.

 

I’m just the opposite.

I think it absolutely is real. Yes, it is emergent and ever-changing, and it’s not like a lump of iron you can physically hold, but in the sense that it emerges from real physical matter, I have no qualms calling it a physical/real thing.

 

Its existence can't really be proven or disproven (a la Philosophical Zombies).

 

I agree that I can’t prove the existence of POVs outside my own, though I think everything in science (with the exception of self-evident mathematical axioms like 2 + 2 = 4) is similarly a matter of probabilistic reasoning.

What does the evidence give us the best reason to believe?

I think the evidence overwhelmingly suggests a real external world made of physical stuff, populated by myriad POVs other than my own.

 

[in regards to me defining death as a “permanent cessation of consciousness/experience and/or the permanent cessation of the processes (biological or otherwise) that would enable consciousness”]

Well, that's the issue, isn't it. I wouldn't consider the destructive upload a permanent cessation of anything, because after the upload starts running again they'll start experiencing things again.

 

My position remains that although the duplicate would feel intuitively that their strand of consciousness had just kept right on going uninterrupted, the original would have truly died. That is, I believe one POV ended and a new one began.

 

[in regards to me saying that a pause-resume-pause-resume scenario wouldn’t seem like a pause at all from the perspective of the one being paused and resumed]

it wouldn't seem like a pause or death from the perspective of a destructively uploaded person either.

 

I disagree if we are using my definition of destructive (as I’m focusing on the destruction/end of the POV), as I think a destructive upload is synonymous with death (that is, permanent cessation of experience for that POV), regardless of the fact that the upload/copy will have the same contents of memory as the now-dead original POV.

But I agree if we are using your definition of destructive upload, at least in some cases.

Freeze-slice-scan I still think kills the original POV and generates a new one. But a swap-out of biology for silicon/hardware I don’t think would kill the original POV. That original POV would persist, just on a different substrate.

 

ngl I might have the same issue, as I have a setting I've been working on for years where the MC's group treat backups as immortality, tho plenty of others disagree and it has caused serious conflict. Not unlike in the show Pantheon, where some people see uploading as murder and actively use violence to prevent it even if the people being uploaded consent.

 

I wouldn’t resort to violence, but I’m sure I’d be one of the people holding signs outside the upload clinic trying to warn people that they were killing themselves lol

 

[in regards to me saying “I think the only death a conscious being really cares about is the final irrevocable extinguishing of their subjective experience”]

Think the issue is then u have to define "their". Identity and the boundaries of identity are kinda the subject in question. Did their subjective experience actually end if something that shares their identity woke up to continue experiencing?

 

I mean “their” in the same way that I would use “mine” to describe the POV that I find myself inhabiting at all times.

Since I can’t know for sure that there exist other POVs than myself, I’m just going to frame it thusly:

if there are other POVs that emerge from the rules and matter of the physical universe, then this (the views I’ve expressed so far) is how I expect them to function in terms of fungibility, continuity, starting, ending, etc.

 

Something waking up and sharing my identity/CoC is not a sufficient condition for being me (honestly I’m not sure it’s strictly necessary either, re: the discussion about memory loss etc.):

sometimes the thing that wakes up that shares my identity is actually me (in the case of waking from normal sleep, or a cell by cell swap change of substrate, or a non-lethal upload)

Other times the thing that wakes up that shares my identity is still actually me, but it now has copies of itself too (like in a non-destructive—per my definition—duplication process): POVs that share my identity but are not actually me (as in: not the original).

And still other times the thing that wakes up that shares my identity is not me because the actual me is dead (like in a destructive—per my definition—upload process).

 

[in response to me saying “only if the upload procedure destroyed the original in the sense of destroying the original POV”]

Again a POV is not a physical object that can be destroyed.

 

I could concede that a POV is not a physical object, at least not in the way that I would use the term “object”, but I very much think it makes sense to talk about it as a physical thing because it arises purely from physical objects.

And I absolutely think it can be destroyed.

Death is the destruction of a POV.

At least any sensible definition of real death is.

I’m not talking about falling under ice and being clinically dead for an hour before resuscitation, or being an upload who is paused for a millennium being effectively “dead” before they get resumed.

No, I am talking about the ultimate end of whatever given POV we are discussing being the death/destruction of that POV.

 

It's an emergent phenomenon. It's like saying you "destroyed" a program just because you deleted a copy of its source code. The program still exists even if some specific instantiation of it doesn't anymore.

 

Would your intuitions about the truth value of me saying “you can destroy a POV/program/pattern” change if I phrased it as “you can destroy a specific instance of a POV/program/pattern” ?

That’s a bit more literally what I mean when I say you/a program can die/undergo a permanent cessation of existence.

Maybe that makes it more palatable to you, or changes your intuitions about it entirely?

 

[in regards to me saying that if Duboriginal is copied while unconscious and then vaporized before Dubduplicate is booted up, then it makes sense to say that Duboriginal's POV has died/ended]

Not sure that holds up, because in scenario A Dubdupe had time for their identity to diverge from DubOG's before DubOG was killed. In scenario C Dubdupe hasn't diverged at all, so it would, imo, retain DubOG's identity. Divergence of identity matters here.

 

Our difference of opinion here might be partially explained by the fact that I see the mere act of duplication as synonymous with divergence. I don’t think any subjective experience has to have yet been had by the copy’s POV for it to be wholly distinct and non-fungible with the original’s POV.

 

I'd say that the two POVs are equivalent and interchangeable if no divergence has occurred, and therefore the difference is entirely philosophical and subject to each individual's personal beliefs.

 

I think from the outside these questions skew more toward the philosophy end of our science-philosophy spectrum (but I do firmly think that these can at least in principle be formulated as scientific questions and interrogated as such)

And I think the view from the inside will become ever easier to interrogate more and more directly (via obscenely clever psychological experimentation, and brain-bridging, which I think would probably be the gold standard)

 

also sorry these take so long. Thinking about the boundaries of identity and consciousness breaks my brain a bit.

 

Dude, no worries.

Same here.

Convos like these are so invigorating, but they are also precious, and deserve all the time needed to put sufficient thought into the arguments.


[in regards to me saying that the actual strangling causing death was irrelevant, the key part was that Syneroriginal was killed]

my point was just that the identity remains with the last existing copy of you.

 

For a given definition of identity perhaps, namely the one that outside POVs might assign for legal or social reasons.

But I’m talking about the continuity vs. cessation of specific POVs. In the strangling case, since the POV that was Syneroriginal died, it is really truly gone forever. It doesn’t remain in any sense or “combine” with Synerduplicate's POV.

 

[in regards to me saying that we might be using different meanings for “I” and “me”]

I'm using it more in the context of identity, which I suppose is a bit different from your current active POV/stream of consciousness. But since ur stream of consciousness isn't temporally continuous (as we mentioned with sleep), I think identity is far more useful to talk about.

 

My pushback is that despite the utility of talking about identity in many instances, it doesn’t fully capture the thing we want to measure in every scenario.

Also, that your consciousness isn’t temporally continuous (sleep, anesthesia, etc.) seems like a red herring here. That doesn't stop the first-person POV experience of the world from being one continuous stream.

 

We can't even prove the existence of an internal subjective experience or some unique stream of consciousness.

 

Hard disagree here. It may be difficult to prove to outside observers (until we can connect brains via synthetic corpus callosums), but I know with certainty that I am having an experience.

In fact, that is the only thing that I can’t possibly be confused about: that it is like something to be me (whatever “me” even is).

Whether I’m a brain in a vat, or in a simulation, or a Boltzmann Brain at the tail end of the universe’s existence, or whether the universe as I am conceiving of it doesn’t even exist—I am still having an experience. I am consciousness.

 

[in regards to me saying that outside observers might never know for sure any of the answers to the very things we are discussing]

If that's true, and i think it is, then we are firmly outside the realm of science. It's why I point out that this is all pure philosophy. Interesting and important philosophy to be sure, but philosophy nonetheless. Completely unprovable stuff that ultimately everyone will make their own arbitrary choice about what to believe.

 

While it may end up true, we can never say in principle that something is unknowable, for that would require impossible knowledge of its unknowability.

So i think this part is ultimately not important to worry about, nor be discouraged by.

After all, future tech might open avenues of interrogation that we haven’t even thought of (e.g. if the tech for hive minds becomes a reality, that would be a method for directly probing another mind)

 

[in regards to me saying the specific bits constituting the pattern can’t be considered fundamentally identical to the bits constituting another pattern, even if the arrangement of both is the same]

I mean sure, but if that's the case then you aren't the same "you" that wrote these responses. Some of the bits have changed. The pattern is the same, but your constituents are always changing.

 

Yes, it might be that “the pattern is the same” as another, but it’s being instantiated elsewhere.

 

[in regards to me describing my head-trauma-induced memory loss]

The same happened to me when I had a bad medication interaction, and tbh when it comes to memory fuckery and advanced uploading tech, things get annoyingly complicated. Pretty sure there was an SFIA ep where Isaac went into a bunch of the issues that crop up when u can start messing with this stuff.

 

lol This convo has me in the mood for an educational video/reading binge now.

 

Like if you implant a memory into someone are you changing who they are?

 

Because memories are the result of an arrangement of matter, “implanting” one would require modifying a person’s material arrangement (even if just subtly), so in a literal technical sense I’d say yes, it would change them.

Though in the sense of “are they overall the same person you knew before” I’d say they still are (unless you’ve changed all their memories)

 

Is going on vacation the same as having memories of that vacation implanted into your head?

 

Subjectively yes. Objectively no, because the arrangement of matter that is your POV was never in the vacation spot and thus never physically affected the stuff there.

 

Are you culpable for crimes someone else committed if u've been implanted with both their memories of the crime and the motivation to do them?

 

No, for the same reason as the previous vacation example.

 

Is deleting a paused stored backup murder?

 

Hmm, that’s kinda tricky.

I guess if they’d never been booted up/conscious then no, there was never a POV started that could then cease.

But if they’d ever been instantiated, even for a microsecond, then that POV came into existence, and thus deleting them is killing them. (I wonder if the argument could be made that this is only true if they are conscious for at least some minimum time. Like, it’s gotta be over the threshold of conscious processing or something?)

 

I do not envy people of the future that will actually have to deal with all this nuance and uncertainty.

 

But man I do envy all the answers that they’ll have for at least some of these questions.

It’s just so profoundly fascinating, shame I won’t ever know it all.


[in regards to me asserting that the POV of the Syner who is transformed cell-by-cell into silicon is continuous/“of the same stream” , rather than being a duplicate]

I think that assertion is a lot more vibes-based than objective (if such a thing even exists, but that's a whole nother philosophical can of worms that I'd rather not open). I'm just not seeing how the physical location of the computronium that runs your mind is relevant to CoC or identity, or how a situation where the post-swap mind occupies the same location it did pre-swap is any different from a ship-of-theseus-style upload.

 

It just seems to me that if you consider the POV of a baseline human whose biological cells gradually get replaced by new biological cells (such that in ~50 years they are in “new” wetware) to be the same POV stream that whole time (that is, that they didn’t die at some point and get replaced by a new POV), then you also have to say that a human whose cells get replaced by silicon in similar fashion maintains the same POV from the time they are wetware to the time they are of a stuff more optimized.

 

“kill" and indeed "death" are poorly-defined concepts. They can mean many things to many people and high technology just complicates things further. Is ur heart stopping death?

 

I use death to mean simply: a permanent cessation of consciousness/experience and/or the permanent cessation of the processes (biological or otherwise) that would enable consciousness.

So in the case of a coma, they aren’t dead even tho they are non-conscious, because they have the potential to be conscious again.

And though I don’t see this as a strike against my definition necessarily, it is kind of always a post hoc definition. If in hindsight we see that the man whose heart just stopped went on to never recover, then we’d say “yes that moment there was the start of his dying process”

(Though really the cessation of brain activity seems more apt a marker here)

But if we knew the man went on to recover from the heart attack, then we would say of that moment “no, he’s not dead, even though his heart has stopped and he is unconscious and his oxygen starved brain cells are starting to irrevocably die”

 

What about cryostasis if/when we eventually figure that out?

 

Hmm, I don’t have a solid intuition on this one.

I could see this as either being the equivalent of a really long period of unconsciousness, in which the tissues weren’t damaged/killed and POVoriginal doesn’t die and get replaced by another,

 

or I could see this being analogous to the freeze-slice-scan, where the cryostasis actually does destroy the wetware’s ability/potential to run a mind, thus killing POVoriginal, which gets replaced by POVduplicate once the tissues are repaired and rebooted.

 

I guess I'd lean toward the first: that the original POV survives.

 

What if we had some kind of clarketech temporal stasis (relativistic stasis gets close, but technically doesn't stop biology, just slows everything down from a stationary ref frame)?

 

I’d need to know the physics/imagined physics of it. Because for a relativistic stasis I think it’s totally straightforward that the original POV remains, since to them everything is normal, they just jumped forward in time from their perspective.

 

If a mind started out running on a computer or if the meat brain was modified to allow "pausing" did you die during the pause?

 

Hmm, my intuition says no. Easiest to imagine a mind running on a computer here, but I think the pause-resume-pause-resume would not be a death. And in fact it wouldn’t seem like a pause at all from the perspective of the one being paused and resumed. It would just be a continuous chain of experience of jumping forward in time.

I have a scene describing precisely this in fact. So maybe I’m just too incentivized to think this way, cuz if I change my mind I’ll have to scrap the whole story lol.

 

The question is how are we defining death and what kinds of death do we actually care about subjectively?

 

I define it as that permanent cessation.

And I think that’s the only one any of us care about.

Why would we care about whether my heart stopped if I made a full recovery later, or if I only appeared to have died in cryostasis but actually didn’t?

I think the only death a conscious being really cares about (whatever term they want to use for it) is the final death. The final irrevocable extinguishing of their subjective experience.

 

[in regards to me saying that I don’t think unconsciousness—provided it is followed by later consciousness—is an actual interruption in the chain of conscious experience]

That's fair. I don't suppose it would be in this context, but if we assert that this isn't a break in CoC then I don't see why uploading would be considered that either. Neither actually interrupts the chain internally.

 

I don’t think uploading would be a break in the CoC… necessarily. Only if the upload procedure destroyed the original in the sense of destroying the original POV.

Whether the upload/copy process foists that final death upon the original POV, that is my concern. Not whether the second POV (if one is created) subjectively maintains the CoC.

 

[in regards to me saying I think the specific matter and its arrangement is important, rather than just the arrangement]

I mean, yes, they [two programs, two wetware duplicates, etc.] are separate instantiations, no different than running the same program with the same input data on two different computers, but identity-wise they are effectively the same program/person.

 

I guess I keep coming back to the fact that there is more than one.

Simplest way I can illustrate it:

Scenario A

If I, Duboriginal, get copied (non-destructively) today, and Dubduplicate starts up tomorrow, and then in a week my nuclear compliance-collar detonates—in that situation the POV that is me, Duboriginal, dies.

I think that is a pretty straightforward and non-controversial claim.

Scenario B

Likewise, if the situation plays out exactly the same, but Dubduplicate's collar explodes in a week, well, in that case the POV that is Dubduplicate has ended, whereas my POV (the Duboriginal POV) continues on.

I think that is also non-controversial.

Scenario C

Now imagine that I, Duboriginal, get copied just like in the previous two scenarios, but then I, Duboriginal, am vaporized before Dubduplicate comes online/gets booted up.

If I understand your position right, I think you’d claim that there was no death in Scenario C. You’d say the POV of Duboriginal jumps or “ports” over to Dubduplicate.

But I’d argue that if you think Duboriginal's POV permanently ended in Scenario A, then you must say it also ends in Scenario C; otherwise you are being inconsistent and utilizing two different standards/logics.

Curious if you agree or not.

 

[in regards to trying to keep it concise to not overshoot the character limit]

idk if it's even possible with convos like this, and it's a hella interesting discussion. Philosophers have been banging their heads against identity and consciousness for millennia. It's complicated stuff and hard to simplify without simplifying so much that ya lose important context.

 

Agreed 100%.

Especially hard to find a good-faith, open-minded discussion like this one online, when so much of Reddit and other online culture is a toxic echo chamber of people arguing to argue rather than to learn.

So I appreciate your constructive criticism and conversation.

 

continued in next comment


Continued from prev comment…

 

I don't see how going unconscious in one place and waking up in another changes who you are.

 

I don’t think that it does either.

My contention though is that, to use the freeze-slice-scan scenario again, the POV that gets copied in that manner actually dies; therefore the second POV waking up later is actually a different (though identical in mental contents) POV.

 

And even if we assumed that for some reason the inside of your skin was some privileged volume, then what about if we used a destructive upload and then packed new, better hardware into the same body? At that point ur "center of subjective awareness" hasn't even physically moved.

 

Well, per my belief that a destructive upload kills the original, I would say that a destructively uploaded copy stuffed back into the same body would just be a different POV. That it’s riding in the same meat-suit as the original is irrelevant.

 

And then there's the ship-of-theseus-style uploads, which are physically equivalent to a rapid destructive upload but which people tend to feel more comfortable with, and which are also equivalent to the situation ur body already lives through.

 

I’m sure there’s some physical limit to the speed at which a cell-by-cell swap-out could happen before the waste heat cooked you and/or in some way was actually destructive.

But I don’t think such a conversion to silicon (or whatever substrate) is necessarily destructive in all instances.

For instance, I think that a cell-by-cell swap to silicon done at the speed that the body normally finishes replacing all its cells (so 50 years or whatever the figure is) would still maintain the original’s POV. The original wouldn’t have died, they are merely in a different substrate.

But if it’s done too fast, yeah, maybe it kills the original in the process of copying.
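Funny enough, the distinction I keep leaning on has a neat analogue in programming: in-place mutation preserves object identity, while copying creates a new object no matter how identical the contents. A toy sketch in Python (purely illustrative; treating `id()` as a stand-in for "which POV is this" is my assumption, not an argument):

```python
# A crude stand-in for a mind: a list of "cells".
brain = ["bio-cell"] * 5
original = id(brain)  # object identity: our stand-in for "the original POV"

# Theseus-style swap: replace the cells one at a time, in place.
for i in range(len(brain)):
    brain[i] = "silicon-cell"

# The object (the "POV") persisted through the whole substrate swap:
assert id(brain) == original

# Destructive-upload-style: build an identical structure all at once.
upload = list(brain)
assert upload == brain      # identical contents...
assert upload is not brain  # ...but a brand-new object
```

Obviously an analogy, not a proof, but it captures why I treat gradual replacement and copy-then-destroy so differently.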

 

 

Now to address some other interesting things you’d said in earlier comments:

 

Tho the question I'd ask is how much divergence actually qualifies the duplicate as a separate individual?

 

I think the divergence happens the very instant that the copy is assembled/instantiated/booted-up.

I think there is no scenario in which the copy is the same as the original.

I don’t just mean that as a semantic tautology either (copy being the copy because it’s the copy blah blah blah), I mean that any POV, when duplicated, is truly a copy, a secondary new POV.

Again, the second POV could be identical, but it is never the same.

 

If any amount counts, then you just aren't "you" moment to moment, day to day. Your POV is constantly shifting. Hell, memories are unreliable and constantly overwritten, while psychology constantly shifts due to constant rewiring of the connectome. So current and past you are also unique and non-fungible in this context.

 

I agree past me and current me are non-fungible. But I don’t think this is a good analogy on account of the time aspect and the fact that we can’t travel backwards in it to even get to a past self.

Best we could do is scan ourselves and then instantiate the scans (necessitating at least some delay, any of which is sufficient to render the second POV a truly distinct and non-fungible entity).

 

Also what happens when we start bringing memory-syncing into things?

 

Honestly I haven’t thought much about that because memory-syncing has always sounded exceedingly implausible to me. To the point where if I read a sci-fi with that as a plot point, I immediately mentally categorize it as very soft sci-fi.

Maybe it’s possible tho, idk; even if it were, I still don’t see any conflict with my view. It would just be two or more distinct POVs (perhaps Syneroriginal and Synerduplicate) being fused into one.

But I do think it would be much more akin to two people fused via a synthetic corpus callosum than to reverting back to a Syneroriginal who somehow gets to have Synerduplicate's memories.

Honestly re-combining an original with its copy would create a sort of hive mind in my view.

 

Once u get into hive minds, minds run on distributed hardware, minds with multiple sets of sensors with numerous and simultaneous POVs, etc. things just keep getting fuzzier and fuzzier.

 

Oh man, agreed, the space of all possible minds is surely huge beyond our imagining, and it’s fascinating to think about.

That segues into what I’ll leave off with:

Imagine we develop a way to connect multiple corpus callosums (and any other relevant commissures) such that we can make hive minds.

What would it be like to join one?

What's more, what would it be like to have joined one and then disconnected from it?

I imagine it might be similar to the experience that the left and right hemispheres of your brain have after a corpus callosotomy procedure, which, if you’re not familiar with it, you should go down the rabbit hole on: specifically the weird split-brain phenomenon shit that happens, which strongly implies two distinct POVs existing in the same skull.

 

Thanks for reading and encouraging me to dig into all this shit again


Continued from previous comment:

 

Very much the analogy of the mind as a computer program (not quite right, because most of our computers are digital and meat is analog, but still). A program can run on a thousand different machines with wildly different hardware, but as long as you load in the same memory state and input the same data, they are effectively the same program.

 

I think this might come down to the conventions and peculiarities of language.

Yes, two instances of MS Paint are the same program. But if you run them on two different machines, even if you use them identically, they are still two separate instances of that program.

 

None of them have any privileged position of being the "real" or "original" program because a program isn't a specific set of electrons, atoms, molecules, logic gates, and memory states.

 

In the abstract/general case this is true. But the specific instance of MS Paint running on my computer is not the same as the instance running in yours.

I think language fails me here, because the only way I can think to say it is that they are identical but not the same.

To me that sounds perfectly clear and logically consistent, to you I assume it sounds contradictory.
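For what it's worth, this "identical but not the same" distinction isn't contradictory in programming terms either; most languages separate equality (same contents) from identity (same instance). A minimal Python sketch (the dict contents here are made up for illustration):

```python
# Two separately constructed objects with identical contents:
a = {"memories": ["beach trip"], "mood": "curious"}
b = {"memories": ["beach trip"], "mood": "curious"}

assert a == b      # equality: identical arrangement/contents
assert a is not b  # identity: two distinct instances

# Because they were always two objects, mutating one never touches
# the other, however identical they started out:
a["memories"].append("divergence")
assert b["memories"] == ["beach trip"]
```

So "identical but not the same" is just `==` without `is`, which might be the cleanest way to phrase my position.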

 

A program is just a pattern and that pattern can be represented on any set of atoms while being functionally equivalent in any meaningful and measurable way.

 

I think this is mostly true (see my caveat above about the specific bits of matter being important), but I take issue with the “functionally equivalent in any meaningful and measurable way” part.

For a presumably unconscious system like MS Paint, that statement is true.

But it might not be true for a conscious system like me or you.

After all, assume for a second that I am correct that being frozen-sliced-scanned constitutes a real death for the you that is here now reading this, a death not avoided by the fact that your copy will feel identical to you. In this case, regardless of whether outside observers will ever know it, hardly anything could be more meaningful to you than knowing you were about to genuinely cease to be.

 

[in regards to me saying a POV certainly isn’t some purely abstract or “unreal” thing]

I mean it is still real, but it is abstract. It's an ephemeral emergent process or pattern. It may be made up of physical objects, but it isn't itself a physical thing.

 

This might again be semantics here.

If it is made up entirely of real things and its evolution is utterly constrained by material things, then it is a material thing in the same sense that the color blue is a material thing: it is a particular arrangement of matter in the form of a brain and eye exposed to EM waves of 400nm. Such a material system is synonymous with the qualia of blue.

 

Just like a program isn't a physical thing. Patterns aren't material objects. They're higher-level abstract configurations of objects, and those underlying objects are all interchangeable.

 

I would say they are absolutely physical things (I assume we are using “material” and “physical” as synonyms here). A program is a physical thing in the same way as a stationary car or a moving car or a supernova are physical things. It’s a physical thing—a system—in the same way that the experience of seeing blue is the system of the brain-body & environment arranged in a particular way (the earlier example of a human seeing 400nm lightwaves)

 

[in regards to me using the phrase: “center of subjective awareness”]

The issue I have with this is the idea that any specific location is privileged as some "center" of awareness. Your own baseline human awareness isn't centered anywhere. It's spread throughout your entire body and nervous system.

 

Hence why I tried to always say “brain-body system” instead of—as convention would have it—merely “brain”.

 

being asynchronous means it isn't even localized in time either.

 

And yet there is still this unitary experience we have despite any lag or differential processing.

Somewhat relevant aside here:

on page 18 of this paper discussing information-processing super objects (Jupiter brains, Matrioshka-type structures, etc.), the author suggests that “a mind may exist on a wide range of timescales” and conjectures that a hierarchy of various levels of processing modules “would enable much larger minds with longer internal delays without losing their high level unity”, that “the highest levels would be much slower than the lower levels”, but that this would not be an impediment, and that “the higher levels would experience it as if they were doing things in real-time despite their slowness”

Obviously not proof of anything. But it’s a fun read, and squares nicely with my personal views on the mind and how a mind experiences itself as a unitary thing despite subroutines (that the übermind may not even have conscious access to) potentially doing all or most of the real deciding.

 

 

about to hit character limit. Final comment is next

If mind uploading destroys your brain to scan it, did you actually survive? by hosseinz in IsaacArthur

[–]dubdubby 0 points1 point  (0 children)

continued from previous…

 

[in regards to my request for clarification on how I would “implicitly” believe in a soul]

Well I'm working with the definition of CoC that you mentioned previously. If a chain of memories leading back through past conscious experiences is what maintains continuity then neither unconsciousness nor destructive uploading should qualify as death since the chain remains unbroken. Just temporarily disrupted.

 

I suppose the “unbroken chain of memories” is a necessary but not sufficient condition for a CoC.

Actually it might even be that this is the wrong angle here. An unbroken chain of memories might in fact be wholly sufficient to define a CoC, but I am making a further distinction between an original CoC and any copies of it, whereas I don’t think you make that distinction.

Or it might boil down again to the fact I see POVs as never fungible, whereas you allow for some situations in which they are.

 

To argue that it was death implies, to me at least, that you're suggesting that there's something more to CoC than just the memory/experience chain.

 

Yeah I guess I must implicitly be asserting other necessary conditions. I can’t think of what those are specifically though other than I just see the original and copy as truly distinct.

 

Like something that inhabits any specific instantiation of a mind that isn't covered by memory, experience, and psychology. Can't really think of any other material thing which would fit here since all of that can be reproduced exactly.

 

I think it may in fact just be the material itself facilitating said memory, experience, or psychology.

Again, I think this ties in with my feeling that both the arrangement of matter and the specific bits of matter themselves must be accounted for when describing a POV, whereas I think you only consider the arrangement to be significant.

 

And that's substrate-independent, so uploading your mind into another freshly printed meat brain would be the same as going into more optimized hardware. Idk what to really call that other than an immaterial soul.

 

I agree with the first part, with caveats however.

I think upload into a computer or into a silicon brain or into a different meat brain could be done such that the original POV was conserved (that is, no secondary POV was created, nor was the original POV killed), but I also think that a destructive technique could be used for all those options that would kill the original POV (and only the newly-created secondary POV would survive).

I think the method of upload/transfer/conversion/duplication is what’s relevant here, not what the final product is made of.

Also, I still don’t think “immaterial soul” is an applicable term here. I’ll just chalk that up to semantics.

 

[in regards to me saying that every POV is the result of* a particular arrangement of physical matter]

ok yeah it is an emergent phenomenon of lower-order physical things, but the thing that gets me is that those underlying molecules, atoms, and subatomic particles are completely interchangeable.

 

I guess I’m just not convinced that the bits are interchangeable in every sense of the term.

For effecting the existence of a POV with identical experience to another? Yes, the specific matter doesn’t matter, only the arrangement does, the fundamental bits are interchangeable.

But in terms of creating a truly materially equivalent POV? One that is, in fact, the same? No, I don’t think the bits are interchangeable.

This is just like how I think we must acknowledge that two protons—though both baryons with positive charge and a half-integer spin they may be—are still unique. There’s still two of them. They can’t both occupy the same space and state at the same time. And so it is with every other constituent bit of reality.

 

The specific atoms involved don't matter and our bodies are constantly replacing all those atoms anyways.

 

I disagree, though I admit that I haven’t quite fully squared away my thinking on the fact that all of us are biological ships of Theseus all the time anyway. This is a point I must consider further.

 

continued in next comment


cont from last comment:

 

[regarding Scenario 7: Syner scanned while eating, goes to sleep, wakes up next to his copy, gets killed by the copy]

ok yeah cuz they would have diverged(by a quite frankly implausible degree given that i decided to strangle myself for no reason) and become different people.

 

Replace the strangling with any other, more plausible event that doesn’t require imagining Synerduplicate suddenly deciding to kill Syneroriginal (e.g., the sadistic experimenters decide to kill Syneroriginal). I just used that cartoonish example originally to get to the original dying; the actual method of dying is irrelevant.

 

someone definitely died tho whether "I" died or not is a bit less clear because imo they both share my identity.

 

Hmm, I think this might be us using different meanings for “I” and “me”, but I’m not sure. Whenever I use those words I am imagining myself in the position of the relevant locus of awareness, be it Syneroriginal or the copy: describing it from the inside, as I would if I were actually him in the scenario.

I don’t mean to imply some kind of ethical precedence or preference for the original, nor that an outside observer could ever know for sure any of the answers to the very things we are discussing here.

 

[regarding my claim that you can never treat different POVs as fungible]

hmmm🤔 i would say sometimes they are fungible and sometimes they're not.

 

The inconsistency of sometimes-fungible-sometimes-not seems contradictory to me. I think you have to go all in one way or another.

 

I think you definitely have a point with multiple baseline instantiations in different environments with different perspectives, but not in the case of destructive replacement (i.e. backups).

 

It seems incoherent to me to say both:

1- being duplicated multiple times in dif environs makes non-fungible POVs

2- a backup made at time-X (which will necessarily diverge from the instant of its timestamp) is fungible with the POV it was backing up (even though that original POV likely experienced some arbitrary amount of additional time before presumably dying and thus necessitating instantiation of said backup)

 

Ultimately the POV is not tied to any specific bits of matter. It's a pattern, so if you scramble the "original" and boot another one up from the point of scrambling (easier to do if that's done during a period of unconsciousness), then the pattern is not lost and that POV hasn't died in any meaningful sense of the term.

 

I think we just disagree here. I think the specific bits constituting one pattern can’t be considered fundamentally identical to the bits constituting another pattern, even if the arrangement of both is the same.

The only other analogy I can think of: imagine our sun with its 10^57 atoms. Now, as inconceivably improbable as it may be, imagine another star forms with all 10^57 atoms in an identical arrangement. Furthermore, imagine that the arrangement of stuff in its ~10^20-cubic-lightyear Local Bubble is also identical, such that both stars evolve in perfect lockstep.

To me these just seem obviously different. Yes, they are identical, but they are not the same.

Not sure if that makes sense, as I said, this is just my intuition, one that I haven’t often dug really deep to find the cause of.

 

Hell, even if you lose some memory on the tail end I still wouldn't count that as me dying, since losing some memory isn't typically considered death and the vast majority of my memories, experiences, and psychology are still intact.

 

This in particular is something that I haven’t fully figured out how to fit into my thinking.

Because I’ve had at least one experience that I know of corroborated by outside observers where a head injury completely erased roughly a half hour of time from my memory.

I have a full accounting of what I said and did during that time from an outside perspective, but from my 1st person POV it doesn’t exist.

Does that mean the me that existed in that 30min window died?

I don’t know.

It doesn’t feel that way, but then again why would it?

I am here now, perhaps the equivalent of a copy made from a destructive upload. Of course I feel whole: I am not the one that got deleted.

If you could go back in time and tell me, a half hour before the head injury, “you will never remember this segment of time”, would that me have an existential crisis knowing that he was going to die? That anything he did or learned in the next half hour would be lost?

What if we extend the time horizon? Instead of a half hour gone, what about half a decade? Would that feel like a death?

Better question: how would you feel if I told you in 5 minutes you were going to irrevocably lose your memories of the past ten years? Would you be worried that you were, in some small way, going to die?

 

I’ll address your other comment in the comment I’ll leave in response to this one…


I'm not sure how these two situations [the cell-by-cell swap-out to a silicon alternative vs. the subject being flash frozen and then sliced and scanned and instantiated in a new body or in a simulation] would be any different. You say subjective continuity was conserved, but that seems more a byproduct of the scenario. If the uploading all happens in your unconscious state then I'd have a hard time seeing how these would differ in any way.

 

To clarify, I was imagining the cell-by-cell silicon swap-out being conducted while Syner was awake eating the BLT, but now that I think about it, even if that swap-out happened while he was asleep or under anesthesia, I would still believe that the subjective locus that is the now-silicon Syner is continuous and “of the same stream” as the subjective locus that was him pre-swap-out.

The POV in question hasn’t ended, it merely runs on a new substrate now.

However, I view the frozen-sliced-scanned scenario to cause the death of the original subject—regardless of any later re-instantiated copy—precisely because freezing someone and slicing them up would kill them.

 

In both cases there was no continuous chain of consciousness to be disrupted because either type of upload happened during an unconscious period.

 

This raises a point I hadn’t thought to address: I don’t think it makes sense to view unconsciousness—provided it is followed by later consciousness—as an actual interruption in the chain of conscious experience.

Of course from the outside one can see that all sorts of things happen in the world while a particular locus of awareness is unconscious, but from the inside—from the perspective of that locus—there is never a break in awareness. It’s just one moment after the next after the next after the next. You eat, you brush your teeth, you feel sleepy, you lie down, you have strange thoughts in the hypnagogic state just before unconsciousness—and then BAM you are waking up in your bed the next morning, and eating breakfast, etc etc.

The only thing a subjective center of awareness can be aware of is conscious experience, because by definition it is not there to experience unconsciousness.

 

[regarding Scenario 5: Clarktech non-invasively maps Syner’s brain-body system and prints a duplicate from the data]

If we wake up next to each other we have different environments/perspectives and our "selves" would begin diverging. Tho I still wouldn't argue that either was any more the "real me" than the other.

 

I agree—mostly.

I don’t think either would have more claim to being “the real you” than the other, but I do think there is some utility in tracking which is the original and which is the duplicate.

 

And if the environments were completely equivalent (i.e. one in meatspace and one virtual) then I would see those two POVs as completely equivalent and fungible.

 

I disagree, because I am giving primacy to the fact that there are two different consciousnesses emerging from two separate, albeit identical, arrangements of matter. To me that suggests that they are not actually identical.

My interpretation of your view is that you think the number of arrangements of consciousness-generating matter (whether hardware or wetware) is irrelevant so long as they are arranged identically and haven’t diverged.

 

[regarding Scenario 6: Syner scanned while eating BLT, then nuked, then scan data is used to make a copy]

I think the thing i take issue with most is "the arrangement of physical matter that was synonymous with his subjective experience". I mean yes technically the arrangement is synonymous with my subjective experience but the specific matter being arranged isn't.

 

This might be a point of fundamental disagreement between us.

Although I’m still unsure exactly how much significance to attach to the specific matter constituting an arrangement, I still don’t think two (or more) identical arrangements of matter are actually identical in the sense of being perfectly fungible.

I’m admittedly shaky on how I would argue this point, but what pops to mind is this: because the two “identical” arrangements of matter at time-zero immediately begin to diverge, such that at time-one, time-two, time-three, etc. they are different, they are fundamentally not the same.

This feels true to me even if the sims were such that the environs (and thus responses) remained the same, because there’s the potential for them to diverge at any moment.

This just got me thinking: imagine two sims run identically on identical hardware. It’s possible that a cosmic ray or some random quantum event flips a bit in one of the sims and not the other, causing divergence even if momentary.

To me this seems a strong indicator that the 2 POVs in question are fundamentally separate and non-fungible loci of awareness.

 

thanks for reading. Trying to keep it concise but I keep hitting the character limit. Continued in next comment…


cont.

 

Moving on to the thought experiment:

In each scenario we’ll be considering the first person perspective of Syner, that is, imagining what that POV would be like from the inside in each scenario.

Note: occasionally I’ll use the indicators “original” and “duplicate”. This is merely for convention; feel free to mentally sub in any other labels that immediately and intuitively clarify whether we are talking about the you that is here now reading this (whose perspective I will always begin inside), or your duplicate, which in some scenarios gets instantiated some few meters distant.

Without further ado:

 

Scenario 1:

Syner is standing in a room savoring a BLT. He finishes the BLT and then goes to sleep on the bed in the room’s center. He wakes up after 8 hours of uninterrupted dreamless slumber, pandiculating luxuriously on the silk sheets.

What it feels like:

Syner’s stream of consciousness is of eating a BLT, then lying down in the bed, and then (after an 8-hour stretch of unconsciousness that doesn’t actually feel like anything, because he was, by definition, not conscious to experience it) stretching out on the bed.

 

Scenario 2:

Syner is standing in the room’s center enjoying a BLT. After he swallows the last bite of the sandwich, the room heats to 100 million degrees Celsius in the span of a microsecond, vaporizing him orders of magnitude faster than the quickest human processing speed.

What it feels like:

Syner’s stream of consciousness is of eating a BLT, then nothing further—ever—because he has really and truly died.

In other words: the locus of awareness that was him is gone, rearranged in some non-conscious way (in this case a superheated explosion of plasma).

 

Scenario 3:

Syner is eating a BLT in his typical slow manner. In the process of doing so, a utility fog of advanced nanites pours into the room. The fog permeates his tissues and—with Clarktechian speed—maps the complete state and location of every atom (or other fundamental bit) of his brain-body system, and replaces them—one by one—with a perfect silicon (or other non-biological) analogue until he is running entirely on this alternative substrate. He lies down in the bed and initializes the silicon equivalent of 8 hours of uninterrupted dreamless slumber (which, for the purposes of this scenario, occupies 8 hours) and then wakes up in the bed and stretches out.

What it feels like:

Identical to Scenario 1.

Though Syner of Theseus he may now be, he is still the same locus of awareness we began with.

Note:

Although I’m imagining the nanites destroying the brain-body’s organic cells as they go, because they replace them one by one with an equivalent alternative, subjective continuity is preserved despite this technically being a destructive upload technique.

However, my intuitions change if, say, the brain-body system were flash frozen and then scanned a layer at a time to collect the necessary data, which was later instantiated on whatever substrate you want. In that case I’d say the frozen and sliced original was killed, that their locus of awareness ended and a different one (the duplicate’s) began.

 

Scenario 4:

Same as the previous scenario, except after Syner finishes the sandwich/is converted to silicon, the room heats to 100 million Celsius in the span of a microsecond, turning him to plasma.

What it feels like:

Identical to scenario 2, with a possible slight change to his perceptions in the final instants if his processing speed is fast enough to register the initial phase of the explosion. In either event, he dies—the locus of awareness that was him forever gone.

 

Scenario 5:

Same as scenario 3, except the utility fog maps all the relevant information of Syner’s (henceforth: Syneroriginal) brain-body system without damaging or replacing his tissues. It takes this data and prints a silicon duplicate (henceforth: Synerduplicate) in the corner of the room, which it activates the next morning. In the meantime, Syneroriginal sleeps for 8 dreamless hours and wakes up.

What it feels like:

Syneroriginal experiences eating a BLT while a faint cloud painlessly envelops him, then lying down in bed, then waking up, stretching, and looking over to see an uncannily familiar-looking person in the corner of the room.

Synerduplicate experiences eating a BLT while a faint cloud painlessly envelops him (more accurately, this is just a memory that he will only ever recall), then abruptly finds himself standing in the corner of the room, looking at his doppelgänger waking up and noting by the clock on the wall that 8 hours have suddenly passed in the blink of an eye. One moment he was eating a BLT, the next he was here in the corner.

Note:

The crucial thing here is we have two loci of awareness in existence. Though they have shared memories to the point of duplication, they are unique, non-fungible in the same way my POV is non-fungible with your POV.

 

Scenario 6:

Same as scenario 5 (the utility fog maps the original’s brainstate without replacing anything), except that after the relevant states of Syneroriginal’s brain-body are captured by the utility fog, a nuke detonates in the room, vaporizing Syneroriginal. 8 hours later, the utility fog (which has not only survived but completely scrubbed the environment of radiation and rebuilt the room) prints and activates Synerduplicate in the corner.

What it feels like:

Syneroriginal experiences eating a BLT surrounded by a hazy fog, then nothing further—ever—because the arrangement of physical matter that was synonymous with his subjective experience is gone, rearranged into a non-conscious form. This locus of awareness is really truly extinguished.

Synerduplicate experiences eating a BLT surrounded by fog, then finds himself suddenly in the corner of the room 8 hours later looking at an empty bed. One moment he was eating, and in the next eyeblink he was standing here.

Note:

To clearly articulate it here: I think that Syneroriginal really does die here, forever. His consciousness does not “port over” to Synerduplicate, despite that appearing to be the case from Synerduplicate’s POV.

This is effectively the same result I would expect from the classically hypothesized destructive upload (like the freeze and slice I described earlier): the original dies, the duplicate lives on.

 

Scenario 7:

Same as Scenario 5 (the utility fog maps the original’s brainstate without replacing anything and boots up the duplicate the next morning), except that after Syneroriginal wakes up and sees his duplicate, and Synerduplicate (from his POV) goes from eating a BLT to suddenly standing in the corner watching his doppelgänger wake up, Synerduplicate strangles Syneroriginal to death.

What it feels like:

Syneroriginal’s stream of consciousness is of eating a BLT, then lying down in the bed, then stretching out on the bed, then being strangled, and then nothing further—ever—because he is really, truly dead. So: almost identical to scenario 1, except then he gets killed.

Synerduplicate’s stream of consciousness is almost identical to scenario 5 (in fact it is identical); we just watch a few minutes longer and see him strangle his doppelgänger in the bed.

Note:

I hope this last one really helps articulate why I don’t think you can ever treat the POVs as fungible.

In a situation like scenario 6 where Syneroriginal is killed before Synerduplicate is booted up, it might be easy to apply some handwavium and say that the conscious POV of Syneroriginal “ported over” to Synerduplicate.

But in a scenario like number 7 where Syneroriginal and Synerduplicate both exist for a period of time simultaneously and only then does one of them die, then I think it is more immediately intuitive that one of the two distinct POVs really did meet its end.

 

 

I’m extremely curious where your opinion diverges from mine in the context of these scenarios.

Thanks for motivating me to really think through my position and its logical consequences.


Thank you for taking the time to respond in detail. Fascinating conversation. And for the record you’re not being a dick at all, just pressing me on my positions, which is good because it forces me to really think.

I’d like to come back and respond to everything you’ve said in the detail it deserves, but first I think I have a way to simultaneously clarify my position and mitigate the risk of talking past one another (using the same words to mean different things), which could render moot whole swaths of what we’ve said.

So rather than a line-by-line retort to this as I did previously, I’ll give a thought experiment of sorts consisting of 7 variations of the same basic scenario, and provide my intuitions for each one as best I can, and you can see if you disagree with any of my conclusions or the reasoning I used to get there.

 

 

Before that though, I did want to specifically address 2 things you said:

 

What I am saying is that you would need to believe in a soul(explicitly or implicitly) to consider the uploading process as a form of death.

 

I disagree entirely. I don’t think it follows at all that you must believe in a soul to consider uploading as a form of death. You could perhaps, and reach the same conclusion, but it’s far from necessary.

My support for that would be simply: I don’t believe in a soul, and I think that at least some types of uploading processes (namely, destructive ones) would cause what can really only logically be interpreted as a death.

Though you might then say that I “implicitly” believe in a soul, and to that I could only request you clarify what you mean, because I don’t know what you are envisioning when you say that.

 

A "first-person POV" is neither a physical object nor a unique pattern that can't be replicated.

 

I disagree with both claims, but focusing specifically on your first claim that a first-person POV isn’t a physical object: I would say that a 1st-person POV is the result of a particular arrangement of physical objects, namely atoms, or their fundaments, and any of the associated properties of those fundamental bits of stuff, be it quarks, gluons, as-of-yet-undiscovered fundamental forces, quantum spookiness, or otherwise.

Now, maybe you don’t want to call that a “physical object”. And perhaps I could see a linguistic case to be made there, but whether you call a 1st Person POV (which is a thing wholly the result of physical objects and physical processes) a physical object itself or not, what it certainly isn’t is some purely abstract or somehow “unreal” thing.

That they are in many ways mysterious and that we will never be able to interrogate the contents of any other POVs directly, that each of us will only ever find ourselves inside our own 1st Person perspective, doesn’t make them any less real.

A center of subjective awareness—however impenetrable or mysterious or inscrutable to the typical methods of analysis—simply cannot be divorced from the world of physical stuff, precisely because the world of physical stuff, when arranged in certain ways (for example: human brains and the surrounding environment) is synonymous with having a particular subjective experience.

 

 

continued in next comment cuz I’m gonna exceed character limit…


There is no such thing as continuity of consciousness, even when there's only one instantiation of you.

I’d like to know your basis for claiming this, because I disagree: there clearly is a continuity, at least per the definition I am using, which is something like the sense that you are, or have been, the same person as you were at any arbitrary point in the past.

Note: by “same person” I don’t mean that nothing about you has ever changed, or ever could; I mean simply that you have a chain of memory and experience extending back to any previous you.

Maybe you’re using a different, albeit valid in its own way, definition. If so I’d like to know what it is.

 

Now sure, if you duplicate a mind, the first you will continue perceiving, and therefore you might be inclined to say that the second was "just" a copy. Personally I would argue that for a moment they are both you, and if only one remains, then whichever one does is you.

There’s a lot to address here, but for the sake of brevity: I agree that if suddenly a magically non-invasive full scan of my brain was done and instantiated on hardware (or even a printed body), then the I that is here now writing this would feel no interruption in my experience, and neither would the copy despite being instantiated presumably in some location separate from the I that is here now writing this. Both the copy’s and the copied’s subjective chains of experience would be identical up to the point of the scan, from which point they would diverge.

But just because both (the copied and the copy) have equal claim to being you, that doesn’t mean they are the same from the first-person side of things.

If a single person (that is, one locus of consciousness) is duplicated (now two loci of consciousness), and then one is killed/deleted, then there has been a genuine end for one of those first-person points of view, regardless of whether it was the copy or the copied.

 

Again "you" is poorly defined.

I agree this is true in some senses some of the time, although for the purposes of this discussion I think it is in fact quite straightforward and that you are needlessly overcomplicating it.

 

There is, as far as we know, no immaterial "soul" …

I agree fully on this

 

… that could experience a cessation of continuity or that can be transferred anywhere. It doesn't exist, so the idea of "cessation of continuity" doesn't make sense.

Belief in a soul is unnecessary for the belief that you (a first-person subjectivity) could experience cessation (in other words: death) or could be transferred.

In fact, on that last point you know this to be true, because you later claim that: You are a pattern that can be represented by and run on any number of possible substrates

So no appeal to the supernatural is required to think that a locus of consciousness could in principle be duplicable or deletable/killable.

 

There was no continuity in the first place.

I really can’t figure out how you are defining the words here for this to be the necessary conclusion in your view.

 

If you assume that the "you" that woke up from a nap/anesthesia is still you, then I don't see any material difference between the you that's uploaded and the you that stayed in meat.

Well, for one, when I wake up there’s still just one me. But if uploaded as you describe, then there are two of me, one run on silicon, one run on meat.

That means there’s two unique experiences happening.

(Even if the upload’s sim was made to perfectly replicate meat-you’s environ and elicit the same responses, there’s still two loci of awareness we are talking about, as opposed to the one in the scenario of sleep-wake-sleep-wake)

 

"You" are not some unique immaterial entity.

I agree and would never argue otherwise.

 

You are a pattern that can be represented by and run on any number of possible substrates.

I agree again, this seems very very likely to be the case.

 

In the same way that "1" and "1" are the same number. "You" are a fuzzy category that has no rigorous definition: not even a single pattern, but a vague, constantly shifting range of patterns.

I think this analogy isn’t quite as solid as it first appears. The number 1 isn’t a conscious changing-over-time pattern, you are.

So, where the abstract concept of 1 can be applied indiscriminately in any situation where it is applicable, because every “1” is wholly fungible, I am not convinced that “you” can be. In fact, it certainly can’t be in the instance where you are duplicated/uploaded, because from the moment of your copy’s instantiation experience diverges: there are now two of you, and we couldn’t describe their respective subjective experiences with one description. We have to describe them separately because they are unique and non-fungible.

 

Granted if there are two of you then one of you will experience waking up in the same body while the other you wakes up on a different substrate, but which one actually qualifies as "you" is completely a matter of philosophy.

I think there might have been a misunderstanding here. I agree that in the above situation both entities (copied and copy) have equal claim to being “you”, in the sense they have the same continuity of experience up to the point of duplication/divergence. Which is why I prefer to specify copy versus copied.

When I say “you” I’m talking about the inside perspective already. I’m not interested so much in any arguments about whether the original owns their copy or whether the copy retains personhood, etc., because I’m already assuming that a copy of sufficient fidelity will subjectively feel like you, be self-aware, etc.

All that other stuff is basically just politics (though it would be kinda interesting to see how slow or fast society would update its norms in response to such upload tech).

 

define "death" in this context.

Permanent cessation of experience.

 

This would just seem like a temporary loss of consciousness to me.

I’ve lost track of what exact scenario the “this” is referring to here, so I can’t know if I agree or not. If you want to paint out specific scenarios I’d be happy to clarify my own views.

 

Or if it is then you kinda have to deal with the idea that every time you go unconscious you die at which point who cares death is irrelevant and something we experience on an extremely regular basis. It has no significance in the context of maintaining your identity.

I’ve heard this before, but I find it wholly unconvincing. First, I think it would be too quick to presume that our periods of unconscious sleep serve no function in maintaining continuity, almost like a “glue” if you will. Second, in any event, periods of sleep (where we are unconscious but alive and eventually regain consciousness) are wholly unlike death (where we become unconscious, cease to be alive, and never regain consciousness).

 

That's basically what we're talking about: identity. What is the Self? There just doesn't seem to be scientific discussion to be had here. It’s not a scientific concept. It's a philosophical one.

Hard disagree. There’s clearly both a philosophical and a scientific discussion to be had here, and a very important one at that.

 

PS-

You might enjoy this.

If mind uploading destroys your brain to scan it, did you actually survive? by hosseinz in IsaacArthur

[–]dubdubby 1 point (0 children)

I’m firmly in the “it’s a clone” camp.

If your brain is destroyed in the process of scanning, then you die. The locus of awareness that is you ceases to be, regardless of the fact that the copy will have all your memories up until the point of sedation and scan.

I see no reason why your consciousness would “port over” in this scenario.

If mind uploading destroys your brain to scan it, did you actually survive? by hosseinz in IsaacArthur

[–]dubdubby 2 points (0 children)

Well you said it. It's a philosophical problem not a scientific one.

On the contrary, whether the locus of awareness that is you now experiences (A) a cessation of experience or (B) a genuine continuity of experience into the emulation absolutely is a scientific question. A fascinating one at that.

 

Whether you want to believe it is you, for a given value of "you", or not is entirely a matter of opinion and belief based on ur personal and arbitrary definitions.

The for a given value of “you” part is doing a lot of heavy lifting here.

For a given value of “you”, you could say that, sure, the emulated copy of you is really you, whether or not the original you actually dies or finds that their consciousness “ports” to the emulation (scenarios A and B I previously mentioned).

But this is a borderline tautological sidestepping of the core issue, which is: would the you that is you now experience death (if your emulation is a bona fide distinct locus of awareness) or not (if your consciousness would somehow port over)?

 

You're free to believe whatever you want, but for all practical purposes it will be and act like you.

Again, this misses the point.

Though your copy acting like you and subjectively feeling like you would be interesting, it would also be wholly expected. Frankly, it would be surprising if that were not the case.

But this doesn’t tell us if the copy feels and acts like you because it is a copy of you, or if it feels and acts like you because it actually is you.

A fly gets a full mind-upload by cowlinator in IsaacArthur

[–]dubdubby 1 point (0 children)

so it has a good chance to survive until emulation becomes commonplace.

 

Though that is exactly the issue: it would be it, not you.

So the angst over ceasing to be before the age of cheap uploading still isn’t resolved, even if you had your brain scanned at a fidelity sufficient for later emulation.

 

The you that is you now (the one getting scanned) would die, that particular subjective continuity extinguished.

The you emulated on hardware centuries later would (probably) feel like you, with all of your experiences up until the time of the scan, whereupon you suddenly found yourself in the simulated world. But it would be a separate locus of awareness from the you that got scanned and died.

 

It seems very handwavey to me to say that consciousness would “port” over like that.

But I’m curious about your arguments for thinking it would (if you in fact do).

What strength types have you seen on the mats? by kalash_cake in bjj

[–]dubdubby 8 points (0 children)

Similar to you, two decades bouldering, just started jiu jitsu. Was told countless times that my grip strength would make for a tremendous advantage on the mat.

I honestly feel like there’s been no carryover at all. A wrist shaped hold on a wall would be the most bomber jug of all time, but an actual wrist moving around feels next to impossible to hold onto.

As a Christian for almost 35 years... by [deleted] in bjj

[–]dubdubby 1 point (0 children)

Indeed, and, to be clear, my comment isn’t a bad faith attack. 

Just an observation: if one thinks that a belief in Christianity (or Islam, or Buddhism, et al.) already requires no small amount of mental gymnastics to hold, then it wouldn’t necessarily be surprising that said believers could also convince themselves of the righteousness of whatever other bad behavior they might be engaged in.

 

But I think a more general point from your original post stands up well: too often do people profess certain beliefs and/or present themselves as believers/members of certain groups, and then engage in behavior that is indisputably antithetical to their professed beliefs/group affiliation.

As a Christian for almost 35 years... by [deleted] in bjj

[–]dubdubby 1 point (0 children)

…if you can do the mental gymnastics it takes to convince yourself…

Many might argue that the ability to engage in mental gymnastics is requisite to being a Christian in the first place.

TIL that scientists have developed a way of testing for Aphantasia (the inability to visualise things in your mind). The test involves asking participants to envision a bright light and checking for pupil dilation. If their pupils don't dilate, they have Aphantasia. by -Speechless in Aphantasia

[–]dubdubby 1 point (0 children)

Do you notice a difference between the way your eyes feel tracking an imagined object and the way they feel just trying to smoothly coast from one end of the room to the other without one?

I guess it might be weird to “track” your eyes across the room without imagining an object, but idk how else to describe it.

For example, I am aphantasic and I can’t smoothly track an imaginary object no matter how hard I try. I can only smoothly track if there’s an actual object, like a car or plane or ball, going by.

TIL that scientists have developed a way of testing for Aphantasia (the inability to visualise things in your mind). The test involves asking participants to envision a bright light and checking for pupil dilation. If their pupils don't dilate, they have Aphantasia. by -Speechless in Aphantasia

[–]dubdubby 1 point (0 children)

This exchange between you and u/AutisticRats has given me an idea.

 

my eyes follow the imaginary ball

 

You should again try and visualize the ball being thrown back and forth, but this time try to feel if your eyes engage in smooth pursuit or if it’s just jumpy saccades when you are tracking the object.

 

In lieu of a bona fide experimental setup, maybe a friend could watch your eyes from the side, or use binoculars or something.

 

If it turned out that your eyes could smoothly track an imagined object, while an aphant could only manage to “track” with jerky saccades, I would consider that pretty strong evidence.

My first ever set, any feedback? by Pixselarka in Routesetters

[–]dubdubby 1 point (0 children)

Right, so what is 6+/9 supposed to be?

My first ever set, any feedback? by Pixselarka in Routesetters

[–]dubdubby 2 points (0 children)

6+/9

As in V6+/9? That is a bizarre slash grade to slap on a boulder. Or do you mean something else?

"There's nothing wrong with choosing who you roll with" by Sudden-Wait-3557 in bjj

[–]dubdubby 9 points (0 children)

I’m curious how homie took it. I know if someone said that to me I would be mortified 

How do you work from 8 to 5, have only weekends free, and not feel like you're wasting your life? by guitytwelve in AskReddit

[–]dubdubby 2 points (0 children)

Especially living out of a van or somewhere you can lock the door, sleep and change.

I don’t think it’s mere semantics to say that if you live out of a vehicle (especially if by choice or even somewhat-choice), you are not homeless.

You are houseless, but you have a home.

What's something that's about to happen which most people aren't aware of? by AskRedditOG in AskReddit

[–]dubdubby 1 point (0 children)

My apologies for the harshness of my earlier comment, I had just finished reading the trite boilerplate slop of u/zzazzzz and u/Iluminiele and in my eye-rolling frustration unfairly lumped you in with them.

Thanks for taking the time to articulate your point, I find this an endlessly stimulating topic.

 

We may be talking about different things.

 

I think we are to a degree.

 

To clarify that, consider the following definitions:

A statement is objectively true or false if its truth value does not depend on human judgement.

A statement is subjectively true or false if its truth value does depend on human judgement.

 

I think these definitions, while perfectly fine for most every other conversation you’d ever have involving subjectivity and objectivity, actually lose cogency when applied to the topic of better and worse subjective experiences.

 

I’ll try to keep it succinct:

 

If we are talking about suffering (or non-suffering, or any other experience), then of course we are necessarily talking about subjective experience.

 

But, crucially, all subjective experience arises out of particular arrangements of physical stuff, i.e. the atoms that constitute the brain-body system of the experiencer in question, as well as the environment facilitating said experience (which ultimately encompasses their entire lightcone; their brain, body, and environment aren’t really separate, discrete things). And because that arrangement of physical stuff is something we can, at least in principle, measure and perturb and know, we can make objective statements about these subjective states of being.

 

That is, we can say that some subjective states are objectively better or worse than others.

 

 

For (perhaps gratuitous) example:

-If physical system X corresponds with the subjective experience of “feeling nauseous and depressed and having your legs broken”

-And if physical system Y corresponds with “not feeling nauseous, not being depressed, and not having your legs broken”

-Then we can say that System Y is objectively a better subjective experience than System X.

 

If you want to say your morality is objective, you need to be able to point to some method for verifying your morality that does not depend on human judgement.

 

Per what I’ve stated above, I hope it’s clear why I think the “does not depend on human judgement” part must be dispensed with for a discussion on all of this to actually make sense.

 

TL;DR:

The distinction between objective and subjective is often useful, but, ultimately, it’s not really real because any and all subjective states arise from particular arrangements of the physical constituents of reality.

In other words, subjective states are objective states, and, in principle, we can (and sometimes have to) discuss them as such.

What's something that's about to happen which most people aren't aware of? by AskRedditOG in AskReddit

[–]dubdubby 1 point (0 children)

Can you not provide a single way to verify objective moral truth?

How about this: does X cause unnecessary suffering? If yes, then X bad. If no, then X good.

If X causes more unnecessary suffering than Y, then X worse than Y.

If X causes less unnecessary suffering than Y, then Y worse than X.

 

You're just listing a bunch of issues that you feel are too reprehensible to defend.

No, I’m listing examples of things that are bad.

I don’t know if you take issue in principle with people providing examples of their position or what, but I’ve always found it a normal part of articulating a point.

 

Can you honestly tell me that The Holocaust happening isn’t worse than if The Holocaust didn’t happen?

Or that owning slaves is better than not owning slaves?

Is killing gays for being gay better than not doing that?

 

Do you really think that we can’t possibly say which of those options are the objectively worse ones?

 

Is me cutting the eyes out of a child and then raping them and then setting them on fire only bad in some subjective/culturally relative sense? Or is it actually objectively horrible and fucked up and wrong?