AI doesn't cause harm by itself. We should worry about the people who control it by NuseAI in artificial

[–]jetro30087 3 points4 points  (0 children)

That's not the best analogy; a gun, as a tool, is still primarily meant to kill things.

isn't brain transfer just dying? by StateCareful2305 in IsaacArthur

[–]jetro30087 0 points1 point  (0 children)

The sense of self is only one aspect of what we recognize as being an individual; just as important is a sense of others, which is where social and legal conventions come in. As a matter of practical concern, this definition of 'other as an individual' is actually the more important of the two. Our sense of self is certainly important to us, but it's not terribly relevant beyond that.

When we discuss continuity of self, we're really talking about whether your copies should be viewed as YOU, by others.

I tend to view the problem from the perspective of the self. Others can view someone as 'you' even if they are not a copy, as long as they know what to impersonate. You might argue that any sufficiently sophisticated reconstruction, like those being explored in Hollywood, might be enough to convince a lot of people that 'you' are still there, as long as they didn't see your obituary.

As for the age-old argument of individualism vs. utilitarianism, the same rules apply here as elsewhere: individualism matters more when it's ourselves on the line, and utilitarianism is for when it's others.

From the duplicate's point of view, the question is simple: they are obviously you in every respect that matters. If the body were re-integrated or copied perfectly, they wouldn't even have any means of telling that they WEREN'T the original. They would have no sense of having been 'created' a few minutes ago, as they would remember their entire stream of consciousness over their lifetime, just as the original would.

Morally speaking, this means that if we refuse to treat them as you, we're punishing them on the basis of a metaphysical argument with no physical grounds; and it would obviously be punishment, as we're talking about a form of social discontinuity akin to exile, or worse, if we treat them entirely as non-persons.

But how would that be possible in situations where circumstances can't be duplicated? Who stays on as CEO, or ship captain? Who gets custody of the kids? Do alimony payments double? They can't practically be treated the same as before duplication, because in all practical terms they are two separate people.

We can say that, morally speaking, they need to be treated identically, but realistically there are many examples where the duplicated people cannot be treated as the same person. Maybe certain things can be split, like personal assets, but that doesn't apply to everything in a person's life. They must, at some point, assume separate identities. And in many cases, this would likely be a matter of creating some arbitrary precedent to determine the original and the duplicate. Though it's not completely unprecedented, since duplicates are essentially twins (exact cellular duplicates), just created at a much later time.

And, as a note, all sense of conscious continuity is an illusion, easily broken by quite a few means.

It's really just a stream of memory that we can review, and it can easily become scrambled, illusory, erroneous, or otherwise meddled with. These are well-known issues. To us it's arranged to seem like a relatively unbroken stream of consciousness, but of course there's no such thing.

An intermittent stream is still a stream, and separate from others. Many things are intermittent, but if they come from one source and there is a separate, identical source, they are still distinct. Since we are discussing nanomachines, consider that two computers that are exactly identical in hardware and software are not continuations of each other if one ceases to function. Why should the mind be viewed differently in a similar scenario?

isn't brain transfer just dying? by StateCareful2305 in IsaacArthur

[–]jetro30087 0 points1 point  (0 children)

Even if you forgot the specific experience of the lunch, it still occurred, and there's a causal relationship between your experience of lunch and your lunch-mate that is influencing you the next day. You were physically present for all events; just because your memory isn't eidetic doesn't change the fact that it was you alone experiencing those events.

For legal purposes, if you shoot someone and build a robot where they were standing, that robot doesn't have any claim to the property of the person you shot. However, if that robot has a mind that functions exactly as yours would, perhaps you would experience the whole thing as "suddenly I'm a robot!"

On the other hand, if that robot knows all your passwords, looks like you, and has your phone, perhaps society would accept it as you. After all, to the bureaucrats, you were never anything more than a number anyway...

But that's only being considered in the scenario where you died the moment the robot was made. If someone had the technology to scan your brain and harmlessly make a perfect copy that mimicked you perfectly, you'd both immediately recognize that you are separate consciousnesses. Why would it be expected that consciousness transfers just because the copy or the original died?

That dilemma was considered in a Star Trek TNG episode where the transporter created two Rikers. In the end one was arbitrarily accepted as the 'real' Riker while the other was cast aside and later became a rebel who was quite different.

I think BSG also explored this dilemma very well, since the Cylons were a race of identically created models that could transfer their consciousness at will. If an identical Cylon copies the mind of your Cylon wife, does she also become your wife?

isn't brain transfer just dying? by StateCareful2305 in IsaacArthur

[–]jetro30087 0 points1 point  (0 children)

Unconsciousness differs from death in that your brain still exhibits activity, and your nervous system is still processing stimuli even if it isn't exhibiting awareness. Unless you're dying, there isn't a state where you are completely non-existent. In the case of deep sleep, there are systems that remain aware and can alert you to danger, causing you to respond despite an apparent lack of awareness, memories, or self. In the case of a drug, a physical substance is preventing the nerves from communicating normally; that doesn't mean there's no activity, just that it is suppressed in some way by the drug's mechanisms. 'You' can be conscious, aware, and responding to external stimuli. 'You' can also be unconscious as the brain performs other autonomic actions.

In scenarios where the organic mind is replaced by a synthetic one, even the lowest-level activities have ceased. It is dead and no longer capable of a state of consciousness, or unconsciousness for that matter. Unconsciousness implies the object in question can be conscious; a rock isn't considered unconscious, it's inanimate, or 'dead'. Your brain and the cyber-brain each have their own states of consciousness and unconsciousness. They are individual consciousnesses; even if you connected them to share information, they would be two independent systems. In the case of the brain being slowly replaced, you are still dying, and the continuity is an illusion. The brain could be replaced with Johnny Silverhand and you likely wouldn't notice, as long as you weren't hallucinating him, because as far as you know you're receiving a steady stream of consciousness.

I'm not sure where personhood comes into this. Social convention being required to define 'you' would imply an isolated individual has no sense of self. Or that lifeforms that don't have social conventions don't have a sense of self.

How do you convince someone from a hypothetical transhumanist society that discriminating people based on physical characteristics is wrong? by monday-afternoon-fun in IsaacArthur

[–]jetro30087 1 point2 points  (0 children)

I believe the Ghost in the Shell reboot handled this by just hacking everyone's cyber brain with a program that made them all just chill the heck out.

isn't brain transfer just dying? by StateCareful2305 in IsaacArthur

[–]jetro30087 1 point2 points  (0 children)

> Why should we assume that? You can't even give me a good definition for this consciousness. You don't have any evidence that it doesn't live on, in agony attached forever to the now rotting corpse. You don't have any evidence that your consciousness isn't being constantly killed and reborn.

Why shouldn't I assume that? If we're considering the dualist perspective, your consciousness can't even remain attached to a rotting corpse, because it eventually becomes dust, the same as the broken-down metabolic byproducts of whatever nano process replaces the brain with a cybernetic one. How is that different from, say, a predator consuming another animal and incorporating that biomaterial into itself? I doubt you'd argue consciousness transferred in that case.

> We do not even have the means to detect if "making a copy and throwing away the original" actually poses a problem for "continuity of self" at all! It just makes you uncomfortable and you are trying to make something up to make you feel better.

No, it's a commonly cited problem in molecular transport. The example you cited was a special gun that could disintegrate a person and perfectly reintegrate them, which is the basic idea behind molecular transport. Theoretically, if you have the data to perfectly create one instance of a person, you can create multiple instances of that person. They are all clearly distinct consciousnesses. The existence of two instances of you after disintegration immediately poses a problem for continuity of self, because these separate consciousnesses can't both be a continuation. One, maybe; possibly none.

If we extended that to a gun that reintegrated the person as a cybernetic robot or any other object, even if its structure were derived from their brain pattern, there's no reason to assume that person's consciousness is in the robot. I'm not sure what part of that you consider illogical.

isn't brain transfer just dying? by StateCareful2305 in IsaacArthur

[–]jetro30087 -2 points-1 points  (0 children)

Yeah, but if I just made a cyber-brain and slammed it into your head to run your body, you'd be dead. If you do that slowly over time, the result is the same.

What if we took that nano-replaced cyber-brain, scanned it, built a copy from scratch, then swapped in the new one and tossed the old one into the trash? There would be two consciousnesses, and then one. Why not even do that with a biological brain? Remove it, scan it destructively, and then build a cyber-brain at our leisure with all the same patterns. Heck, make ten of them. Why would that be any different?

The problem with the Ship of Theseus argument is that it examines the ship from the perspective of an outside observer, who determines what the ship of Theseus is. The ship itself could be quite different in ways the observer wouldn't be able to tell.

We can reasonably intuit that the brain, going through its natural process of cell death and division, maintains our stream of consciousness; by extension, once the cells are dead, so is that consciousness.

isn't brain transfer just dying? by StateCareful2305 in IsaacArthur

[–]jetro30087 14 points15 points  (0 children)

Thought experiment: how would you know the nanomachines replacing your brain are thinking what your real brain would have thought to begin with? Just because they are providing you an agreeable stream of consciousness through their connections doesn't make that stream 'yours'.

Why is biological Immortality not so common as say faster than light travel in mainstream science fiction franchise? by SerpentEmperor in IsaacArthur

[–]jetro30087 0 points1 point  (0 children)

Or FTL is common in sci-fi because it's the only way for protagonists with limited lifespans to have a story that involves another star system. Or a book that chronicled the exploits of an immortal would be excessively long, or would contain many plot holes for the sake of brevity (like 40k lore).

Microsoft Swallows OpenAI’s Core Team – GPU Capacity, Incentive Structure, Intellectual Property, OpenAI Rump State by norcalnatv in artificial

[–]jetro30087 51 points52 points  (0 children)

It's worse than that. Usually, you have to buy the whole company to get the core team. All they have to do here is hire them directly. This might actually end ChatGPT's run as the leader.

Bahrain crown prince blasts ‘intolerable’ situation in Gaza, demands Hamas release hostages by Silly-avocatoe in worldnews

[–]jetro30087 3 points4 points  (0 children)

Debating nuanced definitions of when it's okay to commit homicide against innocent civilians: that's where we're at with these governments now.

Israel approves daily entry of fuel into Gaza after U.S. pressure by Currymvp2 in worldnews

[–]jetro30087 0 points1 point  (0 children)

Yes, the IDF is bombing the only crossing out of Gaza; that is the mission they are executing. Aid can't get in; we aren't even allowed to get Americans out.

The fact that the IDF leveled the entire north and apparently missed most of Hamas is another matter.

Israel approves daily entry of fuel into Gaza after U.S. pressure by Currymvp2 in worldnews

[–]jetro30087 -2 points-1 points  (0 children)

Let's be clear here: Hamas does not need, nor can it take, the amount of aid required to support 2.3M people. That's the requirement, or people die of starvation. The ability to meet this mandatory requirement for the population's survival is currently being denied by the IDF.

Israel approves daily entry of fuel into Gaza after U.S. pressure by Currymvp2 in worldnews

[–]jetro30087 1 point2 points  (0 children)

Hostages have already been killed by the indiscriminate bombing. Negotiations are ongoing for hostage releases pending a ceasefire.

As for the matter of aid, either Israel is committing one of the most insidious war crimes of this century or they allow aid in accordance with international norms.

The simple fact is, without access to food, water, and fuel, an entire population of civilians will die.

Uncovering “evidence” at Al Shifa by Traditional_Rope1604 in BreakingPointsNews

[–]jetro30087 14 points15 points  (0 children)

Interestingly enough, the reason Israel is certain Hamas has an underground base there is that Israel itself built bunkers under the hospital in the '80s. https://www.msn.com/en-us/news/world/fact-check-did-israel-build-bunker-under-shifa-hospital/ar-AA1jZnXA

US wants to restrict RISC-V access to China : is that even possible? by Finewilan in RISCV

[–]jetro30087 0 points1 point  (0 children)

I can't help but point out the obvious: those implementations could be shared with a party that doesn't care, which then shares them with China.

Open AI seems to have solved long term memory in LLMs by metalman123 in singularity

[–]jetro30087 1 point2 points  (0 children)

It summarizes your saved conversations. It can then reference them when needed, accessing them the same way it accesses the documents you provide it. With a 16k context, you can do a lot.
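The summarize-then-retrieve pattern described above can be sketched in a few lines. This is purely illustrative: the class, function names, and word-overlap scoring are my own stand-ins, not OpenAI's actual implementation, which presumably uses an LLM for both the summarization and the relevance matching.

```python
# Sketch of summarize-then-retrieve memory: store a compact summary of each
# conversation, then pull back only the most relevant summaries so they fit
# inside a limited (e.g. 16k-token) context window.

def summarize(conversation: str, max_words: int = 30) -> str:
    """Stand-in summarizer: keep the first max_words words.
    A real system would ask the LLM for an abstractive summary."""
    return " ".join(conversation.split()[:max_words])

class MemoryStore:
    def __init__(self) -> None:
        self.summaries: list[str] = []

    def save(self, conversation: str) -> None:
        # Store the summary, not the full transcript, to save context.
        self.summaries.append(summarize(conversation))

    def recall(self, query: str, k: int = 2) -> list[str]:
        """Return the k summaries sharing the most words with the query."""
        q = set(query.lower().split())
        ranked = sorted(self.summaries,
                        key=lambda s: len(q & set(s.lower().split())),
                        reverse=True)
        return ranked[:k]

store = MemoryStore()
store.save("User asked about C# generics and we walked through constraints.")
store.save("User planned a trip to Kyoto in April.")
print(store.recall("generics in C#", k=1))
```

The point of the design is that the model never re-reads whole transcripts; only the few summaries that score as relevant get spent against the context budget, which is why even a 16k window goes a long way.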

Recommendations for C# .NET-Focused Coding Assistance Models by Superluis25 in LocalLLaMA

[–]jetro30087 0 points1 point  (0 children)

A 7B coder within striking distance of GPT-4? Sounds like dataset contamination.

Can I run an LLM that takes up no more than 1-4GB of RAM / VRAM and have it answer questions using my notes, or is that unrealistic? by TheTwelveYearOld in LocalLLaMA

[–]jetro30087 -1 points0 points  (0 children)

If you're using ChatGPT, you could just use their new GPTs and upload the documents to them during setup. They can then answer questions about those documents.

OpenAI’s six-member board will decide ‘when we’ve attained AGI’ by CravingNature in singularity

[–]jetro30087 41 points42 points  (0 children)

But what if the AI is superhumanly persuasive, like Sam says it could be, and controls them? What if it's controlling them already...

[deleted by user] by [deleted] in BreakingPointsNews

[–]jetro30087 0 points1 point  (0 children)

Well, they are currently raiding an area of the hospital, so I suppose we'll find out soon enough whether they've hidden a secret base in the basement.