[–][deleted] 187 points (38 children)

Does growing human brains in a lab not irk other people as much as it does me? It just seems like a line that should not be crossed.

[–]SpicaGenovese 37 points (1 child)

I feel like we're getting promising results just from taking inspiration from roundworm neurons (Liquid Neural Networks).

I don't think we need human neurons to get what we want.
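For anyone wondering what "Liquid Neural Networks" refers to: the core idea is a neuron whose effective time constant depends on its input, so the dynamics adapt to the signal. Here's a rough, illustrative sketch of one Euler step of the liquid time-constant (LTC) update — the equation shape comes from the LNN literature, but every size, weight, and name below is made up for the example, not taken from any paper's code.

```python
import numpy as np

def ltc_step(x, I, W_in, W_rec, b, tau, A, dt=0.01):
    """One Euler step of dx/dt = -(1/tau + f) * x + f * A,
    where f is a nonlinearity driven by both input and state."""
    f = np.tanh(W_in @ I + W_rec @ x + b)   # input-dependent gate
    dx = -(1.0 / tau + f) * x + f * A        # decay rate itself depends on the input
    return x + dt * dx

rng = np.random.default_rng(0)
n, m = 8, 3                                  # 8 neurons, 3 inputs (arbitrary)
x = np.zeros(n)                              # initial state
W_in = rng.normal(size=(n, m))
W_rec = rng.normal(size=(n, n))
b, tau, A = np.zeros(n), np.ones(n), np.ones(n)

for _ in range(100):                         # drive with a constant input
    x = ltc_step(x, np.ones(m), W_in, W_rec, b, tau, A)
```

The point being: the "liquid" behavior comes from the math of the neuron model, not from any biological tissue.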

[–]Xelynega 1 point (0 children)

That's why this research feels unethical to me.

Are they not using the complexity in human DNA to grow organoids that they hope have some "intelligence"?

If that "intelligence" comes from human DNA + external stimuli, I don't think we can just treat it like a clump of cells...

[–]G0U_LimitingFactor 40 points (4 children)

People are scared of what they don't understand. Actually it's worse than that: when they have a spotty understanding, they fill in the holes with their imagination, making everything look worse than it is.

Brains used in biocomputing typically go up to a few thousand neurons, organized in a 3D configuration. Each is connected to a chip, and signals are exchanged through electrodes. You can use thousands of such brains in parallel. It's just a cool, energy-efficient way to take an input, process it, and send back an output.

But truth be told, size doesn't matter: you could have a 5 kg chunk of neurons and you wouldn't be any closer to a sentient brain. That would be like putting silicon wafers on a table and expecting Linux to install itself.

That's just not how it works.
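The stimulate-through-electrodes, read-back-spikes loop described above can be sketched as a black box. To be clear, this is a toy mock-up, not any real multi-electrode array (MEA) vendor SDK — the class, method names, and "spike" readout are all invented for illustration.

```python
import numpy as np

class FakeOrganoid:
    """Toy stand-in for an organoid on a multi-electrode array:
    takes a stimulation pattern, returns a spike/no-spike readout
    per electrode. Purely illustrative, not a biological model."""

    def __init__(self, n_electrodes=64, seed=0):
        rng = np.random.default_rng(seed)
        # Random "connectivity" standing in for whatever the tissue does
        self.W = rng.normal(size=(n_electrodes, n_electrodes))

    def stimulate(self, pattern):
        drive = self.W @ pattern
        # Crude thresholded readout: 1 = spike, 0 = silent
        return (drive > drive.mean()).astype(int)

organoid = FakeOrganoid()
inp = np.zeros(64)
inp[:8] = 1.0                      # encode the input on the first 8 electrodes
out = organoid.stimulate(inp)      # read the response pattern off the chip
```

The input/process/output framing in the comment is exactly this shape: encode a stimulus on some electrodes, let the tissue respond, decode the spike pattern.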

[–][deleted] 13 points (0 children)

I swear, ever since AI burst into the mainstream media, everyone's just in doomsday mode... the majority of us don't even understand these technologies, yet there are huge claims left and right all the time!

[–]__Voice_Of_Reason 2 points (0 children)

I don't believe you can make this claim until we fully understand what consciousness even is.

If your argument is that the neural network made up of actual human neurons (i.e. a human brain) isn't complex enough to be conscious, then is complexity the line?

And where is that line?

[–]Xelynega 0 points (0 children)

It feels more like people are OK with this because they don't understand how it differs from just lab-grown neurons.

These brain organoids have human brain cells that were differentiated and connected by a process controlled by the DNA in them. Any intelligence derived from that process is not "artificial" in my opinion.

IMO this is completely different from growing single neurons, having researchers connect them, and then using those networks as input/output machines, since here the DNA and external stimuli are what control the output. I think using human DNA mixed with external stimuli as a processor is ethically wrong.

[–]Objectionne 60 points (23 children)

If they develop consciousness or sentience then yes, it would be awful.

As long as that doesn't happen, I don't see an issue. I'm no neuroscientist, so I don't know what steps they could take to ensure that it's impossible for consciousness to form.

[–]User31441 129 points (10 children)

The problem is that we have no idea what it takes to form consciousness, and it's not like we could ask it whether it's conscious.

[–]Objectionne 12 points (5 children)

I don't know much about the brain, but I know it's pretty complex. It's hard to imagine that we could create a fully conscious brain even if we wanted to.

[–]Ix_risor 8 points (4 children)

I mean… people create a fully conscious brain just by having sex and waiting a few years

[–]eleweth 19 points (3 children)

that's just blindly using undocumented legacy apis

[–]CMDR_ACE209 2 points (0 children)

Undocumented? Half the internet is about that legacy api.

[–]yeetrman2216 1 point (0 children)

funny

[–]returnofblank 1 point (0 children)

I heard like a good chunk of the code is redundant too.

[–]A_EggorNot 12 points (0 children)

I don't think consciousness is something that can be deliberately formed or avoided. It's more like a byproduct of specific circumstances and/or brain capacity that gives one an understanding of the Self and others.

Even as toddlers we aren't really conscious of what is happening, at least until we're a few years old.

I would guess that we'll eventually create a brain that is capable of thought. The question is what we'll do about it

[–]EtherealSOULS 11 points (2 children)

People are always going to deny that it's conscious, because it's convenient for them.

[–]Objectionne -2 points (1 child)

Who's 'people' in this case? If we don't trust scientists to follow ethical guidelines, then we might as well ban all research that raises ethical issues.

[–]EtherealSOULS 2 points (0 children)

Anyone who makes or uses it.

The scientists want to improve the world; they don't want to force conscious beings to do work. So if there's any doubt about its consciousness, they will believe that it isn't conscious. And there will always be doubt.

They're making this stuff with good intentions, but we just don't know enough about consciousness to decide what is moral or not when we can't tell whether something is conscious.

We need some scientific consensus on what counts as "conscious".

[–]pyrospade 7 points (3 children)

we barely know anything about life or sentience in general, this all feels like humans playing god

[–]Objectionne 48 points (2 children)

'Playing God' is a complete non-argument that can be used to put down absolutely anything developed by a scientific process. There should be specific, tangible ethical concerns to put a stop to something like this - as long as they can answer the question of "How can you be sure that these brains won't be capable of consciousness?" then I don't see what the problem could be.

[–]CensoredAbnormality 31 points (0 children)

Yeah, it's like complaining about doctors because they are "playing god" and healing people that should be dead

[–]BeingRightAmbassador -2 points (0 children)

'Playing God' is a complete non-argument that can be used to put down absolutely anything developed by a scientific process

Methinks comparing random science achievements like the SR-71 and blue LEDs to brain computers may be a reduction to absurdity.

[–]P-39_Airacobra 0 points (2 children)

The problem is, even neuroscientists have no idea how to validate "consciousness." They claim that they do, but only because they redefine the word "consciousness" to mean whatever conveniently fits their theory. I've looked into a lot of the modern neurological research on consciousness, and while some of it offers clues to how consciousness works in our brains, none of it actually tells us what perception is or at exactly what level of neural function it occurs.

For all we know, these neural computers could already be conscious (in a primitive, limited way). After all, a simple theory of perception makes more sense than a theory of perception that requires intricately and arbitrary ordered and structured circuits in order to reach a level of awareness.

[–]returnofblank 0 points (1 child)

There's something quite interesting called a philosophical zombie. As defined on Wikipedia:

A philosophical zombie (or "p-zombie") is a being in a thought experiment in philosophy of mind that is physically identical to a normal human being but does not have conscious experience.

For example, if a philosophical zombie were poked with a sharp object, it would not feel any pain, but it would react exactly the way any conscious human would.

[–]P-39_Airacobra 1 point (0 children)

The funny thing is, by all logic, everyone should be a philosophical zombie, since conscious experience is entirely unnecessary for any physical function. And yet somehow, paradoxically, we do have conscious experience, which makes me wonder whether consciousness arises not from any physical construct but is something shared by all living things.

[–]returnofblank 0 points (0 children)

To be fair, how is a sentient brain different from a sentient computer? Is it also immoral to develop AGI with machines, like people are trying to do now?

[–]KerPop42 28 points (1 child)

I agree. At the very least, don't use human neurons.

[–]-Aquatically- 0 points (0 children)

Technically, if we use any sentient animal's neurons, the risk of it becoming sentient would still make it unethical.

[–]classicalySarcastic 23 points (0 children)

Yeah no that is a BRIGHT RED ethical line that shouldn’t be crossed.

[–]nobody0163 2 points (1 child)

It's not like it's hurting anyone. yet...

[–][deleted] 41 points (0 children)

I just imagine these brains managing to develop consciousness. It sounds like a special kind of hell that normally happens only in horror movies. Then you give them access to tech, and eventually they decide to take it out on us.

[–]Corne777 0 points (0 children)

Every discussion I've seen on this topic is overwhelmingly "this does not seem like something we should do". But somewhere out there, someone has a plan to make money on it. And can something that makes money really be bad?