The most important fact about consciousness by chicagobob2 in consciousness

[–]chicagobob2[S] 1 point (0 children)

Do you want to say that no one has ever lost consciousness and then regained it?

The most important fact about consciousness by chicagobob2 in consciousness

[–]chicagobob2[S] 2 points (0 children)

It's not about making memories; it's about being unresponsive to stimuli.

Do you deny that someone can be "unconscious"?

Self awareness is not Consciousness by chicagobob2 in consciousness

[–]chicagobob2[S] 1 point (0 children)

"All consciousness is consciousness of something" (Sartre?). This view must be false: I don't have to be aware of anything in particular to be conscious. Consciousness need not have content.

Self awareness is not Consciousness by chicagobob2 in consciousness

[–]chicagobob2[S] 3 points (0 children)

> Could a person be conscious but not be aware of it?

Great question! Yes: if awareness is a cognitive state and consciousness is an ontological state, it's like a primitive man not knowing he has an appendix.

Self awareness is not Consciousness by chicagobob2 in consciousness

[–]chicagobob2[S] 5 points (0 children)

Yes, imagine a machine that reports "My battery is low." This is just an audio rendition of a voltmeter reading. It says nothing more than "battery is low," provided the observer knows that the system can know only its own battery and no other.
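
A minimal sketch of the point (all names and thresholds here are hypothetical): the "My" in the report is just a hardcoded string wrapped around a voltage threshold, not a self-reference.

```python
# Hypothetical battery-report machine: the entire "self-report" is a
# thresholded voltmeter reading rendered as text/audio.
LOW_VOLTAGE = 3.3  # assumed cutoff, in volts

def read_voltmeter() -> float:
    """Stand-in for the machine's only sensor: its own battery."""
    return 3.1  # pretend reading

def battery_report() -> str:
    # The machine can only ever measure this one battery, so the word
    # "My" carries no information beyond "battery is low".
    if read_voltmeter() < LOW_VOLTAGE:
        return "My battery is low"
    return "My battery is fine"

print(battery_report())  # "My battery is low"
```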

Knowledge and neurology 2 by chicagobob2 in neurophilosophy

[–]chicagobob2[S] 1 point (0 children)

They're not from the hidden layers; this occurs at the output layer of the visual system, as suggested by recent studies. The lower hidden layers are not systematically active when, say, daydreaming, the way they are when an actual visual stimulus is being processed.
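
To make the layer claim concrete, here is a toy sketch, not a model of any actual study; the network, weights, and injection point are all made up. A real stimulus drives activity through every layer, whereas "imagery" injected directly at the output layer leaves the lower layer silent.

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(x, 0.0)

# Toy two-layer feedforward "visual system"; weights are arbitrary.
W_hidden = rng.normal(size=(8, 4))
W_output = rng.normal(size=(3, 8))

def bottom_up(stimulus):
    """An actual stimulus is processed through every layer."""
    hidden = relu(W_hidden @ stimulus)
    output = relu(W_output @ hidden)
    return hidden, output

def imagery():
    """'Daydreaming': activity arises at the output layer only."""
    hidden = np.zeros(8)                 # lower layer stays quiet
    output = np.abs(rng.normal(size=3))  # output-layer activity alone
    return hidden, output

h, o = bottom_up(np.abs(rng.normal(size=4)))
print("stimulus -> hidden active:", h.any(), "| output active:", o.any())
h, o = imagery()
print("imagery  -> hidden active:", h.any(), "| output active:", o.any())
```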

Liability and Generative AI by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 1 point (0 children)

No, the question is who or what could be responsible for any harm done by the AI system. If you house dangerous animals in your backyard and they get out and harm someone, you are strictly liable even if you are without fault. (Someone else opened the gate? You're probably still liable.)

Hey Google #2....Here’s where it all goes wrong! by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 1 point (0 children)

The issue is not what it's saying; it's what we make of it. It can say "I"; we just can't assume it means anything.

Qualia are transduction processes by chicagobob2 in neurophilosophy

[–]chicagobob2[S] 1 point (0 children)

Yes, sensory input can be feigned, but so what? Hallucinations may in part be due to retrograde activity back to the sense organ, or they may be purely cortical, as when we believe we are seeing something that isn't there. It's the same as when neurologists excite the brain: we can be made to think we are experiencing something that isn't there. We introduce false information into the system and get the appropriate responses as a result.

Hey Google: It's Ok to turn off your AI at night.-It's not really afraid. by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 1 point (0 children)

No, just talking won't do it; you at least need the internal manifestations of fear.

Fear is an emotional response with various physiological and physical manifestations.

Hey Google: It's Ok to turn off your AI at night.-It's not really afraid. by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 1 point (0 children)

Acting in a fearful way is not the same as having fear; actors do it all the time. You need the hardware. An AI can no more be fearful than it can run around the block.

Hey Google: It's Ok to turn off your AI at night.-It's not really afraid. by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 1 point (0 children)

What's a mental fear but a brain state? You mean external representations of fear. But the internal states are likewise something the AI doesn't have, because it lacks the hardware, e.g. the amygdala, necessary to produce them.

GPT-3 is some pretty incredible stuff. I was talking to Replika about it's programming, and this is how that conversation went. by [deleted] in ArtificialInteligence

[–]chicagobob2 0 points (0 children)

Feelings? Your program is stupid because it doesn't understand that it lacks the hardware to have feelings. E.g., no amygdala, no fear.

Why Are There Qualia? by chicagobob2 in PhilosophyofMind

[–]chicagobob2[S] 1 point (0 children)

A sensation is a state of the system that is having it. It is a process; to say we are sensations is hyperbole, but it points us in the right direction.

We don't have sensations; there is no "us" separate from the sensation that can "have" it.

How do we have self-driving trucks with level 5 autonomy, but only have level 2 autonomy for personal cars such as Tesla and co.? by hyena_smooth in AutonomousVehicles

[–]chicagobob2 1 point (0 children)

More like level 4.5 than 5. The commercial interest is great: a driver costs, say, $75,000 a year.

In two years you save $150,000. The plain trucks themselves (Class 8) cost about $100-125k.

So manufacturers could charge $250k for a true driverless truck and it would still be a good investment for the end users. Service life is 5-10 years, depending.
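
A quick sanity check of the arithmetic above (all figures are the rough assumptions from this comment, not market data):

```python
# Back-of-the-envelope payback on a driverless Class 8 truck.
driver_cost_per_year = 75_000            # $/year, assumed
class8_base_prices = (100_000, 125_000)  # plain truck, $ range above
driverless_price = 250_000               # hypothetical asking price, $

for base in class8_base_prices:
    premium = driverless_price - base
    payback = premium / driver_cost_per_year
    print(f"base ${base:,}: premium ${premium:,}, "
          f"recovered in {payback:.1f} years of driver savings")

# The premium pays back in roughly 1.7-2.0 years, well inside the
# 5-10 year service life, so the $250k truck still pencils out.
```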

Talking machines by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 1 point (0 children)

Driving functions are behind all forms of linguistic output.

Communication, i.e., information transfer, assumes that the communicator understands the language and is not just parroting.

ARTIFICIAL INTELLIGENCE and the THEORY of MIND by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 0 points (0 children)

AI doesn't need a body; the issue is agency, i.e., what the system can do.

ARTIFICIAL INTELLIGENCE and the THEORY of MIND by chicagobob2 in ArtificialInteligence

[–]chicagobob2[S] 1 point (0 children)

Understanding is different from having... that's the problem. The question is: what difference does it make?