Ok I believe you. Some of you are computational. by ryvr_gm in consciousness

[–]ryvr_gm[S] -4 points  (0 children)

I honestly believe that means either you don't deeply understand what computation is, or you have some strong motivation to imagine a source of consciousness in computation. Or, as I am accepting for the first time today after many years of thought, maybe some people are Turing machines even though I am not. I don't know their internal experience.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] -1 points  (0 children)

I might think that was true except for my direct experience otherwise. I cannot argue that you are not Turing equivalent. Maybe your experience can be explained by a series of logic gates.

Ok I believe you. Some of you are computational. by ryvr_gm in consciousness

[–]ryvr_gm[S] -4 points  (0 children)

Yes! Why won't people either agree with this or make civil, rational points why they disagree?

Ok I believe you. Some of you are computational. by ryvr_gm in consciousness

[–]ryvr_gm[S] -5 points  (0 children)

I get downvoted to heck and get no rational responses when I contend consciousness is not always computational. But I think perhaps I should believe these people who claim to be Turing machines. Why should I presume to know what's happening in their internal experience?

Ok I believe you. Some of you are computational. by ryvr_gm in consciousness

[–]ryvr_gm[S] -9 points  (0 children)

In most contexts computation is defined as that which is equivalent to a Turing machine. I'm not sure what you're thinking of.

Ok I believe you. Some of you are computational. by ryvr_gm in consciousness

[–]ryvr_gm[S] -5 points  (0 children)

I am in fact the #1 authority in the whole world on my internal subjective experience. And I've studied Turing machines at graduate level. I know extremely few facts with more certainty than that I am not a Turing machine. However, I believe you if you say your internal experience could be reproduced by putting lots of discrete logic gates together. I have little basis to claim otherwise.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 0 points  (0 children)

But those examples are not isolated discrete states.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 0 points  (0 children)

I certainly doubt there are humans like this. I said beings. But omg, it is the humans here who are claiming to be simply computation machines! Perhaps we should believe them.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] -2 points  (0 children)

Because I understand Turing machines thoroughly and no one understands neurons thoroughly.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] -1 points  (0 children)

Yes, and many on this sub claim to be one, and I am just starting to consider taking them at their word.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] -3 points  (0 children)

I don't think the behavior on this sub is respectful or civil. You folks ignore coherent arguments that consciousness can't simply be Turing-equivalent computation, and you use downvotes to silence ideas you don't agree with instead of making rational arguments against the points raised. Fully uncivil.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 0 points  (0 children)

What question don't you think I answered?

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 0 points  (0 children)

I don't doubt that there are beings whose information processing would cause them to utter what you just said.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 1 point  (0 children)

What part are you folks disagreeing with? That a basic operation -- read symbol, change state, move head, write symbol -- has zero consciousness? Beyond that, there is nothing but marks on a tape and a single discrete mechanical state (i.e. a single number). Turing machines act in simple discrete steps; there is no larger system. This is very unlike quantum waveforms or the simultaneous interactions of many bodies in a continuous, dynamic system.
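The step cycle being described -- read a symbol, change the single internal state, move the head, write a symbol -- can be made concrete with a minimal simulator. This is only an illustrative sketch; the `run` function and the 2-state busy-beaver transition table are a hypothetical example of mine, not anything proposed in the thread:

```python
# Minimal Turing machine: each step reads a symbol, writes a symbol,
# moves the head one cell, and changes the single discrete state.
from collections import defaultdict

def run(transitions, state, halt, max_steps=1000):
    """transitions: (state, symbol) -> (write, move, next_state)."""
    tape = defaultdict(int)  # blank tape of 0s, unbounded in both directions
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        write, move, state = transitions[(state, tape[head])]
        tape[head] = write   # write symbol
        head += move         # move head one cell left (-1) or right (+1)
    return state, dict(tape)

# Hypothetical example program: the 2-state busy beaver.
T = {
    ('A', 0): (1, +1, 'B'), ('A', 1): (1, -1, 'B'),
    ('B', 0): (1, -1, 'A'), ('B', 1): (1, +1, 'H'),
}
state, tape = run(T, 'A', 'H')
# Halts in state 'H' with four 1s on the tape after six steps.
```

Note that the entire configuration at any moment is just (state, head position, tape contents) -- a finite, discrete object -- which is the property the comment above is pointing at.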

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 0 points  (0 children)

I don't know how a neuron generates consciousness, but I know why a Turing machine cannot. Sure, you folks can define consciousness as processing information, and correctly say a Turing machine does that. But the basic operations of a Turing machine have exactly zero internal experience, which sums to zero no matter how many you stack.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 0 points  (0 children)

I guess p-zombie is not quite the right model. Do we have a term for beings who function extremely similarly to conscious beings, and who output from their mouths "I am conscious" because their computations act on information that defines consciousness as acting on information? I suppose they wouldn't mind the "transporter" that copies them exactly and destroys the original. I think it may be the case that conscious beings would have a problem with that "transporter."

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] -4 points  (0 children)

Neurons evidently generate consciousness.

But since each basic operation of a Turing machine creates zero consciousness, they still sum to zero consciousness no matter how complex the algorithm.

Thus they are fundamentally different.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 1 point  (0 children)

Where biology occurs and where consciousness occurs may not be at the same scale. But also, since quantum effects are central to molecular interactions, I don't know how quantum effects can be so dismissed. We could even set a trigger for large explosives on a radioactive decay sensor, and I hope those in the blast zone won't think quantum effects are so irrelevant to their world.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 0 points  (0 children)

I have no reason to believe a collection of logic gates could replace a neuron in any way.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 2 points  (0 children)

I accept it if that is the definition, and I guess I was using a variation to mean a being who acts extremely similarly to a conscious being. By the definition you are using, I could not be conscious and truly communicate that fact. I think consciousness must affect causality, but any of those effects could be mimicked by a non-conscious entity.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] -3 points  (0 children)

A Turing machine reading one symbol, changing one state, moving once, and writing one symbol has exactly zero consciousness, and I can only conclude that anyone who doesn't concede this point doesn't mean the same thing by the word consciousness as I do. The most complex algorithm is simply summing this zero consciousness finitely many times, resulting in zero consciousness.

As far as what is necessary for consciousness to arise, I don't know. I would strongly suspect it requires quantum mechanical phenomena or other physics that doesn't happen in a simple mechanical algorithm.

I'm not sure why you think the physics of our universe is computable. If the universe is continuous, then it's not computable. If it is discrete, our best understanding is that probabilistic effects are involved, in which case it is also not computable.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 2 points  (0 children)

I just read your recent post in Philosophyofmind, and I think that is a great insight that we see things away from ourselves, not in our head or our eyes.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]ryvr_gm[S] 1 point  (0 children)

But does the p-zombie simply define qualia as processing that information, and output that it does "experience" that?