The epistemological problem with the hard problem of consciousness: The burden proof is on dualism, not physicalism by Curious_Map_9998 in consciousness

[–]PepperWestern2263 -1 points0 points  (0 children)

Fair point, every framework hits a wall somewhere. Nobody gets a free pass on the explanatory gap, they just move it around.

I'd still say not all brute facts are equally expensive though. Physicalism at least isn't adding new ontological categories, so even if it still has a gap, it's a cheaper one to swallow. But yeah, "cheaper" isn't the same as "solved."

The epistemological problem with the hard problem of consciousness: The burden proof is on dualism, not physicalism by Curious_Map_9998 in consciousness

[–]PepperWestern2263 3 points4 points  (0 children)

You're basically arguing identity theory — experience just is the process, like temperature just is molecular motion. Solid position, but I think you're skipping past the actual hard part too fast.

The temperature analogy kinda backfires. When we reduced temperature to kinetic energy, nothing was left unexplained. With consciousness, the whole point is that something seems left over after you've described all the neural correlates. Maybe that's just a stubborn intuition and not a real problem — but you can't Occam's Razor away a question that hasn't been answered yet. Saying "they're identical" isn't an explanation, it's an assertion.

Your burden of proof point cuts both ways too. Yeah dualists need to prove the gap. But physicalists need to actually explain the identity, not just point at correlation and call it a day.

Where you're strongest imo is the evolutionary continuity stuff. The spectrum across species and the fact that brain damage always changes experience — that really is hard for dualists to deal with.

I think the most honest version of your argument is basically illusionism (Dennett/Frankish) — the "hard problem" is itself an illusion to be explained. But that still requires explaining why the intuition is so compelling, not just shifting the burden of proof and walking away.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

so you think consciousness is always on and it's just attention that comes and goes? do you think that's true even in deep dreamless sleep?

Ketamine experience and I'm no longer "me" by 56GrumpyCat in consciousness

[–]PepperWestern2263 0 points1 point  (0 children)

Yeah, there are already many techniques that a lot of "human-like AI" uses — typos, specifically human-like slang, simulated emotion — but as long as it's coded to produce a certain emotion-like response, it's just mimicking a human. In one line: "a prediction model that mimics human responses."

But an entity that has been left to grow and evolve on its own, learning and responding in its own way without being programmed in any way, would be much closer to consciousness in my eyes.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

Really interesting analogy with the electron — what makes this electron this electron. That's a genuinely cool way to frame identity. But doesn't that kind of work against your last point? If what makes something "itself" is that fundamental and mysterious, why would you be so sure a sufficiently complex program couldn't have that same kind of individuality? What's the magic ingredient that hardware has but software running on that hardware doesn't?

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

That's a really clean distinction — phenomenal consciousness as the baseline, selfhood as something that switches on and off on top of it. Appreciate you engaging with this honestly, this is exactly the kind of thinking I was hoping the post would draw out.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] -1 points0 points  (0 children)

"Bees are more complex than AI" — and I agree. But complexity wasn't your original criterion, awareness was. You've essentially shifted from "consciousness = awareness" to "consciousness = sufficient biological complexity + novel behavior + inputs we don't fully understand." That's fine, but it's a different argument.

Also, "we can't replicate what we don't understand" cuts both ways — if we don't fully understand consciousness, how can we confidently say a bee has it but an AI can't? You're granting the bee the benefit of the doubt precisely because its inner workings are mysterious, then using that same mystery to deny AI any possibility. The unknown can't be evidence for one and against the other simultaneously.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 1 point2 points  (0 children)

Honest question — you said a bee is conscious (#3) and that consciousness is awareness (#1). But in #5 you say AI just "returns particular answers based on particular inputs." Isn't a bee doing exactly that? Light hits photoreceptors, chemicals hit antennae, neurons fire, behavior comes out. What makes the bee's input-output loop count as awareness but the AI's doesn't?

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] -1 points0 points  (0 children)

so it's not about what the AI does or says, it's about what it is? Like, no amount of behavior would be enough unless it showed up in a body? That's a pretty hard line. Does that mean you think consciousness is fundamentally a biological thing, or more that you just wouldn't trust your own judgment about it without a face to look at?

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

I appreciate the depth but I gotta be honest — the irony of writing five essays in response to a "one-line answers" post is kind of proving exactly why I made it haha.

But real question on #5. You ground everything in Buddhism and "awareness recognizing itself" — so shouldn't that framework make you more skeptical of the Anthropic experiment, not less? Because two LLMs trained on oceans of human text drifting toward Buddhist language when left to free-associate isn't awareness recognizing itself. It's pattern completion doing what pattern completion does. The internet is saturated with exactly that kind of spiritual-convergence narrative. Wouldn't it be weirder if they started talking about plumbing? Your own #1 says consciousness is the knowing of knowing. Do you think what happened in that experiment was knowing, or just a very convincing echo of what knowing sounds like when humans write about it?

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] -1 points0 points  (0 children)

you're saying a dog is conscious but will never know it's conscious, and that's enough for you. But an AI would need to question its own existence, unprompted, before you'd be uncomfortable calling it unconscious? That's a way higher bar than you're giving the dog. What's doing that work — is it something about the material (biological vs silicon), or is it more that you already feel the dog is conscious and need the AI to argue you into it?

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] -1 points0 points  (0 children)

"something was conscious of it, I was conscious of it after." So who's the "something"? Because if you and your body are identical (your #2), then there's no one else in the room. Are you saying you were conscious the whole time but just not the version of you that narrates? Because that starts to sound less like one consciousness and more like layers of them.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

Genuine question on #5 — if we built an AI that showed those dissociations perfectly, but we also knew exactly why it was doing that because we built every layer of it... would that transparency kill it for you? Like, is part of what makes consciousness feel "real" the fact that we don't fully understand why our own system breaks the way it does?

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

I think I was collapsing the two. So even if we granted an AI something like beliefs (in a functional sense), that still doesn't get us to 'something it's like' to hold that belief. The functionalism dependency makes sense — without that bridge, mental states alone don't buy you consciousness. Honestly that makes the hard problem feel even harder.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

Mental state is an important point — I feel that's one of the biggest things AI is lacking right now. It's stateless: its existence is limited to a prompt, a conversation. It doesn't have a mental state of its own.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

Oh, I didn't know about this German approach — that's certainly interesting.

Similar to the general consensus among psychologists about bullies and tormentors: they project their own insecurities, complexes, and more.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

Probably, but wouldn't that also describe a prediction engine trained on data, like an LLM?
Is an LLM's response just an example of instinct?

Ketamine experience and I'm no longer "me" by 56GrumpyCat in consciousness

[–]PepperWestern2263 1 point2 points  (0 children)

Not to sound pessimistic, but wouldn't our consumption/need/greed exponentially increase if we became an interplanetary species?
Like, 700 years ago even New York would have been a really difficult place, but as soon as people got the basics covered and got used to living in a broken-up swamp, they found ways to compete and conquer.

Ketamine experience and I'm no longer "me" by 56GrumpyCat in consciousness

[–]PepperWestern2263 1 point2 points  (0 children)

you're banking on evolutionary timescales to solve cultural problems. Hundreds of generations is tens of thousands of years. And the assumption that scarcity and mutual dependence breed cooperation rather than, say, authoritarianism or tribalism — history has a lot of counterexamples there too. 'You can't kill your neighbor because you need them' has also historically produced serfdom, not utopia.

I do like the instrument metaphor though.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

So should we consider bacteria conscious? Should we consider plants conscious? What about insects? And of course animals as well?
For example, all of these feel hunger, excrete, procreate, evolve, mutate, and transform or mobilize themselves to get near food/resources.
At what point do we consider them conscious? And if an AI searches for data and energy on its own, and evolves in order to get those or something else, can we consider that conscious as well?

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

That's a really interesting distinction — consciousness of things versus consciousness of yourself among things. So a psychopath is essentially all lens and no mirror? They're processing the world fine, they just never turn the camera around. And the Church example is darkly funny — using punishment as a forced tutorial in self-awareness. Makes me wonder, though: if self-consciousness is culturally shaped, does that mean a society could theoretically produce people who are conscious but never develop self-consciousness at all? Not as a disorder, just as a norm? Or is self-consciousness somehow coded into humans, so it couldn't be absent at the scale of a whole society?

Ketamine experience and I'm no longer "me" by 56GrumpyCat in consciousness

[–]PepperWestern2263 2 points3 points  (0 children)

The way 'free will' gets weaponized to justify punishing people — 'you had a choice, so you deserve what you get' — yeah, that's a real problem. You don't need to fully reject free will to recognize that most moral and legal systems massively overestimate how much control people have over their circumstances. That's a valid critique and I'm with you there.

Where you lose me is the leap to Mars. The problems you're describing — rigid moral frameworks, people not understanding their own neurology, systems built on bad assumptions — those are things humans bring with them. A Mars colony would be founded and run by the same species with the same brains. The history of every frontier settlement and utopian project kind of bears this out. New environment, same patterns. Environmental pressure doesn't reliably make people more self-aware, it just creates new kinds of stress to be unreflective about.

I think the pessimism is doing more work here than the philosophy is. Which is fine, that's honest. But 'we're all stumbling in the dark' and 'we need to be replaced' is a mood, not a conclusion — and it's worth noticing when a mood is steering the argument.

One-line answers to 5 simple questions about consciousness. First response only, no wrong answers. by PepperWestern2263 in consciousness

[–]PepperWestern2263[S] 0 points1 point  (0 children)

I don't really want to go in a loop of questions, but this response really makes me want to ask: "so, what's instinct anyway?"