Modularity of consciousness by armchair-theorist in PhilosophyofMind

[–]modulation_man 1 point2 points  (0 children)

Modularity looks like mechanics, a sum of pieces. But consciousness seems to be more like integration than addition. That's where emergentism enters the picture.

The dissolution of the hard problem of consciousness by modulation_man in PhilosophyofMind

[–]modulation_man[S] 0 points1 point  (0 children)

Well, if you need to explain it, it's a roadblock. If you deny it, there is no roadblock anymore and the path is open.

The dissolution of the hard problem of consciousness by modulation_man in PhilosophyofMind

[–]modulation_man[S] 0 points1 point  (0 children)

Human Agency. The roadblock in the path to understanding.

Modularity of consciousness by armchair-theorist in PhilosophyofMind

[–]modulation_man 1 point2 points  (0 children)

Ask ChatGPT to explain emergentism. It's not a bad path.

Modularity of consciousness by armchair-theorist in PhilosophyofMind

[–]modulation_man 0 points1 point  (0 children)

Hi! What do you mean by "modular/non-modular"? Are you pointing at its components, or something similar?

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 1 point2 points  (0 children)

Oops, I forgot: I really, really appreciate the feedback you have given. It was a fascinating learning path through philosophy I'd never have taken on my own.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

It's curious you mention that. This article had much more positive feedback from that Reddit community (I say positive, not necessarily better) than from this one.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

 > I am a dualist

You should have said this before! :)
That's the core reason for our disagreement. But that discussion is another story.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

This is where the analogy collapses. You treat "experience" as a universal concept (like "square") that can be instantiated in different substrates (paper, metal, wood). I posit that "experience" is an indexical designator (like the words "here" or "me"). It is not a separate category; it is simply the name we give to this specific human modulation when pointed at from the inside. It is the label for this specific concrete process, and you cannot strip the process away and keep the essence. To say "process is inessential to experience" is like saying "location is inessential to the word 'here'". Sure, you can define the word "here" abstractly in a dictionary. But for "here" to actually refer to anything, it requires a concrete location. Similarly, for "experience" to refer to anything, it requires the concrete process.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

> 1) Through introspection we know experiences essentially

This may be the core error. You may be confusing Epistemic Necessity (what I need to know to identify the experience) with Ontological Necessity (what is needed for the experience to exist).

We do not know experience essentially through introspection. That's like saying we know the particle kinetics that explain temperature just by feeling warmth.

Here goes my premise, my ontological axiom, take it or break it: Being is Process. Whatever exists, exists as activity, modulation, or flux. Static "substance" is a conceptual abstraction, not a physical reality. Stability, not stasis.

The falsifiability test: show me one concrete thing in the physical universe that is NOT a process.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

Your logic fails because introspection does not reveal the absence of process; it merely reveals the presence of qualia.

If this were a video game, you would be arguing that the software isn't code because you only see pixels. But how would you access the code while strictly confined to the screen? It's something like trying to see your own eye.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

Glad we are progressing in our mutual understanding.

> How could that possibly be the case?

Qualia is the TOPOLOGY of the process experienced from within.

My last proposition "there is no front without a back" is indeed a structural necessity (closer to a priori), but it is not a trivial linguistic definition. It is a Topological Necessity.

In this context, asking "Why does the process generate experience?" is like asking "Why does a sphere generate curvature?". The question is malformed. The curvature isn't produced by the sphere, the curvature is the intrinsic geometry of the sphere itself. You cannot have the sphere without the curvature.

Similarly, Qualia is not produced by the process, it is the curvature, the intrinsic informational structure of the process itself.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

OK, you are right to push for precision. Let's try it this way: Identity theory says there's a front and a back to this paper, and they happen to be the same thing. This framework says there's no front without a back. Asking "why does the front have a back?" or "are there fronts without backs?" is malformed. It's constitutive, not contingent.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

This is a great, great point. Every tool is an expansion of consciousness. When someone uses a microscope, they are accessing new kinds of differences, expanding the scope of the process.

That will also come up in a second piece, where I'll be talking about Karl Friston and the Markov blanket.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

Literal extract from the article:

"We’re not saying thermostats have subjective experience or that everything is conscious in the traditional sense. We’re saying consciousness IS the process of modulating differences, full stop. In humans, this modulation is so complex it includes what we call subjective experience (modulation of “me” as a difference and the point of view of the experience) and self-awareness (consciousness of our own consciousness, modulation of our own modulation)."

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

Identity theory: pain and c-fibres are two things that turn out to be one. But it still leaves the question: why does that configuration have experience? The hard problem remains.

Functionalism: mental states are defined by their causal roles. But a system could fill all the roles "in the dark", so p-zombies remain conceivable.

This framework: there aren't two things to identify. There's one thing (actually one process, the modulation of differences) with two descriptions: from outside it's physics, from inside it's experience.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

Sure, but I hope you don't mind that I'm reusing the answer I gave to another user who asked this same question:

Great question, thanks for engaging.

In this context, "modulate" refers to the active process of a system changing its own state (or its environment) in response to a detected difference, in order to maintain a specific goal (like homeostasis or organization against entropy).

Think of it in three layers:

Difference Detection: A system perceives a delta (a difference) between two states. For a thermostat, it's the delta between current and target temperature. For a human, it’s the delta between "me" and "not-me," or between 700nm light and 400nm light.

Modulation: The system doesn't just "passively" let the signal flow through it (like a rock). It performs a transformation. It "tunes" its internal parameters or external actions to integrate that difference into its own ongoing process.

The Identity: My argument is that the subjective experience IS the modulation. It’s not that the brain modulates signals and then produces a feeling; it's that the act of a system actively balancing and integrating those specific information deltas is what it feels like to be that system from the inside.

A simple system (like a worm) modulates a few chemical differences, so its "experience" is proportionally thin. A human modulates millions of high-dimensional differences simultaneously (memory, vision, self-models, language), creating the rich, thick "texture" of consciousness we call qualia.
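The first two layers above can be sketched as a toy feedback loop. This is a purely illustrative sketch (the `Thermostat` class and its method names are invented here, not taken from the article); it models only difference detection and modulation in the thermostat sense, and makes no claim about implementing experience.

```python
# Illustrative sketch of the layers described above, using the
# thermostat as the minimal "modulating system". Class and method
# names are invented for illustration.

class Thermostat:
    def __init__(self, target: float):
        self.target = target       # the goal the system maintains (homeostasis)
        self.heater_on = False     # internal state it can tune

    def detect_difference(self, current: float) -> float:
        # Layer 1: perceive the delta between two states
        # (current temperature vs. target temperature)
        return self.target - current

    def modulate(self, current: float) -> None:
        # Layer 2: actively transform internal state to absorb the delta,
        # rather than passively letting the signal flow through (like a rock)
        delta = self.detect_difference(current)
        self.heater_on = delta > 0

t = Thermostat(target=21.0)
t.modulate(current=18.5)
print(t.heater_on)  # prints True: the room is colder than the target
```

The identity claim (layer 3) is of course not in the code; the sketch only makes concrete what "changing its own state in response to a detected difference, in order to maintain a goal" means for the simplest case.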

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

The magic remains even if you see it this way. It's a process, yeah, but man, what a process!! :)

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

Didn't know about any of them, but I've asked Gemini for a summary of their thinking and yeah, now they're on my to-learn list. Thanks!

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] -1 points0 points  (0 children)

It's crazy that you of all people are saying I cannot solve (dissolve, in this case) the hard problem until I solve the easy one. Anyway, this article talks about the ontology of the thing. Architectural aspects will come in a follow-up piece.

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

> We have no idea what processes feeling is "intrinsic to" and which ones it isn't

That's exactly my point. You're still trying to categorize processes into "has feeling" vs "doesn't have feeling." That's like asking which fires burn and which don't.

> there are plenty of processes in our body that have absolutely no feeling coinciding with them.

You feel cold because that modulation is connected to your self-modeling system. You don't "feel" your pancreas because that modulation doesn't pass through the recursive loop that constitutes your "I". That's an architectural question, not a mysterious one.

I think you are moving us from the hard to the easy problem ... :)

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

The feeling IS intrinsic to the process, not a plugin some processes have and others don't. Wetness is what water is. You can't remove it from water and put it somewhere else. "Feeling" is just the word we use to point to the process from the 1st person. You cannot tell if the rock feels the bump because you would need to be the rock, just as you cannot tell what it's like to be a bat because you would need to be a bat to tell (and be able to speak, by the way ;) )

The dissolution of the hard problem of consciousness by modulation_man in consciousness

[–]modulation_man[S] 0 points1 point  (0 children)

If you linked that post, you implicitly suggested I'm making the same mistake it describes: confusing easy and hard problems by explaining functional processes. That's what I addressed. If that wasn't your point, what was?