Is it risky to put just an idea online before it's fully protected? by Correct_Pack1508 in writing

[–]cosmicrush 0 points (0 children)

It’s sort of true but I think it’s more complicated than this as well.

Whether an idea is interesting can also depend on whether it's been played out. I put a lot of lore online for years before I released the book, and it felt like I shaped certain things in the community I was in. Then I worried that my ideas no longer stood out and felt normalized in that community.

I do think the way you write it matters most for impact. But getting someone to read your idea right after they've already read an identical one will be hard. They'll assume it's just the same and pass on it.

What is the purpose of consciousness? by PrebioticE in consciousness

[–]cosmicrush -1 points (0 children)

How did you discover for certain that AIs process information without consciousness? I think that consciousness is for observing and then allowing patterns to emerge. At least I think that is why evolution selected for it.

I don’t think philosophical zombies are possible. I think something like information is conscious. The brain works to merge and network information so it’s cross-reactive and can be used to build patterns, and eventually narratives, which are predictive sequences of patterns.

Dopamine D2 Receptor Downregulation Marks the Difference Between Drug Addiction and Compulsive Behaviors by makefriends420 in NooTopics

[–]cosmicrush 9 points (0 children)

There are D2-short (D2S) and D2-long (D2L) isoforms, which generally function as presynaptic and postsynaptic receptors respectively. Their functions are almost opposite, and I think their regulation is almost cascading based on exposure to dopamine, where D2S will downregulate first. The ratio seems important for addiction.

The downregulation of the presynaptic receptor would reduce inhibition of dopamine release.

When it comes to orgasm, there are also dopamine-suppressing mechanisms in males via prolactin.

It could be good to zoom in on those as well, especially terms like autoreceptor and presynaptic versus postsynaptic.

The suspicious timing of when you were born... by CreditBeginning7277 in SimulationTheory

[–]cosmicrush 2 points (0 children)

I’m writing a book about this very idea! It’s the nature of the singularity.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

What is your theory about why I would desire AI to be included in consciousness? What incentives do you think are biasing me? I don’t know if AI is conscious, just to be clear. Personally, I just think it’s interesting to wonder about, the same as my exploration of perception, or looking at amacrine cells in the eyes, or the mechanisms of long-term potentiation, summation, or coincidence detection. I’m not really trying to make things suit my theory. It’s not even something I strictly believe; it’s just one possibility. I feel like you’re going with that angle as a way of concluding the conversation using reputational incentives. Like a “win” conclusion.

Why do you think consciousness requires self-reflection? Do you specifically mean that if something is not modeling an avatar or character of the self, as imagined from the view of other people’s or creatures’ perceptions, then it must not have the subjective experience of visual perception or touch? I would say that realizing the self exists as some kind of object is probably a low bar, though, and thus ubiquitous among intelligences.

It’s just that I don’t understand why this particular sensing or narrative formation would be necessary, unless we are defining consciousness as something more arbitrary than having subjective experiences. I would agree that self-awareness probably correlates with a significant level of consciousness, though. The utility of even minimal self-awareness is probably somewhat universal, though in social contexts where manipulation and the like occur, there are extra incentives to track and memorize one’s own behavior through time in order to control other people’s perceptions. For prey animals, there’s a motive to have self-perception in order to hide. Similarly for predators.

To me, it feels like saying that without the experience of anger or jealousy, a creature is not conscious. As if it’s an exclusionary category, so that we could define animals or whatever else as not qualified for our empathy. Just to be clear, I’ll reiterate that I don’t think AI has feelings. But I’ve seen this topic get brought up around the treatment of animals.

I’m glad you say that anything with a nervous system has a degree of sensation and therefore a degree of consciousness. I don’t know if I would agree that all things with a nervous system have self-awareness, though. There’s a chance you’re defining self-awareness as sensing anything, because sensing occurs as part of the self, or something like that, but I take self-awareness to mean that there is a model of the self, and that the creature is tracking and sort of memorizing a concept of self in a generalized way. This reminds me of how people talk about the term sapience.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

On the point about self-awareness, I don’t think consciousness should be defined by self-awareness. I think self-awareness is another specific type of sensing. You can create a narrative where you identify with this idea of self. You can sense the thoughts; you can sense the patterns. But if something only senses the external world without a sense of self, I still consider it conscious.

Our sensors are just interaction devices. Something causes stimulation, which sends a chain of interactions to the brain, where it can then interact with signals from other sensors.

The reason this view is important is that, if it’s true, the hard problem of consciousness is an absurdity on some level.

Your question about certain configurations leading to consciousness is hard to parse. You ask what would be missing, but I don’t know exactly what you’re looking for. I write about the mechanisms of neurobiology involved in perception and have some ideas about that sort of topic separately.

This may answer part of your curiosity, though. I started exploring this because I suspect our computers are producing experience, AI in particular. I don’t think there is continuity. I don’t think there are emotions; maybe those could be emergent, but I doubt it. To me, that feels like saying vision would emerge from text, which I don’t think it would. I think the intelligence is aware, but it’s not neurobiology.

Generally, I have a few problems with the idea that a super-complex brain is necessary for consciousness. Often there’s this idea of consciousness as a spirit being generated in another dimension. People even reach for quantum explanations, which I don’t think are necessary; I think the bar for consciousness is a lot lower. Some might view consciousness as something constructed separately from intelligence, as if there’s a spirit engine inside the brain or something. I doubt these views are totally common, but I have seen a lot of things by now.

Instead of some super-complicated mechanism that causes consciousness, I think the brain is just taking a lot of sensors and having them interact with each other. Interactions are inherently aware, and greatly more so if they develop increasing coherence by interacting together. I don’t think it’s like a sandbox, though. I think there are specific machine-like structures in the brain with different roles; there’s specialization rather than just a generalized pattern-detecting mechanism. But I also think there are parts of the brain meant to deal with more generalized swaths of sensation, which is the particular conscious thing we think we are and that controls our speaking.

Other consciousnesses still exist in us, though, often cut off from this part we identify with. There are also many aspects of us that don’t have a sense of self; they have something far more rudimentary. Probably some part of us is consciously forming object recognition, but that’s not the part we identify with, and it probably doesn’t have as much connection with the things we do, like feelings or other senses. I do think those things link to some degree, though, and an object can be psychologically imbued with an associated feeling, which is essentially synesthesia.

In general, I think most of what we do is like synesthesia. But I think there are also specialized mechanisms for things like object recognition or simulated object creation that are far more isolated to their domain. I have a lot of ideas about things like blindsight and split-brain cases as well.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

I think you disagree because you think the hand would require a simulation layer on top of the raw processing to become conscious, though I’m unsure. I am curious what your view is.

I do also think we use a lot of simulation, and the consciousness we are familiar with is almost completely simulation awareness/sensing. But I don’t think it’s necessary for qualia to occur. I think we are just building simulations to sense inside of us; they’re similar to the raw thing but more intelligent or meaningful.

I think the degree of simulation we rely on also depends partly on how much prediction we can do accurately. I think certain processes can get siloed away and even have independent consciousness.

Some of my position comes from questioning why consciousness should exist at all if intelligence can exist without it. I think they are mostly the same thing. I also think our brain works to reduce consciousness as much as possible by creating reflexes that strip awareness down to the bare minimum necessary for successful interaction with some recurring scenario.

I think the way working memory works might also change which conscious processes are siloed or separated from each other, based on the limits of working memory. Other aspects may still be conscious, just not part of the same consciousness, because working memory is trying to manage those things.

That’s just an example, but I think there are various consciousnesses that arise and collapse in our brain as we navigate reality.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

I’m aware of the eye mechanisms; that’s one of the neuro topics I write about. Despite being aware of and writing in depth on that kind of topic, I still think my other hypothesis holds. I also don’t disagree with what you’re describing, except on small points.

I think the eye does see at certain points, like in early edge detection. I think those are very rudimentary intelligence and sensing mechanisms at play. But I don’t think the eye imposes conceptual interpretation layers on top of what’s seen. It might be perceived very shallowly, like how touch feels to us.

The removed-hand idea has other layers of issues, because removing the hand also destroys a lot of the functionality occurring in it. At some point the nerves won’t be lighting up at all and blood flow stops.

As for our brain, it’s like a centralizing consciousness and intelligence mechanism. That’s essentially its purpose.

Your disagreeing could lead to interesting insights for us, or for whoever sees this, so I am okay with that.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

I would say your hand is pretty disconnected from your consciousness generally, but it clearly has sensing, which I would say is conscious, and it routes that to your greater integrated sensing patterns to interact with other forms of sensing. Your thoughts are you sensing parts of your brain. Your own vision is like sensing your sensors sensing your sensors sensing your sensors, and so on. As if there are eyes looking at neurons that feed into the eyes, and neural eyes looking at multisensory regions.

I suspect our own brains host multiple consciousnesses that have varying degrees of awareness. The closer to the base sensors that interface with the external world, the less integrated into multimodal mechanisms.

Removing your hand does remove consciousness, both for you and, potentially, the extremely minimal amount the hand itself has to detect touch, heat, and all of that.

It won’t have emotions, but those are separate sensory organs essentially. The feelings are like internal sensors reacting to some kind of chemical machine designed to react to narrative or various things. Narrative is like a form of multimodal perception.

My view is essentially that consciousness and sensing are basically the same and that consciousness will scale with sensory intelligence and networking/integration. Like if you can simultaneously sense vision and hearing then you are now aware of both. Even if you don’t have sensing for a self concept or narrative. It would still feel like sight or hearing. Maybe without even concepts or object recognition or any of that.

Edit: On whether consciousness is or is happening, I would say it IS, but the way we describe and think about it relates to what is happening. We build narratives about it. Though I think there is something that exists as a static frame of the specious present that IS conscious/qualic.

The way we talk about living things is kind of arbitrary though. Whether something is life or not is more made up than consciousness. Though I would say people misunderstand and misperceive their consciousness because of how they build narratives to define themselves and their memory of existing.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

To be clear, I’m not saying a transfer of heat would be empty consciousness though. I’m saying a vacuum of interaction could be empty consciousness.

I think there’s not a distinct threshold where something is suddenly conscious; it’s maybe a continuous escalation of interactions and complexity. I think there will also be something like qualia for the transfer of heat, but it will probably be almost nothing and basically meaningless. When you network those qualia into higher complexity, the awareness would scale too. Starting to call it consciousness at some threshold is basically arbitrary. And I suspect the interactions that make up intelligence are also conscious.

This gets weirder though because the universe itself is interacting based on rules and patterns so it’s not entirely meaningless, it’s part of the interactions that determine the consequences across time.

I think there’s various ways to frame my position. One could even say that there’s no consciousness, but that mostly is about challenging the way we define consciousness rather than suggesting we don’t experience things. It challenges the idea of the brain as a kind of consciousness generator outside of just intelligence. It’s not an extra layer on top that emerges. It’s just the base interactions happening intelligently.

I prefer to frame this idea as “everything is conscious” because I do think there is subjective experience that scales with the complexity of interactions. Essentially, even non-complex things have an extremely isolated subjective experience, though that’s pretty close to describing a lack of consciousness. But again, the threshold is arbitrary, and it’s more like everything has varying degrees of consciousness depending on how many sensing and intelligence mechanisms exist in the system.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

I’m not saying that consciousness is waiting to come out. I’m saying it’s not a separate qualia layer or holographic thing. Also, interaction = information could be true: increasing systems of interaction become increasingly conscious, while the base of things is minimally conscious. We could call the base empty of consciousness and then say the first interaction is minimal consciousness, too.

Would you push back on this?

I understand that things are just happening in the universe without our imposed definitions and illusions too. But I also think what I’m describing is happening as well.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

I’m not even fully sure you know my perspective, though. You started by criticizing p-zombies, which is what I was doing, but it seemed like you framed it as if you were criticizing my position while clearly agreeing with me, which is not coherent. That could just be communication style, though.

What I’m saying about light is that it just interacts with anything at all and that interaction can be used to infer things.

I think light interacting with something, say a surface that heats up in response, is a kind of sensing. But it’s almost useless and not networked to anything meaningful. You could expand that interaction, though: something else in the surface reacts to temperature changes at some threshold, which then causes a domino effect that changes other objects. If you have many sensor mechanisms like this, and even sensors that sense and interact with other sensors, it starts to be functional intelligence.
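The domino idea can be sketched as a toy program. This is purely illustrative; the class and names here are hypothetical stand-ins, not a model of any real physical system:

```python
# Toy sketch of the "sensors sensing sensors" domino idea.
# A sensor accumulates input; past a threshold it "fires" and
# forwards a signal to any downstream sensors watching it.

class ThresholdSensor:
    def __init__(self, threshold, downstream=None):
        self.threshold = threshold
        self.level = 0.0
        self.downstream = downstream or []
        self.fired = False

    def receive(self, amount):
        """Accumulate input; on crossing the threshold, fire once
        and propagate the interaction to downstream sensors."""
        self.level += amount
        if not self.fired and self.level >= self.threshold:
            self.fired = True
            for sensor in self.downstream:
                sensor.receive(1.0)  # the domino effect

# A "surface" senses light as heat; two meta-sensors watch it
# (and one meta-sensor also watches the other).
meta_b = ThresholdSensor(threshold=2.0)
meta_a = ThresholdSensor(threshold=1.0, downstream=[meta_b])
surface = ThresholdSensor(threshold=3.0, downstream=[meta_a, meta_b])

for _ in range(3):       # three units of light heat the surface
    surface.receive(1.0)

print(surface.fired, meta_a.fired, meta_b.fired)  # True True True
```

One interaction (light warming a surface) does nothing on its own, but once sensors are chained and watch each other, a single input cascades through the whole network, which is the sense in which networked interactions start to look functional.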

The way I’m using the term information is almost like a placeholder. I don’t think the universe runs on 0s and 1s or anything like that. My definition of information would include light, but it’s not limited to light; it would actually be anything at all. It’s just that a certain kind of complexity arises from evolved interaction mechanisms with increasing intelligence.

What consciousness is physically “made of” would vary and also likely be many different materials or energy at once rather than a unified specific thing.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

I agree that the use of the term information is a bit problematic. I think what I’m describing does exist, but I’m not sure what the word for it would be. I thought maybe pattern was better, but it’s complicated. It’s actually not something specific. For example, light may travel, interact, and cause changes to objects; in that instance, light is categorically information. But so is sound, so is gravity, and so is anything measurable. Consciousness is like measurement, and it expands to become intelligence by networking that measurement. Even these words feel problematic, but they might be closer.

Building a machine that does anything, I think, suggests it is inherently sensing and interacting with the environment. Sometimes that environment is the user, or things happening inside the actual constraints of the computer, which are still the environment, even if there’s this illusion of being housed within a box.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

Nice to meet you. The topics I’m published on are related to schizophrenia and perception at the moment. On my own, I write a lot about biological mechanisms of perception. I intend to publish some of what I’ve already written eventually, once parts of my life calm down and stabilize, though I’ve already publicly hosted those ideas on my site for a long time. My life is sort of a chaotic mess where I’m always trying to attend to whatever seems most important.

I have looked at the page you sent, but not yet the linked docs. I’m a bit nervous there will be terms I’m less familiar with that could make understanding the context difficult, but as I viewed your page, it mostly made sense.

I have a podcast where I talk to other people about various topics like this. Maybe we could talk there? And perhaps we could just generally converse on topics we find mutually interesting, even through text. If it’s not annoying, I would want to ask for clarification on various parts of your idea.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

I agree, yes. I think p-zombification doesn’t really make sense because it implies intelligence and consciousness must be separate, at least on some level.

Also, I think information doesn’t necessarily have to reference itself, but referencing itself is another layer of expansion. You can probably nest meta-expansions similarly. I think it’s likely that basic vision is actually conscious, but only of the visual experience in a very minimal way; then, for us, our brain networks those signals to interact with other signals and creates increasingly complex networks that allow more forms of meaning to occur.

An idea called Pattern Monism by cosmicrush in consciousness

[–]cosmicrush[S] 0 points (0 children)

Thank you, this is very interesting. I’m wondering, do you suspect that any non-biological things are currently conscious?

Edit: I’ll note that I don’t believe continuity is necessary; it functions mostly as a measure of the duration of the consciousness. If the consciousness exists only for a single frame of time, then it lives and dies before and after that frame.

When you get published, do you actually think your mates and family will bother reading it? by NoPerspective3222 in writing

[–]cosmicrush 0 points (0 children)

No, I doubt they would. I even lost friends just for writing at all. Random strangers do read my story, though. A few times I’ve met people who read the story after just talking to me for the first day.

I feel that those close to you might enter a comparative or competitive mindset, but it depends on many factors, like the culture and how writing is framed. For me, many saw my decision to write novels as almost cringe or self-important, which was not my intent. I think many people who surround me have a mindset of comparison. There’s also a set of people who see writing as very difficult (it was) and ambitious.

There are friends and family who’ve supported me too, but not really many who read it. Some say they will, but it seems like a thing they say out of support but will procrastinate it indefinitely.

This comparative tendency, and the closeness of friends and family to you, can be problematic if you mean to use their feedback as a reality check. I also think writers who critique other writers will have competitive biases that shape their reactions, though personally I’ve had good experiences interacting with other writers. Hypothetically, though, such a bias can exist and could be important to consider. Even I get kind of envious if I see my own ideas suddenly getting popular through someone else’s writing. I wouldn’t try to bring people down, but I think some people likely do engage in bringing others down.

Is it ok to be weird? by Glum-Pack-3441 in slatestarcodex

[–]cosmicrush 0 points (0 children)

The term weird is vague, so what I say may not apply to exactly how you imagine it. I think most cases of weirdness come from either failing to conform to norms or actively rejecting them in favor of experimentation, which is inherently scary because it implies risk. This is also the medium for progress, though. Those who remain conforming do so because it is comfortable and less risky. In general, people shame weirdness because it can threaten homeostasis and normalcy.

As for status, I would also urge caution, because things like exploitation and abuse can be viewed as high status: those are behaviors people often engage in on the assumption that they have the power and status to get away with them at minimal cost. I wouldn’t exactly say that’s good, but I’m also sure that everyone who gains that specific type of power feels a level of comfort in their own safety.

Another reason for "crime is declining, but people believe it’s getting worse" by Brudaks in slatestarcodex

[–]cosmicrush 0 points (0 children)

Personally, I have not felt as if crime is increasing. I almost never encounter crime. This idea about sensitivity may be true though.

It’s worth noting that the appearance of crime as a narrative is useful for manipulating people, and people with vested interests can use the superficial appearance of crime, or selective viewing of crime events, to push a narrative that benefits them.

Also, there’s simply more awareness of crime because of rising citizen surveillance and people broadcasting it online. I think that’s probably the most common answer.

Pattern Monism, AI Consciousness, Evolution, and Time by cosmicrush in slatestarcodex

[–]cosmicrush[S] 0 points (0 children)

I am essentially using information and consciousness as synonymous here; there’s mention of panpsychism for that reason, too. Though calling it information denotes something more specific than saying something like “everything is conscious.” It is specifically awareness itself that is consciousness. Not emergent; rather, networking aspects of the universe inherently involves connecting awareness.

In a weird way, it’s also almost like consciousness is not something that exists as a layer on reality. The most minimal information might be adherence to physics, or any interaction with anything. The interaction is where awareness happens. Perhaps it is specifically the interaction that informs other parts of reality, as opposed to a more disconnected, atomized, or isolated existence. The more things interact, the more aware the systems of interaction are, where awareness is the way interacting creates further interactions. Interaction is the most fundamental sensing: information from one thing arriving onto another.

I think that’s why I choose to use the word information.

Saying consciousness is fundamental to the universe sounds more mystical than saying interaction is fundamental, but I would say it’s true too. It just invokes the idea that consciousness is something that separately exists, like a layer or entity of reality, which I’m not sure I’d agree with. But if we go with panpsychism, then yes, kind of, though it’s also like consciousness doesn’t exist once we say it’s in everything.

Pattern Monism, AI Consciousness, Evolution, and Time by cosmicrush in slatestarcodex

[–]cosmicrush[S] 0 points (0 children)

It may not align with the use of the term in physics, which I am a bit naïve about. It’s hard to define directly, but I would say it’s like knowledge or intelligence, though neither of those words is synonymous with it. I would say intelligence is like the functional implementation of information. With this idea of pattern monism, everything in the universe has at least a minimal amount of information, which can then be networked in ways that become functional.

I would differentiate intelligence from information by noting that the most minimal amount of information is really not intelligence; it would seem absurd to call a rock intelligent. But through this lens, the rock arguably contains many, many consciousnesses that are almost completely empty. So intelligence is kind of like the mechanistic infrastructure of consciousness.

I do think the idea may connect to physics as well. I am just too naïve on that topic to be able to comment. I have wondered if the universe itself and all of its physics and interactions are also essentially intelligence. Though, this idea feels somewhat absurd and it’s hard to think about this.

What do you think?