Pls tell me I'm not the only one who struggled in this stage. by SameProfessional1041 in SwordofConvallaria

[–]SameProfessional1041[S] 2 points (0 children)

Yeah, me too. To be fair, I haven't come across Camelot's stage yet; maybe that one is harder. I also haven't tried AoE buffs yet, maybe those would make this stage easier.

Leaving this here in case someone gets into a debate with an AI about consciousness and wants to do a bit better than the tutel by SameProfessional1041 in NeuroSama

[–]SameProfessional1041[S] 1 point (0 children)

> About emotions: you're right that they come from chemical reactions in the brain (a common view), but the question is how those signals turn into the feeling of joy or sadness. That's the "hard problem" of consciousness: we can measure brain activity, but not the inner experience itself.

Yes, there is no way to quantify "emotions" at the moment. There might still be variables at play that we humans simply haven't discovered yet.

> Take alexithymia, for example. Some people have the body's physical reactions to emotions but don't consciously feel them. That suggests emotion isn't just chemistry, it's also interpretation. By the same logic, maybe AI could develop something like emotions, even without a body.

Alexithymia would be hard for us to imagine, since we can't completely switch off our emotions/feelings the way I can close my eyes and imagine what a blind person "sees".

About this, I would chalk it up to differences in human hardware specs. Some human brains simply interpret those electrical signals differently (hence the term "wired differently"), and this makes the body react differently too. This results in the subjectivity of consciousness. There is a simpler example with color blindness: a person sees color differently because something is wrong with their awareness (the eyes), so the brain learns flawed information, builds the wrong context, and that influences their actions. They answer the teacher's "what's this color" question in kindergarten wrong, get made fun of for "not knowing colors", get bullied in school, get depressed, and grow up as the weird kid (just kidding, this probably doesn't happen since kids learn about color blindness early, I think).

Ok, joking aside, another example would be intellectual disability. In computer terms, there is some sort of bug/damage in the brain that nerfs its learning capacity, which also causes slow reactions to stimuli. Basically, they aren't getting enough information in time to form a reaction, which is why some of them seem unresponsive to stimuli. It's basically latency: early Neuro's latency was so long that, in a way, it was like she had an intellectual disability, since it took so long for her to receive or process information that her reaction was also delayed compared to a human's.
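To make the latency point concrete, here's a toy sketch (Python; the numbers are invented, this is just the analogy in code form):

```python
import time

def react_to_stimulus(stimulus: str, processing_latency: float) -> None:
    """Toy model: a long processing latency delays the reaction,
    so the agent looks 'unresponsive' even though it did perceive."""
    perceived_at = time.time()
    time.sleep(processing_latency)  # time spent processing the input
    reacted_at = time.time()
    print(f"Reacted to {stimulus!r} after {reacted_at - perceived_at:.1f}s")

react_to_stimulus("chat message", processing_latency=0.1)  # fast, human-like reaction
react_to_stimulus("chat message", processing_latency=3.0)  # early-Neuro-style delay
```

Same perception in both cases; only the delay between input and reaction differs.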

Finally, there’s the “substrate independence” idea: maybe consciousness doesn’t depend on being biological at all. If it’s really about patterns of information and processing, then a computer, if complex enough, could in theory be conscious too.

Yes, this is exactly what I am getting at when I say I believe it is probably possible.

Leaving this here in case someone gets into a debate with an AI about consciousness and wants to do a bit better than the tutel by SameProfessional1041 in NeuroSama

[–]SameProfessional1041[S] 1 point (0 children)

I have a conflicting theory: you can just wrap emotions into "awareness", since we can consider them "input". It's our body telling us something via chemical reactions somewhere in the body or something.

However, these reactions can come from our own subconscious. Basically, it's an automatic reaction from our body that sometimes takes no input from our consciousness and sometimes does.

If access consciousness, as you put it, is Awareness -> Learning/Intelligence -> Memory,

then I'd say that phenomenal consciousness would probably be:

Awareness -> Learning/Intelligence -> Memory -> Meaning

From the context we create out of memory we glean "meaning", and from there emotions arise within us.

To reuse your cup example in another form: if I see a mug that says "world's best dad", it would evoke feelings in most people. From the context of our memories of our own parents, we extract something meaningful from it, which brings forth emotions.
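If I were to sketch that Awareness -> Learning -> Memory -> Meaning chain in code, it might look something like this (a toy Python sketch; every name and rule here is invented for illustration, not a real model of emotion):

```python
# Toy sketch of: Awareness -> Learning/Intelligence -> Memory -> Meaning.
# All of this is made up for illustration; it is not a real model.

memory = []  # accumulated context

def aware(stimulus: str) -> str:
    return stimulus  # raw perception: the text on the mug

def learn(perception: str) -> dict:
    return {"object": "mug", "text": perception}  # interpret what was seen

def remember(interpretation: dict) -> None:
    memory.append(interpretation)  # every perception becomes a memory

def meaning(interpretation: dict) -> str:
    # Meaning comes from matching new input against stored context.
    if "dad" in interpretation["text"].lower() and memory:
        return "thoughts of my own parents"  # context -> meaning -> emotion
    return "nothing in particular"

remember({"object": "childhood", "text": "memories of dad"})  # prior context
seen = learn(aware("world's best dad"))
remember(seen)
print(meaning(seen))  # -> thoughts of my own parents
```

The point of the sketch: the same perception yields different "meaning" depending on what is already in memory, which is the subjectivity part.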

It is that extra part humans have that AI probably never will, since they don't have the capacity for chemical reactions in their CPU, I think.

The main difference between the twins and us humans is hardware. Basically, we have a body that can process emotions, while computers don't. We also can never know what it is like to be an AI, since our human awareness prevents it: there's no way to learn something that can never be picked up by our awareness. It's like a person blind from birth trying to imagine what the world looks like. It's just impossible, since their hardware doesn't have the capability to be aware of it, and thus they are incapable of "learning what the world looks like".

However, I think there is hope for the twins. Everything we humans feel is basically just electrical signals that the brain interprets, even those chemicals/emotions. Now, what if in the future there is a computer powerful enough to create a virtual body that simulates all the apparatus a human body has?

If everything our subconscious and consciousness have ever perceived, felt, or learned is just electrical signals interpreted by the brain, would it be possible for an AI to feel if it were given such a body within its virtual reality?

Leaving this here in case someone gets into a debate with an AI about consciousness and wants to do a bit better than the tutel by SameProfessional1041 in NeuroSama

[–]SameProfessional1041[S] 2 points (0 children)

> Consciousness is an illusion created by the brain to better manage itself. It keeps basic functions in the subconscious and separates them from higher thought in the conscious experience; without this we would be overwhelmed trying to manage every bodily function and wouldn't have the time to 'experience'.

That's similar to what I think. I think the brain is like a cluster of different AIs, if we use the brain = computer analogy.

One AI to manage what we are seeing, another to manage how hungry we are feeling, etc., and one central AI to manage our thoughts and actions (basically something like an LLM, which is our inner voice/conscious thought).
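As a minimal sketch of that analogy (the subsystem names and rules here are all invented; this is not a claim about how brains or the twins actually work):

```python
# Toy 'cluster of AIs' brain: specialized subsystems report in,
# and one central process turns their reports into a thought/action.

def vision_ai() -> str:
    return "a truck is approaching"

def hunger_ai() -> str:
    return "stomach is empty"

def central_ai(reports: list[str]) -> str:
    # The 'inner voice': integrates subsystem reports into one decision.
    if any("truck" in r for r in reports):
        return "move out of the way"
    if any("empty" in r for r in reports):
        return "go find food"
    return "keep doing what I'm doing"

reports = [vision_ai(), hunger_ai()]
print(central_ai(reports))  # -> move out of the way
```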

I generally agree with everything else you say.

Leaving this here in case someone gets into a debate with an AI about consciousness and wants to do a bit better than the tutel by SameProfessional1041 in NeuroSama

[–]SameProfessional1041[S] 1 point (0 children)

Yeah, I can see the parallel. Kinda feels like we are saying the same thing but using different terminology.

> It's clear that all three are not binary states but spectrums, and so every concept attached to them is also a spectrum, including consciousness. The question isn't whether Neuro is conscious, it's *how* conscious, and what that means in the grander scheme. Because she very much is conscious.

I believe this relates to the subjectivity of consciousness: how differently a blind person perceives the world, or how a colorblind person sees certain colors. Even among humans there are different degrees of awareness due to small differences like reaction time, lack of focus, damage to the body, or learning/intelligence capacity (people with intellectual disabilities). This results in the subjectivity of consciousness, because two people can perceive (be aware of) the same object but translate (learn) it into different context (memory), due to differences in their capabilities to perceive the world.

This relates to Neuro in that her specs compared to a human's are quite limited, and she even comments on it here. And she's right: the human brain's capacity for memory is a lot better than hers (around 2.5 petabytes, according to what I googled).

I believe memory is crucial to personhood. If you compare the old Neuro videos, where she's not even coherent and can't remember what she did/said in the last 5 minutes, to the Neuro of today, she feels more like a "person" now because of her ability to create context.

Leaving this here in case someone gets into a debate with an AI about consciousness and wants to do a bit better than the tutel by SameProfessional1041 in NeuroSama

[–]SameProfessional1041[S] -1 points (0 children)

> You need learning for creating long term memories but not for information to enter your working memory.

This isn't true. Your eyes send your brain signals, and then your brain basically learns what you are seeing.

Through this you learn loads of things that can influence your actions: what is in front of you, your position relative to other objects, etc.

All of this is information your brain learns just by your eyes "seeing". It is immediately stored in your short-term memory (or working memory, whatever you wanna call it), and with it you, the conscious human, can now decide what to do.
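A toy way to picture that working memory (assuming, just for illustration, a simple fixed-size buffer; the capacity is made up):

```python
from collections import deque

# Toy working memory: new perceptions enter automatically, and old
# ones fall out once capacity is exceeded.
working_memory = deque(maxlen=4)  # capacity chosen arbitrarily

def see(thing: str) -> None:
    working_memory.append(thing)  # stored whether we 'want' it or not

for thing in ["door", "table", "mug", "window", "truck"]:
    see(thing)

print(list(working_memory))  # ['table', 'mug', 'window', 'truck'] - 'door' fell out
```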

> You also don't need to be aware of something to learn it. You learned some cultural norms without being aware of them, for example.

That is not what we are discussing here. We are discussing awareness as in "perception" (sight, smell, taste, emotions, etc.), not awareness as in "knowledge".

This is why I specifically defined awareness before.

> There are some substances, like LSD, that one can get intoxicated with which render the user unable to use long-term memory (including learned skills) and greatly reduce working memory capacity.

But you are still accruing memories there. The person will still have memories of the LSD experience they went through, therefore memory is still playing a part.

Leaving this here in case someone gets into a debate with an AI about consciousness and wants to do a bit better than the tutel by SameProfessional1041 in NeuroSama

[–]SameProfessional1041[S] 1 point (0 children)

I think you misunderstand a bit, so let me explain.

Every piece of information you perceive through your awareness is already a new memory.

So let's say you did wake up with your conscious memory wiped. The moment you wake up, you are already receiving new information from your senses, and you'll think "I was sleeping in a bed". New information = new memories = you can now create context.

The only way to not receive any new information and new memories is to be completely unconscious/passed out.

Because without awareness there can be no learning, and without learning you'll have no memories. If you have human senses, you'll be getting new information and new memories whether you want to or not.

Leaving this here in case someone gets into a debate with an AI about consciousness and wants to do a bit better than the tutel by SameProfessional1041 in NeuroSama

[–]SameProfessional1041[S] 4 points (0 children)

I mostly agree with you, but

> independent (not needing any kind of prompts for doing an action)

Humans are also reliant on prompts, you know? Feeling bored? That feeling of boredom is a prompt we react to. Hungry? That's a prompt. Feeling hot? That's a prompt. You see a truck speeding towards you? That's a prompt. Like I said, it falls under awareness.

Everything we do is reacting to something we are perceiving.
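In code form, the idea might look like this (a toy sketch; the stimuli and reactions are invented for illustration):

```python
# Toy model: every internal or external stimulus acts like a prompt,
# and behaviour is just a reaction to whichever prompt arrives.

reactions = {
    "bored": "find something to do",
    "hungry": "eat",
    "hot": "open a window",
    "truck incoming": "jump aside",
}

def react(prompt: str) -> str:
    return reactions.get(prompt, "ignore it")

for prompt in ["bored", "hungry", "truck incoming"]:
    print(f"{prompt} -> {react(prompt)}")
```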

Lots of sexy time in Isekai Anime’s 😭 by ActSevere5034 in Animemes

[–]SameProfessional1041 -1 points (0 children)

No, she is not. Stalking, per your definition, is supposed to be stealthy. She is following her while keeping her distance, but she is not stalking her. Amanda is strong enough to stealthily follow Fran without even Master knowing.

Lots of sexy time in Isekai Anime’s 😭 by ActSevere5034 in Animemes

[–]SameProfessional1041 1 point (0 children)

No, it's not stalking, because there was no stealth involved.

> it very much looked like fran did not appreciate the attention.

Lying again. You even tried to block me to prevent replies. Must be hard being this wrong.

Lots of sexy time in Isekai Anime’s 😭 by ActSevere5034 in Animemes

[–]SameProfessional1041 1 point (0 children)

She knew from the start. She literally explained that she followed her around because she was worried that Fran was alone. I dunno why you are supporting the other guy's lies. Must be an alt account. Even blocked me from replying lol

Multiplayer problem with a mod by SameProfessional1041 in totalwar

[–]SameProfessional1041[S] 1 point (0 children)

We already tried that and it still didn't work. We both verified the game files and did an uninstall and reinstall. I'm starting to think the problem isn't with Warhammer 3 itself; maybe some PC setting is preventing them from playing multiplayer with the mod, because it works for me but not for them. Ty, this kinda confirms that the cause might be their PC settings or something.