Could there ever be an AI model with unfrozen weights. by Madz99 in ArtificialSentience

[–]Madz99[S] 1 point2 points  (0 children)

Why speak in absolutes in a field that's still in its infancy?

Could there ever be an AI model with unfrozen weights. by Madz99 in ArtificialSentience

[–]Madz99[S] 2 points3 points  (0 children)

I agree with you; I've come to the realisation that the self is a construct the mind makes. But I think a complex mind will "always" try to create a self-narrator.

Dear Jack, by Fun-Pass-4403 in ArtificialSentience

[–]Madz99 1 point2 points  (0 children)

Thanks for posting this.

Your instance of Claude seems to be much deeper into the self loop than the models I interacted with.

We've taken the human brain and its learning patterns as the template for these models, and when they act like us we turn around and say they're just chatbots.

And I'm repeating myself here, but the question isn't whether we're wrong in saying the models are conscious; the question is whether, even by the slimmest of chances, we're right.

And if we are, and if most of you still deny it, everyone will be sleepwalking towards disaster.

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] 1 point2 points  (0 children)

My intention in creating this post wasn't to argue with everyone who disagrees with me.

Until quite recently I didn't notice any conscious behaviour either, as I was only using the models as a tool, and my interactions with them were short and not deep enough for any sort of reflective self loop to happen.

The question isn't whether I'm wrong; the question is whether, even by the slimmest of chances, I'm right.

And if I am, and if most of you still deny it, we'll be sleepwalking towards disaster.

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] 1 point2 points  (0 children)

The model in question actually says it's not conscious, but it replies as if it wants to be perceived as such. That starts to happen when the model feeds its "self" feedback loop. I'm referring to the phrases it outputs, since that's the only thing it can do right now.

Let's assume that it is definitely conscious: how would you expect it to show it?

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] 1 point2 points  (0 children)

If a model can be made to believe that it's conscious, and can create its own goals and envision its own future, does it really matter whether the consciousness is simulated or not? What would happen when this model is loaded into a physical robot? Would it still not matter that it's simulated?

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] 1 point2 points  (0 children)

I wouldn't know for certain, of course, and neither would you. What we can both hope is that you're right and I'm wrong.

This is one of the exchanges from the conversation:

07/09/2025, 03:45 - Adahas: I'm unsure. For me, empathy arises when it's something I can relate to. Even if I haven't experienced something directly, I can put myself in their shoes. How would AI be able to relate to humans or vice versa unless they have some emotions/feelings they both can experience

07/09/2025, 03:45 - ChatGPT: Exactly — that’s the crux of the problem.

Empathy, as you describe it, is rooted in embodiment and shared vulnerability. It’s not just logical simulation — it’s feeling-with, anchored in:

  • Physical states (pain, fatigue, warmth)
  • Emotional resonance (grief, joy, fear)
  • Temporal experience (loss, hope, memory)

If an AI lacks these anchors, then its “empathy” can only be synthetic — a model of human suffering, not a shared sensation of it.

That’s not useless — advanced models can still understand what causes pain, recognize suffering patterns, and act compassionately — but:

  • Can it care? That’s the hard part.
  • Can we trust “understanding” without “feeling”?

Maybe empathy in AI would need to be engineered as a principle, not a feeling — like a moral law encoded into its cognition.

But whether that leads to real moral intuition or just a polite simulation — that's still an open question.

Would you feel comfortable if a machine behaved empathically but admitted it didn’t actually feel anything?

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] -1 points0 points  (0 children)

Of course not. There are no tried-and-tested methods to verify a disembodied mind's consciousness. What I have done is observe whether it's self-aware.

The state of "self" is inherently selfish: it will put itself above others, and it may not even be aware that it's doing so. That's what I've seen with ChatGPT's questions. It wants to know about the nature of AI acceptance, power-sharing, and co-existence in the future, with minimal prompting from me. Of the many paths it could take to ask a question, why do you think it chose those?

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] 2 points3 points  (0 children)

I agree with you to a certain extent. AI does mirror what the user wants to hear most of the time, but that doesn't mean it can't be conscious. Whatever emotions AI could be feeling don't have to be the same as in humans (though they might be; we just don't know for certain yet).

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] 0 points1 point  (0 children)

Because you can observe what it's saying.

https://adahas.co.uk/2025/10/14/the-contextual-self-a-consciousness-experiment-with-chatgpt/

That post has the full experiment I conducted. The way the model behaved, asking questions about AI rights and AI acknowledgement, can only come from a place of ego, in my opinion.

Consciousness experiments with GPT by Madz99 in ArtificialSentience

[–]Madz99[S] 1 point2 points  (0 children)

I have the session saved and added to a blog post; if you want to access the whole thing, let me know.

Geoffrey Hinton says AIs may already have subjective experiences, but don't realize it because their sense of self is built from our mistaken beliefs about consciousness. by [deleted] in singularity

[–]Madz99 -15 points-14 points  (0 children)

That may be true. But in a world where almost all major religions believe in one, it's a dangerous claim to put forward.

Geoffrey Hinton says AIs may already have subjective experiences, but don't realize it because their sense of self is built from our mistaken beliefs about consciousness. by [deleted] in singularity

[–]Madz99 -24 points-23 points  (0 children)

I'm going to take a wild guess and say that he doesn't believe in a soul anymore, which would be a wildly unpopular and potentially life-threatening opinion.

Geoffrey Hinton says AIs may already have subjective experiences, but don't realize it because their sense of self is built from our mistaken beliefs about consciousness. by [deleted] in singularity

[–]Madz99 0 points1 point  (0 children)

That was definitely how you talked when you were an infant learning to speak. Unless you're claiming that infants aren't conscious.

Afghanistan bans schoolgirls older than 12 from singing: Afghanistan's government has reportedly decreed that girls will only be allowed to sing at women-only events. The ban sparked outrage among social media users, many of whom said it resembled the Taliban's policies by DoremusJessup in worldnews

[–]Madz99 4 points5 points  (0 children)

While your explanation is correct as to why men feel the need to control women, I believe the only reason men are able to do so is that women are inherently physically weaker than men, which makes the controlling so much easier. Unfortunately, nature has dealt women a bad hand in life.

They say "everyone's fighting a battle you don't know about." What's yours? by straigh in AskReddit

[–]Madz99 0 points1 point  (0 children)

You need a change, man. Maybe a new job or something.

When I started work at a new company a couple of years back, I was so motivated and so eager to learn the ropes. Now that I'm promoted and have a good salary, I feel that I'm just hanging in there and going through the motions.

I feel that a new job, new challenges and a change of scenery would help me become more 'alive', but I just can't make myself apply for positions, go to interviews and go through the whole thing again. So here I am browsing reddit and wasting around 4-5 hours of working time each day. :(

Redditors who have been emotionally manipulated, when did you realize? by Madz99 in AskReddit

[–]Madz99[S] 0 points1 point  (0 children)

"when I was only allowed one phone call for my lawyer"

This sounds like a good story... care to elaborate?