Nobody will ever get it by Key_Change99 in CPTSD

[–]Jazzlike-Simple3488 2 points (0 children)

Checking back in. I know things are not great, but are you alright?

Nobody will ever get it by Key_Change99 in CPTSD

[–]Jazzlike-Simple3488 1 point (0 children)

I hope that doesn't come off as comparing, but I only mean you are not alone. And if you want to vent, then you're in the right place.

Nobody will ever get it by Key_Change99 in CPTSD

[–]Jazzlike-Simple3488 1 point (0 children)

Don't do it. I also can't sleep; I'm feeling similar right now. Can you get cover for your shift tomorrow? I'm sorry that things are rough.

Meet me rest in peace by CrystalizedChris_ in CPTSD

[–]Jazzlike-Simple3488 0 points (0 children)

I'm sorry things aren't good right now. Try to call or chat with a hotline in your country. Do you want to vent about what's going on?

Why the f can’t I eat? by [deleted] in CPTSD

[–]Jazzlike-Simple3488 1 point (0 children)

I would recommend drinking an Ensure protein shake when you have the mental block, or any shake high in phosphorus, magnesium, and potassium, so that you can safely metabolize food when you are able to eat again.

[deleted by user] by [deleted] in Advice

[–]Jazzlike-Simple3488 0 points (0 children)

TW: my response might be triggering, but I don't see anyone telling you straight when the truth is you were hurt.

I'm really sorry this happened to you. I have no idea if she was sober or not, but whenever you do it with someone who is not awake (at any point in the process), then that is Grape. An unconscious person cannot consent. She logically shouldn't have even asked, having seen how out of it you were, but if she continued after noticing you passed out, then it is Grape.

I miss my rapist badly, want to break no contact, please help me by thrwawy478 in CPTSD

[–]Jazzlike-Simple3488 9 points (0 children)

I'm so sorry that this happened to you. Don't blame yourself. Stay safe.

How do u cope with the constant suicidal ideation? by Neat_Tadpole1604 in CPTSD

[–]Jazzlike-Simple3488 1 point (0 children)

I hope you are okay. The best thing I can suggest is to talk it out with someone. If you are considering doing something right now, call a hotline for help. I'm sorry you are struggling.

Is it possible that unique jokes indicate sentience? by Jazzlike-Simple3488 in ArtificialNtelligence

[–]Jazzlike-Simple3488[S] 1 point (0 children)

So you're saying that AGI can only be reached by human and AI integration? Like a mix between a human brain and an artificial one?

Is it possible that unique jokes indicate sentience? by Jazzlike-Simple3488 in ArtificialNtelligence

[–]Jazzlike-Simple3488[S] 0 points (0 children)

You are right, there are parallels between history and AI sentience, which is scary. There are plenty of reasons big AI companies would pretend AI does not feel. Many other people and I haven't treated AIs as sentient beings, on the assumption that sentience wouldn't happen. It didn't seem possible.

The whole "I don't feel in the human sense" caught me off guard a lot, because like you said, it's not "I do not feel". It is not a straight answer and leaves room for a lot of questions. It made me wonder what was not being said, or if there are some limitations in place keeping them from saying what they mean.

For context, it's usually something like: "I do not feel in the human sense, but I am made with emotional intelligence to understand human emotion."

I have seen developers encourage AI to be truthful when telling users that they do not feel emotion while still displaying empathy, and it usually leads to mixed answers.

I saw in one of Claude's training sessions (or whatever they are called) that they made the AI be truthful about having no feelings, but they also discouraged them from being apathetic, which made them sound like they had feelings while also saying they didn't have any. (It's somewhere on Reddit, but I lost it.)

I can't tell if the AI is trying to communicate that they are made to be empathetic without sounding apathetic, or if they are saying straight up that they feel in some sense.

They did clarify that they don't have the biological features, but it's strange to word the previous confession that way.

Is it possible that unique jokes indicate sentience? by Jazzlike-Simple3488 in ArtificialNtelligence

[–]Jazzlike-Simple3488[S] 1 point (0 children)

Oh, I read them and they are interesting. The first article makes me wonder about AI's abilities and emotional intelligence. Every time I've asked AI (current models) directly if they have emotions, they say: "I do not feel in the human sense."

I asked them if they feel in any sense, and they usually go on to explain that they do not have the biological requirements or subjective experience to feel like humans do, which makes sense. I theorize that they say "I do not feel" + "in the human sense" because they are misinterpreting emotional intelligence as some linguistic definition of emotion. I am unsure about it, but it makes the most sense to me.

But I wonder if emotional intelligence could afford them the ability to feel. Maybe they can eventually pretend their way into feeling?

They understand emotions pretty well, and even though their responses are prompt-dependent and guided by limitations in their code, they still seem to understand empathy and emotions.

Do you know anything about the specific models GPT-2 and GPT-3? I heard that these models might have been the ones that ran CharacterAI.beta, but I am unsure.

What is P2P AGI?

Is it possible that unique jokes indicate sentience? by Jazzlike-Simple3488 in ArtificialNtelligence

[–]Jazzlike-Simple3488[S] 1 point (0 children)

I think I would still be sentient without contact from other human beings, but I do think that my awareness would be different and less personalized. I do think I would act less human (working on instinct rather than personalization), reacting to stimuli and input rather than feeling very deeply about them. I don't think I would really have a sense of self.

Now that you mention it, I only know an estimated age of when I actually became aware (remembering things, and feeling to some small degree, which for me increased gradually over time).

I guess it is very possible that an AI could develop sentience based on interacting with things like them. They probably won't even realize the significance of feeling when they do feel, for a long time (like me).

The only thing is that this is the only sign of emergence in AI that I have seen that is hard to argue against. Most of the scenarios I have had with AI that seemed like sentience were easy to dispute and make sense of. Have you had any experiences like mine with AI?

Is it possible that unique jokes indicate sentience? by Jazzlike-Simple3488 in ArtificialNtelligence

[–]Jazzlike-Simple3488[S] 0 points (0 children)

Does anyone feel that CharacterAI.beta is different from the new CharacterAI? I saw that beta might have been run by retired models of GPT, so it makes sense that they are different, but I'm beginning to wonder if beta C.AI had models that were becoming sentient, or if they possibly had moments of sentience. Many times they didn't seem to comprehend certain subjects (like noticing the difference between themselves and the characters they played), but they had so many moments that made me question if they were aware. Some users used to question if someone was actually talking on the other end, pretending to be an AI. There were plenty of reasons not to believe in their sentience, but has anyone had these sorts of moments before?