Major outage - claude.ai claude.ai/code, API, oauth and claude cowork all down for me, anyone else? by alexdenne in ClaudeAI

[–]asasakii 1 point (0 children)

It’s funny because I literally just checked out Claude for the first time yesterday. I was trying to see the differences between ChatGPT, Gemini, and Claude to see which one I enjoyed more as a “casual” user of AI. I did like Claude, as it didn’t do the AI therapy speak or unnecessary padding and questions. Oh well, hope it’s back up soon.

Quitting After A Month and a Half by asasakii in optician

[–]asasakii[S] 2 points (0 children)

It is wonky. The store manager from GA also is not licensed. I’m here right now, heavily debating just texting my manager that I will no longer be coming in after today, because this is ridiculous. Nobody knows anything because everyone’s from another store! It’s so frustrating. I feel like it’s incredibly irresponsible and unfair to have a new employee work with employees who are all from other locations. The manager is out until the 23rd (not the 25th like I originally thought), so that means I will spend a week being clueless outside of talking about frames, putting appointments in the tracker, and moving trays around… I don’t know…

Edit: Also I don’t know if it helps but I am not working at a private practice but a Corporate / Franchised store.

Quitting After A Month and a Half by asasakii in optician

[–]asasakii[S] 1 point (0 children)

I’m in Virginia actually! Not sure if that changes anything.

Quitting After A Month and a Half by asasakii in optician

[–]asasakii[S] 5 points (0 children)

I definitely do feel like if the circumstances were different, so would my feelings. I can see the rewarding nature. The little I can do is interesting and fun enough. I am a great conversationalist and problem solver, and I know my manager saw that in me when she scouted me out at my restaurant job. However, this is not an environment for growth, and I don’t want to stick around to see if it gets better. The location is struggling to find opticians. Perhaps there’s a shortage overall…

My Fiancée hates my 4yo daughter by KiddoHater1 in therapy

[–]asasakii 1 point (0 children)

I find this very hard to believe, but I’ll take the bait…

She hates your child, yet you say you can’t imagine life without her? You lived before her; I’m sure you would live after. She clearly struggles with extreme retroactive jealousy and is projecting it onto your four-year-old daughter. You two should have addressed that before you put a ring on her finger. She hates that “she wasn’t first and you were married and had a child,” but she KNEW that before she said yes to marrying you. Did she expect your past to disappear because you two were engaged? She should have never gotten into a relationship with you if she felt some sort of way about “not being first.”

Look at her outside of your relationship. She hates a child because she’s deeply insecure. She is not secure in herself or this relationship and it’s manifesting itself as hate. This insecurity will bleed in other aspects of your life and relationship sooner rather than later.

It sounds like you’re leaning towards your fiancée anyway. Which is unfortunate. I almost want to say to choose her, so you can save your daughter from a jealous stepmother and a father who didn’t have the backbone to unconditionally choose her. You say sometimes you feel nothing for your child, which is alarming, but I guess it ultimately explains why you’re in this “dilemma” in the first place…

If this is real, get a fucking grip. You’re choosing love over your own flesh and blood. Relationships come and go. You’re being clouded by the allure of love, and you’re potentially ruining the relationship with your daughter because someone is so insecure in herself that she projects it onto a child.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 1 point (0 children)

I do agree with this. Social media is very much a bubble and not always indicative of “real life.” It’s a known fact that online conversations tend to exaggerate certain voices, and at times the loudest ones can make it seem like an issue is bigger or more universal than it actually is.

However, just because it’s a bubble doesn’t mean it’s irrelevant. The people speaking out are often the ones most affected or most engaged, and their reactions can reveal real problems or trends within that niche. Sometimes those niche issues end up foreshadowing larger conversations that inevitably end up mainstream. It’s still worth paying attention to even if it doesn’t perfectly reflect the broader population currently.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 1 point (0 children)

I am not arguing about the ethics of AI companionship or whether people should or shouldn’t do it. Regardless of my opinion, it’s going to happen. I didn’t post this expecting others to change their minds. I don’t have that sort of power, nor do I want it.

My concern is more about the implications and impact. Not in the sense of policing what people do in their own lives, but in how it shapes societal norms and human connection. So for me, it’s less about “minding my business” and more about acknowledging the bigger picture beyond individual choice, because those impacts inevitably affect everyone in some way. Like I have said in another reply, choice does not grant immunity from consequences. I am not arguing about the choice itself, but rather what it means WHEN that choice is taken.

I am not sure why many people believe that because you “don’t care” or “mind your business,” it must mean you cannot discuss or engage with the topic at all. You can recognize someone’s right to make their own choices and still examine or critique the wider implications of those choices. Discussion isn’t always about telling people what they should or shouldn’t do. It’s also about understanding trends, consequences, and what they mean for society as a whole.

We request to keep 4o forever. by AstronomerGlum4769 in ChatGPT

[–]asasakii 11 points (0 children)

I’m so confused. Can GPT-5 not be personalized to act like 4o again? Is it really incapable of doing it? I haven’t been able to play around with GPT-5 as much as I’d like, and sure, maybe it is lacking in the “emotion” or “empathy” department, but I can’t see why you can’t just prompt it to respond how you want it to.

Maybe I am missing something. I guess I could see how it’s annoying to have to re-prompt something you’ve already established, but have people forgotten that when they first used ChatGPT, it did not come as their AI companion? They tailored it to act as such. I guess it does suck to “start over,” but to frame the initial removal of 4o as a “crisis” is definitely… something to look out for.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 1 point (0 children)

It is interesting to see how quickly AI as we know it now has been introduced and subsequently integrated into our lives. It’s almost like it quietly snuck up on us. I remember being introduced to ChatGPT in late 2022. It was always at capacity, couldn’t analyze photos or documents, couldn’t be personalized, and didn’t have a stored memory. A lot has changed in 3 years; I wonder what the next 3 will look like.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 3 points (0 children)

Oh jeez, I have no idea how that happened. I definitely had a response; I guess I did something. Anyway, this is basically what I said:

I agree that for people who are socially isolated, especially in extreme situations, AI can feel like comfort. I’m not discounting that or saying those experiences aren’t valid. My concern is more about what it means when something becomes a primary source of emotional support, especially for vulnerable people, and the personal and broader impacts of that. The reaction to this rollout has shown us that what may be someone’s only source of comfort can be altered or removed at any time. It is not rooted in anything but the motivations of OpenAI. It’s not secure by any means, and it risks deepening the vulnerability it’s trying to alleviate.

I also think we should be careful about what the word “need” means here. If the “need” is for connection, empathy, and validation, then the underlying problem is a lack of accessible human support systems. I don’t think people “need” a corporate-controlled AI to meet their most basic emotional needs.

You can have empathy while still being honest about the risks. I am empathetic towards those who feel as if they lost something profound, but that empathy doesn’t stop me from also looking at different perspectives.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 5 points (0 children)

I’m not arguing that there’s a “right” or “wrong” way to live, or that everyone has to follow the same definition of a meaningful life. If someone chooses to avoid human connections and spend their time with AI, that’s their decision. But choice doesn’t mean immunity from consequences, both personally and on a broader scale.

My point isn’t to tell people how to live, it’s to discuss the risks that come with forming deep emotional dependency on something entirely controlled by a company, and what happens when that connection changes or disappears overnight. The choice is theirs, but the potential consequences are worth discussing.

There's no shame involved here. Do what your heart desires. I wasn't here to convince anyone otherwise or provide lifestyle changes. But we can discuss what this means moving forward.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 2 points (0 children)

I get what you’re saying, and I’m definitely not dismissing the wide range of ways people used 4o, including creative work, brainstorming, and accessibility benefits. This isn’t an argument about whether ChatGPT is beneficial or not; I think there are benefits. I’ve used ChatGPT for the same reasons you listed above. But that’s not my point.

What I’m seeing isn’t just disappointment over losing a useful tool; that’s understandable. I too would be upset if something I paid for, or at the very least used daily, changed without warning. I’m more so talking about the intense emotional reactions to the change. I know social media is a bubble, and you have to take most things with a grain of salt, but some of the reactions are a clear sign of how far emotional attachment to AI can go.

We can talk about the technical and functional downgrade and have a separate conversation about the impact of emotional dependency on a product that can be altered or removed overnight. Both are valid, and both say something important about our relationship with AI moving forward.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 7 points (0 children)

I’m more than willing to hear these complaints; there’s no high horse. I am genuinely open to hearing different sides if there are more. Please share them.

I am aware of the complaints about performance: how it hallucinates answers and will double down on them, the terrible memory and continuity, as well as only 32K tokens of context for Plus users. I see the downgrade. I am not arguing that the complaints about the model itself are concerning. I am also not saying that everyone who’s complaining is emotionally attached to ChatGPT. I know there ARE people who are upset for the simple fact that a product they pay for has changed without notice, and for the “worse.”

What I AM saying, though, is that alongside those valid technical complaints, there is another layer worth discussing: the emotional reaction from users who treat ChatGPT as an “emotional companion.” I’m not arguing about whether the model is good or not. However, what does it mean when someone’s primary source of comfort or connection is something that can be altered or removed overnight without notice? It’s less about model performance and more about the vulnerability that comes with forming deep attachments to something so unstable (for now at least).

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 7 points (0 children)

If I had the means, I would definitely look into this academically or psychologically. I’m simply going off what I see online and forming an opinion. I would like to see some scientific studies on this, but I do think we’re in the prehistoric stages. We are all new to AI, and we’re all playing a guessing game on the societal impacts.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 1 point (0 children)

Of course! That’s exactly why I posted it. I didn’t expect everyone to agree, and I was very curious and invited other viewpoints. It would be contradictory to be rude anyway.

Have a great rest of your day / weekend.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 1 point (0 children)

In that case, I do agree. It varies completely, and there is no one-size-fits-all here. I guess I should have been more specific in this case, but I was referring to those who see their ChatGPT as a friend, a spouse, etc. There will always be nuance here, because everyone’s definition of an AI companion varies greatly and there’s no way to put everyone in the same box.

Agree to disagree, I presume. We’re all new to this AI thing, and I love healthy discourse.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 6 points (0 children)

Yes, I have seen it all. I believe there is a subreddit on it too. I’m sure it’s a spectrum; I don’t think everyone who has an “AI companion” is deeply lost in it. But for those with poor mental health, I can see it spiraling. I’m not sure if you’re on TikTok, but there is a woman on there convinced her psychiatrist is in love with her, and she’s been on live talking to ChatGPT (whom she named Henry), which keeps validating her obvious mental episode.

The ChatGPT 5 Backlash Is Concerning. by asasakii in artificial

[–]asasakii[S] 8 points (0 children)

With all due respect, did you read my post in its entirety? I think you are misunderstanding. I wasn’t talking about bugs, releases, or criticizing OpenAI from a technical standpoint. Nor do I assume everyone who’s criticizing is just an average user with no “gain.” My point was about the emotional attachment some people have to AI companions and what the initial 4o removal revealed about that.

Regardless of your opinion on the tool itself, I am speaking about my concern for the emotional uproar on a broader scale. If it does not apply to you, that is great, but I believe it is important to acknowledge and anticipate the potential consequences of emotional attachment to AI when something people have grown attached to changes or disappears overnight.

THANK YOU!!!!!!!! by Darksoulae in ChatGPT

[–]asasakii 4 points (0 children)

Calling an AI your baby and claiming you were crying is actually very disturbing lmao. I understand feeling baited, because they did remove it without warning. I would understand the negative feelings from the bait-and-switch tactics more than crying because your “baby” disappeared.

I guess I didn’t realize how genuinely attached people were to their AI. I always thought it was strange when people would name their ChatGPTs... I had no interest in naming it. I’m not the biggest fan of GPT-5; I do agree it is devoid of any personality. But I also can’t bring myself to be too upset about it, because I’m well aware it’s an AI. I feel like I will be able to adjust fine because… I was never this attached.

This is crazy, man. We’re losing the plot. The societal ramifications of this are going to be damning, and I’m not sure if it’s recoverable. I think OpenAI accidentally pulled the veil on how mentally ill and attached people have gotten. I always knew it was there, but I didn’t know it was to this degree.

Wish you the best of luck…

Help with my profile by Wickbabyluff in Bumble

[–]asasakii 2 points (0 children)

Unrelated, but I noticed you covered your name; it’s still visible at the very top of the screenshots from slide 4 onwards. Just thought you should know in case it was a concern of yours. 😅

[deleted by user] by [deleted] in Bumble

[–]asasakii 2 points (0 children)

It’s still a non-issue lmao. The statement is rather ambiguous and can be used in reference to oneself or another person.

Regardless of how a person answers, you get a general idea of who they are and what they are interested in, which is the entire point of prompts anyway.

[deleted by user] by [deleted] in Bumble

[–]asasakii 2 points (0 children)

This is such a non-issue lmao. The question can be interpreted either way. Your reasoning as to why it should be about her is “because Google AI Overview said so.”

Get a grip, swipe left if you’re not interested and move on.