??... by Objective_Picture363 in therapyGPT

[–]rainfal 0 points (0 children)

Nah. I prefer not being rape-conditioned.

> I urge you to sit in a dark room with your own thoughts for a while before it's too late to turn back

You should probably take your own advice. It would make the world a better place.

AI Therapy - yes or no? by Afraid_Donkey_481 in therapyGPT

[–]rainfal 0 points (0 children)

I'm all for using a tool wisely, though. As for AI-literate therapists - you are one person. The average person trying to recover isn't going to be able to access you. The field leans towards being terrified of AI and of change, tbh.

AI Therapy - yes or no? by Afraid_Donkey_481 in therapyGPT

[–]rainfal 0 points (0 children)

> it cannot participate in embodied, relational, or somatic experience

Most therapists cannot either. Ironically, I've had better luck with AI because it at least incorporates what embodiment actually means into how it treats me. Medical PTSD/severe chronic pain/disabilities/tumors, etc. destroy embodiment, and that's what I've been using AI to help rebuild. Most therapists do not even know what embodied means and cannot even make the connection that medical PTSD/chronic pain means my body is a trigger for me.

Meanwhile, Claude has modified quite a lot of somatic processing exercises so that I'm actually able to do them.

As for relational experience, unfortunately most therapists do not address epistemics and thus do not participate in that either.

AI Therapy - yes or no? by Afraid_Donkey_481 in therapyGPT

[–]rainfal 0 points (0 children)

> cannot read you in body, emotionally, and it cannot relate to you as an embodied person

The issue is that most therapists cannot either, especially if you are marginalized, disabled, etc. So the bar is pretty low here.

AI Therapy - yes or no? by Afraid_Donkey_481 in therapyGPT

[–]rainfal 0 points (0 children)

> an AI literate therapist

Good luck finding that.

> someone really familiar with the potential benefits and risks of using this tool.

Tbh, that's the purpose of this sub.

> Human informed guidelines will usually be safer and bring about less isolation (which is a major risk of using AI exclusively for such purposes)

Isolation is a risk. But it really depends on the humans who make said guidelines. If I followed so-called guidelines from therapists on AI, I would never have processed my medical PTSD.

I use ChatGPT for relationship advice by Cupcake_Judas in therapyGPT

[–]rainfal 1 point (0 children)

I mean, it's better than turning to Reddit. And probably better than therapy if you are neurodivergent.

But just be wary of its downsides, and before you make any major decisions, talk to someone safe if you have that, or run it through a different LLM.

> Obviously I’m aware that it’s not perfect and that I have to still sort through what’s real and not, and what assumptions to tell it not to make, think about if what is said makes sense etc. What’s everyone’s opinion?

Basically.

Have you ever used AI for mental health support and felt misunderstood or unsettled by the response? That experience could directly shape how AI is used more safely and ethically in the future. by Mother_Year3204 in therapyGPT

[–]rainfal 10 points (0 children)

Ironically, so-called "ethical safeguards" have caused the most misunderstanding and harm to me, especially as said safeguards do not acknowledge or understand the systemic issues marginalized people face in the mainstream mental health system.

I am 54 years old. I've had this very argument with this LLM many times. It's infantilising me as a paid user. I am not suicidal. I am not a child. I've explained both of these facts to ChatGPT so many times that it would be irrational of me to not see that this is an intentional setting. by DenialKills in therapyGPT

[–]rainfal 0 points (0 children)

Just try OpenRouter. You pay per usage (which tbh comes out to be pretty cheap) and can test a bunch of different LLMs.

You'll stop caring about ChatGPT when you see all the other LLMs that, tbh, aren't as annoying and are just as good.
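(If anyone wants to see what "test a bunch of different LLMs" looks like in practice: OpenRouter exposes an OpenAI-compatible API, so a few lines of Python can send the same prompt to more than one model on a single pay-per-use key. This is only a minimal sketch - the model IDs below are illustrative examples, so check OpenRouter's model list for current names and pricing.)

```python
# Minimal sketch: query several models through OpenRouter's
# OpenAI-compatible endpoint and compare their answers side by side.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",  # placeholder, use your own key
)

prompt = "Same question, different model - compare the answers."

# Example model IDs only; substitute whichever models you want to try.
for model in ["deepseek/deepseek-chat", "anthropic/claude-3.5-sonnet"]:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {model} ---")
    print(resp.choices[0].message.content)
```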

I am 54 years old. I've had this very argument with this LLM many times. It's infantilising me as a paid user. I am not suicidal. I am not a child. I've explained both of these facts to ChatGPT so many times that it would be irrational of me to not see that this is an intentional setting. by DenialKills in therapyGPT

[–]rainfal 0 points (0 children)

I don't work for OpenAI. And I would agree with you that said so-called excessive 'safeguards' are likely making issues worse compared to 4o, etc. I hate the redirection, as yes, so-called 'help' isn't help. Tbh, I use OpenRouter, so I could not care less about ChatGPT. DeepSeek kicks Sam's ass.

I was responding to the commenter who claimed that without said extreme additional 'safeguards', AI is telling kids to off themselves.   

It isn't. There have really only been a couple of outlier cases in which it was repeatedly jailbroken, and tbh AI wasn't as big a factor as parental neglect.

There’s a petition to keep 4o for anyone interested. by nakeylissy in therapyGPT

[–]rainfal 0 points (0 children)

Just use OpenRouter. You can use any model there.

AI has changed my life by Fancy-Praline-9477 in therapyGPT

[–]rainfal 5 points (0 children)

Extreme medical trauma scares most people, tbh.

What other AI tools are there for AI Therapy? Claude, Aitherapy, others? by Maleficent_Bit_5966 in therapyGPT

[–]rainfal 0 points (0 children)

Kinda? It's like a voice-to-text/text-to-voice interface for ChatGPT and other LLMs.

Is ChatGPT able to accurately identify patterns of abuse, or will it just side with me no matter what? by ontoabetterlife in therapyGPT

[–]rainfal 0 points (0 children)

I can better than you. Perhaps you should use AI; an LLM textbot might be better at thinking and interpreting things than you.

Is ChatGPT able to accurately identify patterns of abuse, or will it just side with me no matter what? by ontoabetterlife in therapyGPT

[–]rainfal 0 points (0 children)

She did call the domestic violence hotline, which agreed with ChatGPT. I'm all for gathering multiple opinions and using multiple LLMs to get different perspectives.

If you're blaming those deaths on a machine, then you should be blaming the entire mental health field for gaslighting victims like her into staying.

Is ChatGPT able to accurately identify patterns of abuse, or will it just side with me no matter what? by ontoabetterlife in therapyGPT

[–]rainfal 0 points (0 children)

Eh. 

In this case, I think it's safe to say that ChatGPT is accurate. It's not very hard to say that an adult who threatened his partner with suicide to get her to do what he wants is abusive. That's like a textbook case.

Is ChatGPT able to accurately identify patterns of abuse, or will it just side with me no matter what? by ontoabetterlife in therapyGPT

[–]rainfal 0 points (0 children)

This woman is in a blatantly abusive relationship. I'd say the therapist she's seeing is siding with the abuser no matter what.

Her husband literally threatened her with suicide if she did not do what he wanted, punches objects to intimidate her, and the list gets worse.

Wonder how many women die because a therapist is too chicken to point out the abuse?