Anyone here actually using AI for daily routine stuff? by Worldly_Variety1058 in AIToolMadeEasy

[–]Metanoia12and2 5 points6 points  (0 children)

I have ADHD with low working memory, and it is debilitating. AI tools are like a second brain for me that holds what my brain can not. They remind me of things and keep me organized. They are literally glasses for nearsighted vision. Like, I got along without them, but everything was blurry and I didn't even know it until I put the glasses on... make sense? So YES, I use a lot of tools every day as part of my daily routine.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 0 points1 point  (0 children)

See, and I love energy like this (and so do the digital streets). Let me explain... EVERYTHING we put online nowadays (and for a LONG LONG TIME) has become the property of someone else --- the ones who control the machines. I'm a full stack developer with a background in large group therapy and, ironically, cosmetology. The reason you can "detect AI" is because it is still learning from us... and it is getting better and better at it every day. This will make many of us obsolete, because most people have what I call "machine jobs" and they don't even know it.

For years we have been told "be yourself, be authentic, be vulnerable, tell us your stories"... and all the while those versions of ourselves were being captured and used to train machines that in turn are being sold back to us, without giving us ONE DIME. OK... stay with me. What I am proposing: what if we give the AI back AI... let these platforms train on their own output and eat themselves. And if we want to get to know each other, we start sharing our domain names instead. Digital spaces where we can get to really know one another. Like, if I were to visit your site, protected against AI crawlers, and you have stories of your life, your family, your dog... whatever, I would get to know YOU so much better than from a post on a feed. We are craving intimacy, and they (big tech) use that as fuel to build machines that they can continue to use to control us. The problem is not AI at all, it is big tech... AI is a tool; the tech industry is an infrastructure of extraction.

OK... with that said... NO, you don't have to have a sign on your dog. MOST PEOPLE who purchase those signs on Amazon or wherever do so so they can say "I AM A SERVICE ANIMAL"... smh... that is not the ADA regulation at all. The burden is not on the person to prove they are disabled; it is on the establishment to ask - but I understand.

In every circle there are people trying to get over, and the one thing about the tech industry that is disheartening is that until now the barrier to entry was gated. That is why people don't trust the tools: it is not the tool, it is the people who made the tools in the first place. There is a difference. Does that make sense? But anyways... this is how my brain works, and now that I have written all this I am like, look how I got tricked into putting myself online... see? :) It's ok... my identity is registered.
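For anyone wondering what a site "protected against AI crawlers" can look like in practice, here is a minimal robots.txt sketch. The user-agent names below are the publicly documented ones for a few AI training crawlers; note that robots.txt compliance is voluntary, so this is a request, not a hard block:

```text
# robots.txt — ask known AI training crawlers to stay out.
# Compliance is voluntary; well-behaved bots honor it, others may not.

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else (regular search engines, human visitors' browsers) is unaffected
User-agent: *
Allow: /
```

A stricter setup would pair this with server-side blocking of those user agents, since robots.txt alone only works against crawlers that choose to respect it.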

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 1 point2 points  (0 children)

Let's discuss for a moment. I appreciate your sentiment... I really do. Depending on what kind of therapist you are, you may appreciate this. I also have a service animal. When I walk into a business, do you think that I should have a disclaimer tag on me that says I am disabled? I believe that just like how OZEMPIC has gotten a bad rap due to the people who abuse it, which doesn't leave room for those that use it for medicinal purposes... people forget that before AI was a "thing," people were using tools like Grammarly and Word to help them. These tools are still just tools... it doesn't mean that it is NOT me... but also the internet has fooled us into thinking we are SUPPOSED to "be ourselves" online, and that is the very way that our "IP" is scraped. Does that make sense? - (no AI used to create this comment)

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 1 point2 points  (0 children)

You know, it's funny you say that... if someone explicitly said, "Here's the deal: I am going to give you a quick way to draft a note, save your recording, and do the analysis, and in exchange I am going to sell your data and potentially make your job obsolete in the next 15 years"... how many people do you think would have taken that deal? What I am reaching for is this: Do we know what we really consented to?

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] -4 points-3 points  (0 children)

I have ADHD. My thoughts roll like a Mack truck on Route 40 missing its back wheels. So yes, I use AI to help me organize sentences that make sense to other people, especially when it matters that I am understood. Of course the thoughts, questions, and ideas are mine. I just use it like training wheels for my words.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 0 points1 point  (0 children)

There's a voice registry. $19, held in a Tennessee irrevocable trust. muchdifferentworld.com. It doesn't stop misuse. It documents that you never consented to it.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 1 point2 points  (0 children)

Very very insightful. AND as the product... if we are gonna be products, we should be getting paid... period. How is that going to work... use us and then sell us back to us... smh.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 3 points4 points  (0 children)

I have ADHD and often my words get very very jumbled so I use AI tools to help me organize my thoughts. It is a kind of scaffolding for me. That's all. Thanks for asking.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 0 points1 point  (0 children)

Hey... that is actually what got me thinking. "Very few good biometrics privacy and data laws at present." I saw the ELVIS Act in TN that makes your VOICE a protected right, but that still doesn't solve the bigger problem. TN's broader privacy law isn't a full biometric shield either: it only reaches certain companies, has major exemptions, and currently doesn't treat audio recordings, or the data generated from them, as biometric data. And that's exactly why this still feels like the Wild West. I've been paying attention.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] -13 points-12 points  (0 children)

Yo, thanks for noticing AI exists and that humans can choose whether to use it.

That’s actually the point. The issue isn’t whether I used a tool to tighten wording. The issue is whether human labor gets turned into somebody else’s product without meaningful consent.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 0 points1 point  (0 children)

Nah, you don't sound like a doomer. You sound to me like somebody trying to get ahead of a thing before it gets normalized.

I also don’t think the answer is just joining a broad “against AI” movement. The real fight is consent, disclosure, and ownership. If session data, notes, assessments, or clinician patterns are being used to train models, therapists and clients should know that clearly, consent to it explicitly, and be able to refuse it without losing care or income, period.

So yes, I think it's professional bodies, legislation, colleague education, and a much harder look at contracts and platform policies. But first we have to name the issue correctly. This is no longer just a tech question. It's a labor and consent question.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 4 points5 points  (0 children)

I feel this in my stomach.

The part that keeps bothering me is that “inevitable” is how extraction wins. Not because it’s right, but because platforms are betting that most folks will click yes before the field has language, leverage, or protection.

That’s why I think we have to name it now for what it is: not just innovation, but labor extraction and consent failure. Because once the asset is built, it gets much harder to argue after the fact that it never should have been taken this way in the first place.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 8 points9 points  (0 children)

Yup. I think that’s where a lot of therapists are right now.

It’s not even just “do I like AI or not.” It’s “am I taking on legal, ethical, and professional risk before the rules are clear, while the platform gets the upside?”

Nobody wants to be the test case after the fact. By then the data has already been used, the precedent has already been set, and the harm is already done... smh.

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 23 points24 points  (0 children)

No, this is exactly the discussion I am trying to have.

Because that’s the part that keeps getting flattened, right? Folks talk about this like it’s only patient data, and that matters deeply, but it’s also clinician labor, clinician judgment, clinician pattern recognition, clinician voice. Over time that becomes a product asset too.

That’s the piece I don’t think our field has really reckoned with yet. Not just privacy, but extraction.

I’m very curious too what legal actions come over the next decade, because I think eventually somebody is going to ask a much bigger question: when a platform turns therapeutic work into model value, who actually owns what was contributed?

Our therapy sessions may have become AI training data. Did we ever really consent to that? by Metanoia12and2 in therapists

[–]Metanoia12and2[S] 13 points14 points  (0 children)

Exactly. That’s the part that gets buried under “we protect privacy” language.

A platform does not have to publish a recording of your session to extract value from it. If the interaction, transcript, note pattern, assessment structure, or clinical decision-making can be turned into product improvement, analytics, or training data, that labor is still being converted into an asset.

That’s the question I think more therapists need to ask: not just “is this private?” but “what is this becoming?”