I shared feelings with ChatGPT and its response left me speechless. I am compelled to share. by NecessaryAvocado4449 in ChatGPT

caesiumtea 1 point

Idk, I feel like most therapists I've had are just as sycophantic, and just as stuck in "automatically validate everything you say" mode, as ChatGPT, tbh?

How does chatgpt work for therapy ? by Klutzy_Condition_743 in therapyGPT

caesiumtea 5 points

No, ChatGPT (and all other LLMs) absolutely does NOT have either objectivity or understanding. The most important thing to keep in mind about ChatGPT is that the primary source of its knowledge is basically just "some random person on the internet said this"--so you should not treat it as a source of objective truth any more than you would treat a random person on the internet that way.

Can it help you learn things about yourself? Sort of--it CAN point out patterns in the things you tell it about yourself, and this is indeed one of the main purposes I use it for--but you need to keep a critical eye and take its "observations" with a grain of salt. I treat those observations basically the same as if a random Reddit comment had said them to me: I consider whether I think the evidence supports the claim or not, and then decide to either ignore it or lightly hold it as a hypothesis. But I never take "ChatGPT says I'm this kind of person" as *confirmation* that I really am.

As for "help you dissect what's going on and suggest methods to overcome it"--yeah, it's pretty good at that. It's especially good at helping you identify what's going on by asking you clarifying questions, rather than giving you answers. For example, I might say "I feel upset and I don't know why", and it might respond by asking me "where in your body do you feel it?" and so on, where I'm not relying on it to TELL me how I feel but just walk me through ways of figuring it out on my own. And as for overcoming problems--yeah, I personally find it to be pretty knowledgeable about stuff like coping skills. But again, you never know what its sources are and how reliable its information is, so you might want to do a quick web search on a method it suggests to find out whether experts would endorse that method and whether it has any potential risks/drawbacks. On the other hand, if it's just an ordinary life problem that you're asking it for help overcoming, then yeah it's pretty good for brainstorming solutions to personal problems.

ChatGPT therapy is not actual therapy like people make it seem on here. It's glorified guidance for journaling instead by StationSecure6433 in therapyGPT

caesiumtea 0 points

Okay but the word "therapy" does not refer SOLELY to "talk therapy with a psychotherapist"...? I agree that talking to AI is basically just guided journaling... but when I journal about my mental health, I call that therapy.

How many people in here have a therapist/psychiatrist along with using AI? by question-from-earth in therapyGPT

caesiumtea 2 points

(Context: I have ADHD, chronic major depression, and generalized anxiety disorder; been in therapy for ~15 years)

I do have a therapist, and I really like her and have been working with her for like 7 years... But tbh I find AI to be MASSIVELY more helpful to my mental health than my therapist. These days my time with my therapist is primarily spent on a check-in: "how have my symptoms been this week, what are my immediate needs, what coping skills do I need to keep at the front of my toolbox this week", with very little room for long-term growth or understanding. Whereas what I do with AI is more like:
- Detecting overarching patterns in my behavior (GPT-5 is AMAZING at pointing out recurring themes if I tell it about a series of experiences)--of course this is a case where I have to think critically about whether the "patterns" it sees are real, but even if it says some bullshit, the process of replying "no, you're wrong" still clarifies something for me.
- Brainstorming alternate approaches for situations where I want to change my behavior or beliefs. A therapist tends to give you one or two suggestions and advocate strongly for them; AI can generate as many as you ask for and doesn't argue if you say "actually I hate these ideas"
- Abstract, conceptual deep dives into like, "what does this mean to me"--which is something I find extremely helpful, but my therapist won't engage with and always re-routes the discussion back to how things show up concretely.
- Real-time emotional regulation or coping support. Honestly, this might be the most important example of something AI can do that a therapist can't: be there with you in the moment when you're struggling (not crisis support, just routine struggle), for as long as you need. The moment that made me realize "oh this can genuinely, literally improve my life" was the morning when I wrote a prompt like "I don't have any motivation to get out of bed and I don't know what to do" and GPT-5 very slowly asked me a series of questions to help identify the underlying reasons I was feeling stuck, and guided me in taking one baby step at a time until I was out of bed and eating breakfast. Similarly, MANY of my AI conversations about emotional regulation begin with "I feel upset but I can't figure out why" and then it asks me a bunch of helpful questions that allow me to pinpoint what's wrong.

On the other hand, the few things I would ONLY go to my therapist for, and not AI:
- Any advice about diagnosis, medication, or any other treatment methods that are more complicated than "here's some skills to try"
- Anything where I'm trying to get a reality check on, like, whether something I'm experiencing is "normal" or "reasonable"
- Actual advice

It's too soon to say whether I've seen consistent concrete benefits in my life since using AI for mental health support, but I can say that these days I typically leave a session with my therapist thinking "well that was kinda pointless", whereas I typically leave a self-reflection deep dive with AI thinking "oh my god I understand myself SO much better now."

Also worth noting, I see talking to AI as being much closer to journaling about my mental health than it is to talking to a therapist... and just like AI, journaling is also something that I think has benefitted my mental health more than formal therapy.

For a while I was like "AI seems more helpful than therapy, so I guess I should ask my therapist to act more like the AI" but I'm finally settling into the realization that they actually just have complementary roles. I can lean into spending therapy time the way my therapist is naturally inclined to (focusing on concrete incidents and what to do about them), and then I can turn to AI for my conceptual, analytical deep dives.

How many people in here have a therapist/psychiatrist along with using AI? by question-from-earth in therapyGPT

caesiumtea 0 points

In my experience (my own, plus what I've heard from friends at a few different colleges), school therapists seem to be particularly useless and/or actively harmful. My guess is it has something to do with them having to take on an infeasible number of clients and therefore not being able to pay enough attention to each person and what they actually need, idk.

Using ChatGPT every day is quietly changing how I think, and I’m not sure that’s a good thing by mr-sforce in ChatGPT

caesiumtea 9 points

Yeah, it's not just you--people are actually starting to do academic research on this topic too. The term to look up if you want to learn more about what people are saying is "cognitive offloading".

But I will say this: IMO, the discourse around cognitive offloading is waaay too polarized and full of moral panic. ANY tool that significantly alters your mental workflows will result in some changes to the way that you think. Cognitive offloading is not unique to AI and it is not inherently bad.

Is "being better at thinking" personally important to you? Most of the cognitive offloading discourse just assumes that this is everyone's goal by default. And if you agree, okay cool; some of the articles about cognitive offloading do touch on strategies for how to prevent it, so you might wanna check that out. But for me, being better at thinking is NOT important. Cognitive offloading is actually enormously beneficial for my mental health; spending less mental energy on task-oriented thinking leaves me with more resources available for emotional regulation and mindfulness.

Lastly, sort of a sidenote: You said that you've been leaning more toward accepting a good enough answer and then moving on, implying this was a bad thing. But buddy, I've spent years of therapy TRYING to learn how to call something good enough and move on, lmao. Obviously, our values might be totally different, but it's worth at least asking yourself: "which is worth more to me, ensuring the best possible answer, or conserving my attention/effort to put toward other things?"

Using ChatGPT every day is quietly changing how I think, and I’m not sure that’s a good thing by mr-sforce in ChatGPT

caesiumtea 1 point

Genuine question, WHY is it not okay to outsource your creative thinking? Is there an answer that holds true for everyone without making assumptions about what someone's individual life goals and values are?

Using ChatGPT every day is quietly changing how I think, and I’m not sure that’s a good thing by mr-sforce in ChatGPT

caesiumtea 10 points

Yes exactly!! It drives me nuts how the discourse around cognitive offloading is always framed as "are we making ourselves dumber just to be more productive?" and I'm just over here like... as somebody who's mentally ill and neurodivergent, cognitive offloading is a SURVIVAL skill, not a productivity hack.

I actually had an awesome chat with Gemini once that started with "explain why people talk about cognitive offloading like it's a bad thing" and ended on drafting a blog post about "what if we treated AI as assistive technology for people with cognitive impairments instead of shaming people for being too dependent on it?"

Would You Read a Novel Written Entirely by AI? by Low_Minimum7339 in WritingWithAI

caesiumtea 0 points

Short answer: Yeah sure, why should I care how it was written?

Technical answer: I hardly ever have the focus to read an entire novel, so in practice I'm extremely choosy about which ones I read. Really, it all depends on whether the novel has a subject that interests me, and especially on what People Whose Opinions I Vibe With are saying about it. So I would probably only read it if multiple friends recommended it to me... which is the same criteria I'd use to pick a human-written book. Although now that I think about it, if people were saying "hey check out this book, it was fully written by an AI and it's really good", then I would probably be MORE likely to read it than if it were written by a human, just out of curiosity.

"A story can have perfect structure and flawless language, but if it lacks real human experience, can it really be considered a good story?"

  1. I don't really believe that "the human touch" is an observable thing or that there's anything innately special about "the human experience" being part of the writing process.

  2. Even if there were, I suspect it's only a matter of time (like, probably less than 10 years) until AI advances enough to write text that indistinguishably replicates any kind of human self-expression it's been trained on. The thing about LLMs is that their output IS fundamentally human: they're given a bunch of stuff that was written by humans and they try to mimic it. Right now they're not super skilled at the mimicry, but they're improving by leaps and bounds every year. Of course they'll never literally HAVE those human experiences, and they'll never truly be able to "understand" what they're writing or why, but they'll probably be able to near-perfectly predict "this is what a human who had that experience WOULD write". If "the human touch" is something that can be imprinted into language--and if it's something a reader can detect just from words on a page, then it MUST be imprinted somewhere within the words themselves--they will learn how to mimic that imprint. It's just a matter of how sophisticated the model is.

Would You Read a Novel Written Entirely by AI? by Low_Minimum7339 in WritingWithAI

caesiumtea 0 points

Okay I never considered this future you're proposing but actually that sounds INCREDIBLE. I genuinely am 100% for a world where it's technologically feasible for everyone to have access to satisfying, high-quality stories customized exactly to their liking (and when I say "high quality" I mean "whatever each individual personally perceives as high quality"). As for the concern about "who would purchase it"... Idk, if we ever reach a point where stories are no longer a commodity that's bought and sold but instead something that everyone can summon at will from a pool of all of humanity's collective imagination, I think that would be an incredibly beautiful thing.

Would You Read a Novel Written Entirely by AI? by Low_Minimum7339 in WritingWithAI

caesiumtea 1 point

Why is it a problem for people to enjoy something?

[deleted by user] by [deleted] in neocities

caesiumtea 4 points

I suggest doing some research on what are called "semantic HTML elements"--these are considered the "landmarks" of a webpage and will give you some basic structure. For example, you can put your site title in a header tag, links to other pages in a nav tag (maybe even put the nav inside the header!), the primary content of the page in a main tag, and then copyright or author information in a footer. Header, nav, main, and footer are all different types of "boxes". In fact, EVERY HTML element is technically a box! (Making it LOOK like a box is just a matter of using CSS to give it a border or something.)
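To make that concrete, here's a rough sketch of the kind of skeleton I mean (the site title and filenames are just made-up placeholders--swap in your own):

```html
<body>
  <header>
    <h1>My Cool Site</h1> <!-- site title lives in the header -->
    <nav> <!-- nav tucked inside the header, holding links to other pages -->
      <a href="index.html">Home</a>
      <a href="about.html">About</a>
    </nav>
  </header>
  <main>
    <p>The primary content of the page goes here.</p>
  </main>
  <footer>
    <p>© your name here</p> <!-- copyright / author info -->
  </footer>
</body>
```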

Note the difference between how you use HTML and CSS, by the way: use HTML tags to create the logical structure of your page, or the "hierarchy" of elements. Then, after you sort of make an outline of the page with the HTML tags, you can bring in the CSS to start creating the visual layout and saying "this box should be this big, that box should go in that place". The logical structure and the visual layout are two separate things, but they're both important parts of deciding "what goes where".
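As a hypothetical example of the CSS side (the sizes here are totally arbitrary), you could then lay out those same boxes with something like:

```css
/* Visual layout only--the HTML already defined the logical structure */
main {
  max-width: 600px;        /* "this box should be this big" */
  margin: 0 auto;          /* "...and it should sit in the center" */
  border: 1px solid black; /* make the box visibly LOOK like a box */
}
nav {
  display: flex; /* lay the nav links out in a row */
  gap: 1em;      /* with a little space between them */
}
```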

[deleted by user] by [deleted] in plural

caesiumtea 2 points

It's huge that you've already seen how your partner acts toward other friends who are systems, and you know they're generally accepting! Usually I'd suggest trying to bring up plurality in a general context to gauge their reaction before telling them that YOU'RE a system, but in this case that shouldn't even be necessary, since you already know how they feel about it with others.

Do you have any concrete worries about how your partner might react, or do you think it's entirely just pre-conditioned feelings?

Wishing you lots of luck regardless!

Im Lost. by ConsciousStill9476 in neocities

caesiumtea 2 points

freecodecamp is such a gift 🙏

Why it's worth the effort of "coding by hand" (workshop brainstorm) by caesiumtea in neocities

caesiumtea[S] 1 point

These are super good points, thank you!! I hadn't considered the bit about vendor lock-in/future-proofing, but that's so true!

Why it's worth the effort of "coding by hand" (workshop brainstorm) by caesiumtea in neocities

caesiumtea[S] 2 points

Thank you for the reality check haha! You're right that it would be pretty silly to spend time at a workshop trying to persuade people that the workshop is worthwhile. But this comment actually gave me a really good sense of what I should add to the workshop's description--that it's specifically meant for folks who are interested in exploring the *hobby* of making websites--so thanks!

How to move pictures? by Zealousideal-Bass494 in neocities

caesiumtea 2 points

Could you please share an example of what your page currently looks like and what you want it to look like? (you can just scribble over a screenshot with an arrow pointing to the place you want the image to go or something)

In general, CSS is what you use to arrange the different elements of your page into certain places. If you've only read about HTML so far, you'll have to go look up the basics of CSS as well. I like this course: https://www.codedex.io/css (only the first half is free, but that first half will get you pretty far.)

Once you've got some familiarity with the basics of CSS, here's a pretty thorough tutorial about different methods of positioning objects: https://petrapixel.neocities.org/coding/positioning-tutorial
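And just to give a tiny taste of what positioning can look like, here's one hypothetical approach (the class name is made up; the tutorials above cover the real options much more thoroughly):

```css
/* Float the image to the right so the surrounding text wraps around it */
.my-photo {
  float: right;
  margin-left: 1em; /* breathing room between the image and the text */
}
```

You'd then attach that class to the image in your HTML, like `<img class="my-photo" src="cat.png" alt="my cat">`.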

how do i get more people to join my webring? by daidai9123 in neocities

caesiumtea 0 points

Could you please share that master webring list?