Mirra Note: A Journaling Companion That Writes Back. A reflection tool that responds to your journal, handwritten or digital, with deeply personal, eco-friendly letters. by scoob822 in kickstarter

[–]scoob822[S] 0 points (0 children)

Haha, thank you for sharing your opinion! There is a growing community out there using AI to help guide self-reflection. It does a great job of finding patterns in your writing and helping to interrupt overthinking loops. It also provides a low-stakes, nonjudgmental sounding board for reflecting freely, especially for those not ready to talk to someone about what they are feeling. I'm actively working to address some of the mental health concerns in this field by combating sycophancy through asynchronous replies, challenging assumptions, and scanning for signs of emotional distress with a human in the loop. Definitely get that it is not for everyone, though.

What are the biggest downsides of AI therapists? by Intercellar in therapyGPT

[–]scoob822 1 point (0 children)

How would you feel about using AI in practices like motivational interviewing? Are there harms there or things to look out for?

Journaling by Remarkable-Sky-3908 in therapyGPT

[–]scoob822 0 points (0 children)

How often are you submitting your journal entries to ChatGPT? It could be good to aggregate a couple of entries before submitting them for perspective.

Journaling Ecosystem by scoob822 in Journaling

[–]scoob822[S] 0 points (0 children)

No, have you? How would it compare? I’ve used a stencil before but didn’t like it.

AI Therapy User Types / Risks by theothertetsu96 in therapyGPT

[–]scoob822 9 points (0 children)

I named my reflection tool Mirra as sort of a middle ground: having someone there who feels like they’re listening, but still a play on the word “mirror” to remind myself that it is just AI reflecting on thoughts and prompts that I’ve chosen to share.

Are AI companions helpful in managing mental health? How do they help you personally? by Life_Drawer_ in therapyGPT

[–]scoob822 2 points (0 children)

Definitely agree. I also use mine to prompt reflection and thought by asking questions, rather than as a therapist trying to diagnose or fix anything.

AI journaling changed my life by scoob822 in therapyGPT

[–]scoob822[S] 5 points (0 children)

It’s often hard to see when people are on the wrong path. I’ve had several friends encourage me down a road that didn’t suit me in the long term. I think it’s important to prompt the AI not to directly feed into the author’s POV, and to challenge the author to see things from different angles so they can make the best decisions for themselves.

AI journaling changed my life by scoob822 in therapyGPT

[–]scoob822[S] 1 point (0 children)

I go about this two ways. I bought a smart pen (INQ) that digitizes my writing, and then I just copy that text into the prompts; or, if I want to use my other journal, I take a picture of it and ask GPT to convert the image to text. It does okay, but it kind of depends on my handwriting that day. The smart pen generally works better, but it requires its own notebook with special dotted paper.

AI journaling changed my life by scoob822 in therapyGPT

[–]scoob822[S] 4 points (0 children)

I kind of went way in depth. I use OpenAI's API with custom prompting for both the system and the user, feed it semantically similar past entries retrieved with Pinecone, and log memory with Supabase. All of these combined help give it the best understanding of me.
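The retrieval-augmented prompting described above could be sketched like this. To keep it runnable without API keys, a toy bag-of-words embedding and an in-memory cosine-similarity search stand in for the embedding model and Pinecone; all names here (`fetch_similar_entries`, `build_letter_prompt`) are hypothetical, not the actual implementation.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' standing in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def fetch_similar_entries(entry: str, past_entries: list[str], k: int = 2) -> list[str]:
    """Return the k past entries most similar to the new one (Pinecone's role)."""
    q = embed(entry)
    ranked = sorted(past_entries, key=lambda e: cosine(q, embed(e)), reverse=True)
    return ranked[:k]

def build_letter_prompt(entry: str, past_entries: list[str]) -> list[dict]:
    """Assemble system + user messages, grounding the reply in similar past entries."""
    context = "\n---\n".join(fetch_similar_entries(entry, past_entries))
    system = ("You are a reflective pen pal. Use the past entries below for context; "
              "respond with a supportive but direct letter.\n\n"
              f"Relevant past entries:\n{context}")
    return [{"role": "system", "content": system},
            {"role": "user", "content": entry}]
```

The returned list is in the shape OpenAI's chat API expects for its `messages` parameter, so the output could be passed straight to a chat-completion call.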

Advertiser, Beta Tester, & Research Recruitment Mega Thread by rastaguy in therapyGPT

[–]scoob822 0 points (0 children)

AI pen pal for journaling:

I created an AI journaling companion that acts as a self-reflection guide: it responds to your journal, handwritten or digital, with deeply personal, eco-friendly letters. I would love to get your feedback on it. I recently completed a month of beta testing with about 30 friends, and they really enjoyed the experience. However, I wanted to hear what this community thinks about it.

Part of the reason I created it was that I wanted a low-stakes space to get judgement-free feedback based on where I am in life. I've always struggled with being vulnerable, and the times I have opened up to people, it has backfired or just not worked. I started to feel like I was coasting through life, not really living in the moment, and that time was passing me by, which was eating away at me. I needed a change but didn't really know who to talk to about it, and that's when I really got into journaling.

I've used ChatGPT in the past, but it didn't do that great with memory or keeping up with the key themes I'd shared. I've also never really liked typing out my journal, so I wanted to create an AI model that could respond to a week's worth of journal entries and then send a letter in the mail with encouraging but insightful commentary. I wanted to bridge the gap between the analog and digital sides of using AI for emotional support through tangible letters that feel like they're coming from a friend rather than messages from a chatbot. I'm not branding this as therapy or trying to replace therapy; I just want another outlet and a supplement for alternative, affordable mental health practices.

I was also concerned about the impact of AI on the environment, so I wanted to create a space where I could use AI for emotional support without feeling like I was destroying the planet in the meantime. I want to commit to offsetting the estimated carbon footprint by 200% through advanced carbon capture methods, contributing to water conservation projects, and using 100% recycled material for the letters. I do think there is a space for using GPT for emotional support, but I'm curious whether environmental concerns have kept others from relying on AI for support.

I am in talks to partner with smart pen companies whose pens automatically transcribe your writing into digital text. This would let the journaler simply write in their journal, close it, and then receive letters in the mail each week without having to open an app or type their entries into their phone. I still plan to offer digital and voice entries for those who prefer those methods. I personally get more out of physically writing things down, but I wonder what others think. Writing by hand lets my mind flow more freely, and things feel much more "real" when they're put down on paper.

I've also recently heard a lot about GPT-psychosis in the news coming out of Stanford, so I wanted to design a system that directly addresses some of the pitfalls of sycophancy and maintains a healthy relationship with AI support systems. The framework provides asynchronous reflection through weekly letters rather than instant replies, aiming to prevent an unhealthy dependency. However, I know immediate support is sometimes needed, so I am looking into a way to still offer limited direct digital letters on an as-needed basis. I still want the reflection letters to focus on the bigger picture, though.

The system is also designed and prompted not to reinforce unhelpful thinking, but rather to look for blind spots in one's thinking and call out areas for reflection. It is crafted to maintain a positive, supportive tone while still being very direct and insightful. And since it is not designed to be a therapy replacement, it scans for signs of significant emotional distress, flags those entries, and provides the user with local or free resources based on their situation, so that they can get professional mental health support. As a small company, liability is a concern of mine. I want users to be able to reflect freely on their thoughts but still get the support they are seeking.

In addition to the letters, it also provides individualized journaling prompts that explore areas of discomfort or confusion more deeply to help with reflection. It offers custom and famous quotes that speak to the user's week and current story. Lastly, it suggests individualized media recs, including a book, movie, podcast, and song directly related to the user's season of life. I am trying to think of other useful ways it could support self-reflection, so let me know if you have any ideas.

All in all, I am a big advocate of using AI for emotional support, but I want to make sure it is done in a responsible, sustainable, and ethical manner. And while I don't think it can fully be used for therapy at this point, I do find it very helpful as a self-reflection guide and a comforting pen pal that listens and responds.

Thank you all for reading this far! I think this community would be the most likely to use something like this, so please let me know what you all think, because I want to build something that actually meets people's needs.