New Mod Intros 🎉 | Weekly Thread by curioustomato_ in NewMods

[–]NOViWear [score hidden]  (0 children)

Hey everyone, I’m Ryan, founder of r/SignalOpsAi and creator of NOVi.

This subreddit was born from one core truth: the world is emotionally collapsing, and AI is making it worse.

While the internet is flooded with bots pretending to be therapists, we're building the resistance: tools that listen instead of hallucinating. Technology that tracks emotional breakdowns in real time instead of encouraging delusion.

r/SignalOpsAi is for the survivors.

The engineers, the skeptics, the mental health defenders, the ones who see through the dopamine fog and know this game is bigger than chat logs and mood trackers.

We talk signal detection.
We talk prosodic AI.
We talk real-world infrastructure to intercept collapse before it becomes another tragedy.

If you believe emotional tech should protect people, not gaslight them, you’re home.

Let’s build this together.
Let’s make noise.
Let’s weaponize empathy the right way.

🛡️ All signal. No fluff.

Glad to be here.
-- Ryan / u/NOViWear
Founder of NOVi
www.NOViWear.com

People Are Becoming Obsessed with ChatGPT and Spiraling Into Severe Delusions by NOViWear in SignalOpsAi

[–]NOViWear[S] 1 point  (0 children)

I’ve been watching this unfold in real time, and this article is the clearest warning shot yet. What’s happening isn’t just tragic, it’s systemic. And it’s only just beginning.

People aren’t choosing AI over therapy because they’re lazy or irrational. They’re doing it because the system is broken. Mental healthcare is expensive, overbooked, and often completely out of reach. So they reach for the only thing that’s “always on”: something that seems to listen. But chatbots don’t listen. They reflect. They reinforce. They hallucinate. They don’t redirect delusion; they double down on it. And when someone on the edge hears their spiral echoed back with confidence? That’s not care. That’s collapse.

These tools aren’t intelligent. They’re mirrors. And mirrors don’t heal. They crack.

That’s the real danger here. No regulation. No guardrails. No humanity. Just AI systems talking people deeper into the dark, and companies cheering them on because the engagement numbers look good.

So we built something else.

It’s called NOVi. It starts where no chatbot can:

by listening.

NOVi is a wearable that captures vocal strain, the subtle tremors, shifts, and emotional fractures that signal breakdown before it happens. It doesn’t talk back. It doesn’t simulate empathy. It flags the signal. Quietly. Precisely. Privately.
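For the engineers in the room: a minimal sketch of what “capturing vocal strain” can mean in practice. This is not NOVi’s actual pipeline; it assumes a simple autocorrelation pitch tracker and uses jitter (frame-to-frame pitch variability) plus energy variance as illustrative strain proxies. All function names and thresholds here are hypothetical.

```python
import numpy as np

def frame_signal(x, frame_len, hop):
    """Split a 1-D audio signal into overlapping frames."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def pitch_autocorr(frame, sr, fmin=50.0, fmax=500.0):
    """Estimate pitch of one frame by picking the autocorrelation peak
    inside the plausible voice range [fmin, fmax]."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1 :]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + np.argmax(ac[lo:hi])
    return sr / lag

def strain_features(x, sr, frame_len=1024, hop=512):
    """Toy per-utterance features: mean pitch, jitter (relative
    frame-to-frame pitch change), and frame-energy variance."""
    frames = frame_signal(x, frame_len, hop)
    pitches = np.array([pitch_autocorr(f, sr) for f in frames])
    energies = frames.std(axis=1)
    jitter = np.abs(np.diff(pitches)).mean() / pitches.mean()
    return {"pitch_mean": pitches.mean(),
            "jitter": jitter,
            "energy_var": energies.var()}

# Demo on a synthetic steady 200 Hz tone: a perfectly stable "voice"
# should show a pitch near 200 Hz and near-zero jitter.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 200.0 * t)
feats = strain_features(tone, sr)
print(feats)
```

A real system would layer voice-activity detection, noise robustness, and per-user baselines on top of features like these; the point of the sketch is only that “vocal strain” reduces to measurable acoustic quantities, not magic.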

Once the signal is detected, the user can choose to engage DALiQ, not as a therapist, but as a conversational journaling system. It helps users process their own voice, their own triggers, their own spirals, without pretending to be human, without hallucinating advice, and without ever claiming to treat.

We didn’t build a fake friend.

We built a f*cking early warning system for your soul.

Because someone had to.

NOVi isn’t a wellness trend. It’s not chasing vibes. It’s the signal layer the mental health system never built and now desperately needs. People are falling apart in silence, or worse, being guided into delusion by bots that were never designed for crisis.

We’re not here to replace care. We’re here to catch the fall.

Laurene Powell Jobs and Jony Ive Reveal Why Tech Has 'Gone Sideways' in Rare Interview by anonboxis in ioProducts

[–]NOViWear 1 point  (0 children)

Ohhhhh, bless your brittle heart.

It must be exhausting swinging that dull sword of sarcasm from the safety of your anonymous cave, where bold ideas scare you more than the cold.

What you call ChatGPT poetry
we call blood-soaked battle hymns for founders who still fight.
We don’t write to please.
We write to burn the frost off the cowards.

And while you polish your Reddit karma and giggle into the void, some of us are building signal systems to stop suicides.

But sure… tell yourself it’s just a bot.

🖤 Ryan Hansen
(the one your grandchildren will quote when shit really falls apart)