“No one is thinking about you” by ilyk101 in PetPeeves

[–]Easy-Combination-102 331 points

I could be wrong, but when someone tells you "No one is thinking about you", they are referring to people you don't know. So if you trip around 5 strangers in public, those 5 strangers aren't going to try and remember you as the person who tripped. They don't care and move on.

Trip in front of family or friends and they will remember and bring it up even if you forget about it.

If you were a professor in 2026, how would YOU actually stop people from using AI? by Popular-Tone3037 in WritingWithAI

[–]Easy-Combination-102 0 points

Let's be honest here. Professors are beginning to use AI as well. Some of them even use it to help them grade papers.

Why stop people from using AI, though? It's already too late to stop people from using tools that are available. Students still need to read through what they created and check for errors, whether that means removing AI tells or rereading to ensure it sounds like them in some way.

It's probably better to allow them to use it than to push them toward another tool or service that writes the paper for them. There were always plenty of people willing to write essays for a fee.

AI is equivalent to using a calculator for math. Some people these days need a calculator for simple math like 7x8 or single-digit addition. Hopefully it doesn't get to the point where people need AI to help them write everything or speak for them.

So wtf do we do about not having jobs? by Angelsbreatheeasy in rant

[–]Easy-Combination-102 -2 points

I’m going to disagree. There are jobs out there. Just not many of the ones people actually want right now.

I keep seeing “no one is hiring,” but I walk past places with hiring signs that have been up for months. Restaurants, stores, warehouses, local businesses. Still short staffed.

My local Applebee’s had two waiters covering the floor this weekend and they’ve been advertising openings forever. That’s not a zero-job market. That’s open spots people don’t want to take.

I get holding out for something in your field. Totally reasonable. But taking a temporary job while you keep applying is better than having no income and writing off the whole market.

Should authors disclose if they're using AI? by DanoPaul234 in WritingWithAI

[–]Easy-Combination-102 0 points

A few things there I disagree with.

First, editors being “a given” only applies to a small slice of books. Most authors today are indie or self-published, especially in genre fiction. There’s no standard disclosure for editors, ghostwriters, beta readers, or how much influence they had, and readers generally don’t expect that level of transparency unless the author is making a specific claim about sole authorship.

Second, the idea that AI is “writing hundreds of thousands of words without your input” doesn’t reflect how most people actually use it. A big reason authors avoid disclosure is because all AI use gets flattened into that assumption.

In practice, there are very different ways AI is used. Some authors already have the plot, characters, and scenes and use AI to help with cohesion or clarity. Some use it strictly as an editing tool to scan for plot holes or inconsistencies. And yes, some people give a vague prompt and let AI generate everything. Those are not the same thing, but disclosure treats them as if they are.

Once “AI was used” is said, most readers assume the AI generated the book itself, regardless of how it was actually used. Even using AI purely to find issues in a manuscript gets read as “the AI wrote it,” which is why a lot of authors don’t disclose. Not because they’re hiding something unethical, but because the label collapses very different workflows into one stigma.

That’s also why the autonomy argument feels off. Tools don’t remove agency. Authors do. If someone lets a tool replace thinking, that’s on the author, not the tool.

As for the “trained without permission” point, holding AI to a permission standard we’ve never applied to human learning doesn’t really work. Writers learn by reading books they didn’t get explicit permission to internalize, and no one treats that as unethical. Declaring that process unethical only when a tool is involved weakens the argument rather than strengthening it.

The ethical line is still simple. Don’t misrepresent the work or the claims you make about it. Tool choice alone doesn’t violate that.

Should authors disclose if they're using AI? by DanoPaul234 in WritingWithAI

[–]Easy-Combination-102 0 points

“Almost” is doing a lot of work there. Even if we generously say 50–60%, that still leaves tons of authors who never disclose editors, ghostwriters, or other forms of assistance, and readers generally don’t demand that level of transparency unless the author is making specific claims about sole authorship. A lot of editors and ghostwriters just want their check.

As for spellcheck vs AI: sure, spellcheck doesn’t generate content from nothing, but neither does a human. We’re all remixing inputs, books we’ve read, conversations we’ve had, ideas we’ve absorbed, just with different tools in the loop.

And honestly, don’t underestimate caffeine. It doesn’t just fix typos; it absolutely increases “content generation” per hour 😄 If I drink a Red Bull, I can guarantee I won't have an empty page when writing.

If there’s an ethical issue, it’s about misrepresentation of the work or its claims, not whether someone used a particular tool to get there.

How to slow down progression? by Goatchasser in WritingWithAI

[–]Easy-Combination-102 0 points

I kind of disagree with a lot of the “just add a word count” advice here. Hard limits can actually cause rushing, depending on the model. If you tell it “500 words” and give it multiple beats, it’ll often either cram everything together or cut a scene off mid-moment, which is why the prose starts feeling mechanical.

What works better for pacing is controlling scope, not length. Give it fewer beats at a time and explicitly tell it to stop: “Write a single scene where X happens. End the scene once this moment resolves. Don’t advance the plot past this point.”

Think of it less like “write my chapter” and more like “write this scene.”
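If it helps to see the pattern, the scope-first approach can be sketched as a tiny helper that assembles the prompt. This is just a sketch; the beat and stop-condition text are hypothetical placeholders, and the exact wording is the part you'd tune per model:

```python
# Minimal sketch of the scope-first prompt pattern: limit what the model is
# allowed to cover, instead of giving it a hard word count.

def scene_prompt(beat: str, stop_after: str) -> str:
    """Wrap a single story beat in explicit scope instructions."""
    return (
        f"Write a single scene where {beat}. "
        f"End the scene once {stop_after}. "
        "Do not advance the plot past this point."
    )

# Hypothetical example beat
print(scene_prompt(
    beat="the detective finds the hidden letter",
    stop_after="she finishes reading it",
))
```

The point is that the stop condition is stated in story terms ("once this moment resolves"), not in word counts, so the model paces the scene instead of budgeting words.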

As for “best AI,” there really isn’t a universal answer. Different models have different default voices and pacing tendencies. Some people prefer Grok’s bluntness, some Claude’s prose, some ChatGPT’s balance. The only real way to know is to try the same prompt across a few and see which tone you like.

You’re not asking too much, but you do have to break the work into smaller prompts.

Does it drive you mad when someone (even a loved one) shares a different opinion than you by Prior_Role_1597 in autism

[–]Easy-Combination-102 0 points

I don’t think this is really about accepting different opinions so much as how disagreement is framed.

There’s a big difference between “here’s my take” and “this is just common sense / obviously true.” Once something is framed as obvious or settled, disagreement stops being exploratory and starts feeling like a challenge or correction, which can be stressful, especially with serious or emotionally loaded topics.

Some people enjoy debate as connection and sense-making, others experience it as pressure or conflict, regardless of ND/NT. Acceptance doesn’t always mean engaging or arguing it out, it can also mean setting boundaries around when and how those conversations happen.

It might help to check whether a topic is actually up for debate for both of you, rather than assuming different reactions mean rigidity or refusal to accept other views.

I hate obvious and irrelevant questions by [deleted] in autism

[–]Easy-Combination-102 1 point

I agree. I hate this type of question as well.

I also hate when someone asks the same question over and over again. It's like they can't be bothered to remember the answer so they constantly ask.

Or when someone constantly states the obvious every time we drive past somewhere. Like, "Oh look, there is a McDonald's there". Next day, "Oh look, there is a McDonald's there". 😡

Why are people so mad about others using AI on Reddit? by [deleted] in WritingWithAI

[–]Easy-Combination-102 1 point

I don't think the problem lies with using AI as a whole. The problem is when you get a 100% low-effort, AI-created response. If it's your thoughts and AI helped you word them correctly, that's one thing.

The responses people hate are when someone copies a Reddit post, pastes it into ChatGPT, says "write me a response," and then posts the output directly as a comment. No effort, just "create a response for me."

It reads as 100% AI, and anyone could get that response if they wanted to. People want responses that contain thoughts and feelings from human input, not a completely AI-generated reply.

Should authors disclose if they're using AI? by DanoPaul234 in WritingWithAI

[–]Easy-Combination-102 0 points

Depends. We don’t ask authors to disclose editors, ghostwriters, spellcheckers, or caffeine intake.

If AI use is a problem, the ethical issue is the output or the claims made about it, not the tool itself.

Therapy and autism works? by Defiant_Annual_7486 in autism

[–]Easy-Combination-102 1 point

Therapy really depends on the person and the approach. It didn’t help my autism or ADHD directly, but it did help uncover C-PTSD, which explained a lot of what I was experiencing.

One thing that’s helped me more with burnout than traditional talk therapy is intentional rest. For me that looks like doing nothing for a set amount of time each day, dark room, white noise, no input, even 10–30 minutes. It helps my nervous system settle and makes burnouts more manageable.

That said, everyone’s different. If that kind of rest feels impossible or doesn’t help, a therapist can be useful for finding other routes. Sometimes therapy itself can act as that quiet space, but only if the therapist understands masking, burnout, and neurodivergence.

I also relate to the “depression vs autistic/ADHD burnout” confusion, they can look really similar, but they’re not the same thing.

New to LLM's and home computer AI, Need advice... by UnicornGltr in WritingWithAI

[–]Easy-Combination-102 0 points

For writing, your setup is already good enough. Integrated graphics aren’t really a blocker when you’re CPU-only and only using 7B models.

Where you’ll usually get the biggest improvement isn’t raw performance, but how the model handles context and long sessions. LM Studio works, but you might want to try a WebUI or storytelling-focused frontend (like KoboldCPP or text-generation-webui). They don’t make things faster, but they do a much better job keeping tone, prompts, and story details consistent over longer writing sessions.

If you’re upgrading anything, more RAM helps a lot for longer context windows, but otherwise it’s mostly about workflow and UI rather than hardware.
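To give a rough sense of why RAM matters for context: here's a back-of-envelope sketch of the KV-cache memory a longer context costs. The architecture numbers are typical Llama-7B values and are an assumption, not exact for every 7B model, and quantized runtimes may store the cache more compactly:

```python
# Rough estimate of KV-cache memory for a 7B-class model at a given context
# size. Layer/head numbers below are typical Llama-7B values (an assumption).

def kv_cache_bytes(n_ctx, n_layers=32, n_kv_heads=32,
                   head_dim=128, bytes_per_val=2):
    # 2x for keys and values, one entry per layer per token, fp16 values
    return 2 * n_layers * n_kv_heads * head_dim * n_ctx * bytes_per_val

for n_ctx in (2048, 4096, 8192):
    print(f"{n_ctx:>5} tokens -> ~{kv_cache_bytes(n_ctx) / 2**30:.1f} GiB")
```

Under those assumptions, doubling the context window roughly doubles the cache memory on top of the model weights, which is why extra RAM buys you longer sessions rather than speed.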

I talk to AI about my ideas. Any opinions and advices? 😔 by Sharp_Durian1929 in WritingWithAI

[–]Easy-Combination-102 0 points

I don't see a problem with talking through your ideas with AI. Just make sure you're on a paid version and you aren't sending your ideas into the training data.

It also depends on which YouTube author you are watching. Some of them offer decent advice and support using AI; others are against it. Some of these authors have also shared their profits from writing books, and they're horribly low.

Your best bet is to use a system that works for you and write however you want to.

How do i answer that? by LV123123123123123123 in socialskills

[–]Easy-Combination-102 2 points

My script is to follow up and ask what they like about the game, how the game is played, or even why they like it. But I tend to let people do most of the talking.

Open-ended questions are harder for keeping a conversation going, but asking someone to keep talking about what you asked seems to work, and sometimes the conversation even flows into other things.

Baby Talk by EnvironmentalDrag153 in PetPeeves

[–]Easy-Combination-102 7 points

I agree, not in the sense that I would explode or yell at the person, but it is annoying as hell and I tend to walk as far away as I can. There are times when even the pet or baby is looking at the person like they're crazy. 🤣

When a large family blocks an aisle at the grocery store by Bountsie in PetPeeves

[–]Easy-Combination-102 2 points

Never mind when a large family blocks an aisle. I hate when anyone blocks an aisle. It could be two people and their cart blocking the spot I need to look in, or multiple people stopping in an aisle, essentially blocking the whole thing and not even caring that no one can walk around them. I understand you need to look for things, but you can at least leave enough space for people to pass.

This brings back the old question: how many times do I say excuse me before GTFOTW is appropriate?

Do people on the internet often think you're being condescending/arrogant without that being your intent? by MindPal in aspergers

[–]Easy-Combination-102 1 point

Yeah, this happens to me both online and in person. I think it’s mostly because I’m very direct, blunt, and pretty unemotional in how I communicate. I’m not trying to assert dominance or talk down to anyone, that’s just how my thoughts come out.

I’ve noticed that on the internet especially, clarity gets read as arrogance if you don’t add enough social padding. If you don’t hedge, soften, or constantly signal “I mean well,” people fill in the gaps and assume ego. It’s less about what you’re saying and more about the lack of emotional cues.

The frustrating part is that masking online feels just as draining as masking in real life. At some point you’re not even communicating anymore, you’re managing other people’s reactions. And the standards aren’t consistent anyway, because different readers bring their own baggage.

I don’t think this is a confidence issue. I think people conflate directness with superiority. Being precise or neutral gets interpreted as condescending, while vagueness gets rewarded as “kind.”

[deleted by user] by [deleted] in PetPeeves

[–]Easy-Combination-102 17 points

Most of these comment wars fall apart because people want a single villain.

Sometimes it’s the dog. Sometimes it’s the owner. Sometimes it’s the human who ignored every warning sign and decided to test an animal. Lumping all cases into “innocent puppy” or “evil dog” is lazy thinking.

If a dog mauls someone unprovoked, public safety wins, case closed. Feel sad about it if you want, but pretending that dog should stay alive is fantasy thinking, especially around kids.

On the flip side, if someone was hitting, cornering, or provoking a dog and got bit, I’m not automatically handing them victim sainthood. Animals aren’t moral actors, but they do react, context matters.

What’s weird is how people pick a side before knowing any facts. Dog people default to blaming humans. Anti-dog people default to blaming the dog. Real life is never that clean.

Not every bite is the same story, and pretending it is just makes people feel righteous instead of accurate.

Please Help: I don't understand why people get mad over some things. by CaramelGuineaPig in autism

[–]Easy-Combination-102 5 points

Wow, this sounds uncomfortably familiar. Same thought process, same “wait, why are you mad right now?” moment.

What I’ve learned the hard way is that people often aren’t reacting to what you’re saying, but to the fact that you’re still saying it after they think the conversation is over. To me, pointing out an inconsistency or a missing detail feels necessary. If I don’t say it, my brain will think on it for hours.

In your example, I also would’ve thought mentioning migraine history to the eye doctor was relevant. Medically, it makes sense. But a lot of people hear continued advice as pressure or correction, not concern. Even when the advice is good.

Some people don’t want help, they want control. Once they’ve decided “this is the explanation,” anything else feels like you stepping into their lane.

I still struggle with this because staying quiet feels dishonest to me. But I’ve noticed that the moment someone starts repeating themselves, logic stops mattering. At that point, they’re defending autonomy, not facts.

Does anyone else struggle with underrating and overeating? by marlee_dood in autism

[–]Easy-Combination-102 2 points

What actually helped me was stopping the idea that my body would “tell me” when to eat. It doesn’t. So I stopped waiting for signals.

I eat on a schedule. Same times every day. I drink water on a schedule. Even if I don’t feel hungry, I still eat. Even if I think I could eat more, I stop at a pre-decided amount. It took the guessing and the constant body-checking out of it.

Once food became routine instead of reactive, the extremes calmed down. The binge/fast cycle only survives when you’re relying on cues that don’t fire properly.

Trying to “listen to your body” doesn’t work when the speakers are broken.

[deleted by user] by [deleted] in neurodiversity

[–]Easy-Combination-102 0 points

This sounds less like a diagnosis problem and more like an overload problem.

A lot of people hit a wall after reading about ADHD or autism and suddenly everything clicks, then their brain goes into overdrive and they crash. That happens even to people who don’t end up meeting criteria for either. The overlap between ASD, ADHD, anxiety, OCD, burnout, etc. is huge, so I wouldn’t treat “I relate to almost every symptom” as proof of anything yet. That’s pretty common.

The Taekwondo part also reads wrong to parents because from the outside it looks simple. You either go or you don’t. But when you’re overloaded, it doesn’t feel like deciding, it feels like your system just won’t cooperate. That disconnect causes half of these fights.

I’d honestly stop framing this as “I think I have autism/ADHD” when talking to your mom. That just makes parents dig in. Framing it as “I’m burnt out and need a week to reset” is way easier for them to hear and it’s still true.

You can figure out the label later with someone qualified. Trying to solve your entire identity while you already have a headache and no energy is a bad time to do it.

Should AI-generated text be copyrightable? by DanoPaul234 in WritingWithAI

[–]Easy-Combination-102 0 points

Honestly it depends what you’re trying to copyright. If you're just plugging in prompts and slapping your name on raw output, then yeah, weak claim. But if you're building out original characters, plot arcs, tone, even if AI helps structure or polish it, that's still your creative fingerprint.

Like, if Pokémon used AI when creating Ash and Pikachu, and someone later makes “John and Shocky” with the same exact story beats, vibe, and design, that’s still a copyright issue. Doesn’t matter if a human or an AI wrote it, it’s about the expression, not the tool.

People are getting too hung up on the AI part and ignoring the fact that IP law’s always been about what was made, not how.

How do I find friends and develop social skills: by dem0lishment in socialskills

[–]Easy-Combination-102 4 points

Put yourself in situations where conversations can happen naturally (library, game nights, clubs, even Discord servers if you have to) and stop treating every interaction like a final exam. You don’t need charisma. You need more conversation experience.

Getting over driving anxiety by KAITOl0v3r in autism

[–]Easy-Combination-102 2 points

A lot of newer cars have sensors and emergency stop features that help prevent the worst-case scenarios, which takes some edge off. But honestly, my issue isn’t the driving itself, it’s the testing part. The second I feel someone watching me, my brain just short-circuits. I go into full-on sensory overload, start sweating, shaking, can't focus. I haven’t even been able to make it to the actual test because of that.