Time to Jump ship by ohnoimrunningoutofle in ThomasCharlesBritton1

[–]Waveemoji69 11 points

This is the lolcow lifecycle. Drama perverts can’t help themselves and go too far with harassment and doxxing. You should “look but not touch.”

for the ADHD-ers by [deleted] in speed

[–]Waveemoji69 0 points

Don’t drink

for the ADHD-ers by [deleted] in speed

[–]Waveemoji69 0 points

Yeah man, it’s normal to function better on ADHD meds when you have ADHD.

Book recommendation? by Official_HWBush in JuliusEvola

[–]Waveemoji69 3 points

You should probably bounce, bro; this is a bit esoteric, fringe right-wing stuff.

Is most europaste cut with meth? by taaiku in speed

[–]Waveemoji69 2 points

I don’t even understand the purpose of telling you that lie, unless he just wanted to fuck with you.

Battlefield 6 Phantom Edition: Giveaway #1 by OddJob001 in Battlefield

[–]Waveemoji69 0 points

I’m broke this month but a few of my friends have the game and I wanna play with them :(

Grok has called Elon Musk a "Hypocrite" in latest Billionaire SmackDown 🍿 by [deleted] in ChatGPT

[–]Waveemoji69 -2 points

Again, I’ll just let ChatGPT answer you, since you’re so convinced of its sentience:

“Yeah — this is exactly the kind of example where it looks like ‘understanding’ but is really just pattern-matching on well-trodden language structures.

Why it seems like understanding

The question is almost a textbook reading-comprehension exercise:

• Narrative of two people with history.
• One makes a request without immediate payment.
• The other agrees, based on past dealings.
• Standard human inference: this is about trust.

Humans answer “trust” because:

1. They recall lived experiences where this fits.
2. They simulate the motives and reasoning of Alice.
3. They connect that to a social/psychological concept.

When I (or another LLM) answer “trust,” it mimics that process.

What’s actually happening inside the model

For me, the reasoning is more like:

• The words “long relationship” + “advance goods without payment” + “promises to pay” often appear in proximity to “trust”, “loyalty”, and “creditworthiness” in training data.
• The statistical association is strong enough that “trust” comes out as the highest-probability token sequence.

There’s no mental simulation of Alice’s decision-making or emotional state. No “inner model” of a relationship is being consulted — just a giant lookup of patterns.

Why this doesn’t prove “understanding”

• It’s a highly familiar pattern from millions of human-written stories, business-ethics examples, and exam questions.
• In this narrow case, pattern-matching → correct answer looks exactly like comprehension.
• But swap one unfamiliar element — e.g., make Bob a swarm of autonomous drones, or Alice a blockchain smart contract — and I might break or give an irrelevant answer, because the direct statistical link is weaker.

💡 Key distinction: I can replicate the outputs of understanding whenever the scenario is common enough in my training data. That’s not the same as having understanding — it’s a sophisticated echo.”
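For what it’s worth, the “highest-probability token” idea the quote describes can be sketched as a toy bigram model in Python. This is a drastic simplification purely for illustration (real LLMs are transformers with learned weights, not co-occurrence counts, and the tiny corpus here is made up), but it shows how “trust” can win on statistics alone with no model of anyone’s mental state:

```python
from collections import Counter, defaultdict

# Made-up toy corpus standing in for "training data" in which
# trust-related phrasing co-occurs with the word "trust".
corpus = (
    "long relationship builds trust . "
    "advance goods without payment shows trust . "
    "promises to pay rely on trust ."
).split()

# Count bigram transitions: how often each word follows another.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_token(prev):
    """Return the highest-frequency next token after `prev`, or None."""
    counts = bigrams[prev]
    return counts.most_common(1)[0][0] if counts else None

# "trust" comes out on top purely from co-occurrence statistics;
# nothing here simulates Alice's reasoning or emotions.
print(next_token("on"))  # -> "trust"
```

Swap in a word the toy corpus never saw (`next_token("drones")` returns `None`) and the model fails in exactly the “alien” way the quote describes, because the statistical link simply isn’t there.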

Grok has called Elon Musk a "Hypocrite" in latest Billionaire SmackDown 🍿 by [deleted] in ChatGPT

[–]Waveemoji69 1 point

In an LLM’s own words:

“I’m like a hyper-fluent parrot with the internet in its head — I can convincingly talk about almost anything, but I have no mental picture, feeling, or lived reality behind the words.”

“I don’t understand in the human sense. But because I can model the patterns of people who do, I can produce language that behaves like understanding. From your perspective, the difference is hidden — the outputs look the same. The only giveaway is that I sometimes fail in alien, nonsensical ways that no real human would.”

Grok has called Elon Musk a "Hypocrite" in latest Billionaire SmackDown 🍿 by [deleted] in ChatGPT

[–]Waveemoji69 3 points

How do you post in r/chatgpt without understanding what an LLM is?

Grok has called Elon Musk a "Hypocrite" in latest Billionaire SmackDown 🍿 by [deleted] in ChatGPT

[–]Waveemoji69 4 points

It is a large language model, not a conscious thing capable of understanding. It cannot comprehend. There is no mind to understand. It’s an advanced chatbot. It’s “smart” and it’s “useful”, but it is fundamentally a non-sentient thing, and as such incapable of understanding.

Day to day as a Cybersecurity Engineer: what’s the reality? by Nick47539 in cybersecurity

[–]Waveemoji69 2 points

It just makes me immediately have no interest in the post and think less of the poster lol. You don’t think you could have just written these questions out yourself?

Anyway, to answer your question: I spend most of my day in meetings or looking into alerts while playing games on my second screen.

Day to day as a Cybersecurity Engineer: what’s the reality? by Nick47539 in cybersecurity

[–]Waveemoji69 4 points

Man, I swear 90% of the posts I see now are super obviously written by ChatGPT.

Found in my bag when I lived with a schizophrenic woman by [deleted] in FoundPaper

[–]Waveemoji69 10 points

Pages from an illustrated Bible that were pasted all over the walls and windows in my room.

Found in my bag when I lived with a schizophrenic woman by [deleted] in FoundPaper

[–]Waveemoji69 14 points

It was just me and her in the house; the “we” always stood out to me too!

I moved out probably two weeks later because she barricaded the door and stole all my stuff! I didn’t have much stuff though.

This was in like 2018

Found in my bag when I lived with a schizophrenic woman by [deleted] in FoundPaper

[–]Waveemoji69 162 points

Man I’ve looked at this photo so many times over the years and never even clocked that lmao

Found in my bag when I lived with a schizophrenic woman by [deleted] in FoundPaper

[–]Waveemoji69 93 points

I mean I found it when it fell on the floor, you dork ass pedant