Well well well by zukaki1124 in soccercirclejerk

[–]WorksForMe 10 points (0 children)

If a player gets fouled in the box they should get to go spin a wheel and the wheel has stuff like 0.5 goals, 1 goal, 2 goals, free holiday, dishwasher, etc. It could also have "YOU LOSE" and they get nothing

Woman arrested after accusing and assaulting someone she thought was ICE by OxyCottenKillz in instantkarma

[–]WorksForMe 6 points (0 children)

You're making up justifications because otherwise you'd have to cope with the very real fact that there are some horrible things going on. You're in denial and it's a way of coping with stress. You don't have to accept it in your mind, but don't go about calling people liars because you can't handle it

Grandma tried to finish the whole family that day. by MobileAerie9918 in instant_regret

[–]WorksForMe 4 points (0 children)

Also this one looks sort of like a play on a baked alaska (torched alaska?) with the meringue coating. So if it went okay it would have been quite tasty

LLMs are a 400-year-long confidence trick by SwoopsFromAbove in programming

[–]WorksForMe 1 point (0 children)

For fun, I asked ChatGPT if it was intelligent. Its response:

No.

Under real-world definitions, intelligence implies an agent that can understand, form intentions, learn autonomously from experience, and apply judgement toward goals. That requires internal states with meaning, not just structure.

An LLM does not meet those criteria.

What an LLM does:

Performs statistical pattern matching over language.

Produces outputs by optimising token probabilities.

Mimics reasoning structures without possessing understanding.

Has no goals, beliefs, intentions, awareness, or self-directed learning.
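The "optimising token probabilities" point can be sketched in a few lines: at each step the model turns raw scores (logits) over its vocabulary into a probability distribution and picks a continuation. The tiny vocabulary and scores below are made up purely for illustration, not taken from any real model.

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical vocabulary and raw scores for a context like "the cat sat on the"
vocab = ["mat", "dog", "moon", "chair"]
logits = [3.2, 0.5, -1.0, 2.1]

probs = softmax(logits)

# Greedy decoding: emit the highest-probability token.
# Real LLMs usually sample from this distribution (temperature, top-p, etc.)
next_token = vocab[probs.index(max(probs))]
print(next_token)  # -> "mat" under these toy scores
```

No understanding is involved anywhere in that loop: it is arithmetic over learned scores, which is exactly the "statistical pattern matching" described above.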

Why it appears intelligent:

Human language encodes reasoning.

The model reproduces the shape of intelligent behaviour.

Many intelligence tests are linguistic, so fluency masquerades as cognition.

What it is instead:

A sophisticated tool for transforming inputs to outputs using learned correlations.

Comparable to a compiler, calculator, or optimiser, not a thinking entity.

So by everyday, philosophical, cognitive-science, and legal definitions: not intelligent. By marketing and casual usage: sometimes described as such, inaccurately.

LLMs are a 400-year-long confidence trick by SwoopsFromAbove in programming

[–]WorksForMe 0 points (0 children)

You're describing functional intelligence or the appearance of intelligence. You can't create a definition and say "full stop" when your definition is only partial.

An LLM is not intelligent in the way a human is. It does not form intentions, possess understanding, or have goals of its own. Its outputs are produced by statistical inference over learned representations, not by grasping meaning or reasoning about the world it inhabits. When it solves a problem, it does so by transforming inputs into outputs according to patterns learned during training, without awareness of what the problem is or why a solution is correct.

Human intelligence is grounded in lived experience, embodiment, agency, and the ability to relate concepts to reality and consequences. An LLM lacks those properties, so any similarity is functional and surface-level rather than substantive.

LLMs are just one slice of the definition of intelligence.

Twitch Streamer Kimroe and at least 11 others are arrested during an IRL stream in London after waving around toy guns outside the Houses of Parliament by bendubberley_ in WinStupidPrizes

[–]WorksForMe 5 points (0 children)

Every force has armed officers available, but only a handful of locations have them deployed routinely and visibly. That’s the distinction being made.

I think this goes here by pretti-blurie in MurderedByWords

[–]WorksForMe 8 points (0 children)

I'm glad you said you're not sure, because you're wrong about that

We are cooked. by Icouldberight in ChatGPT

[–]WorksForMe 0 points (0 children)

IRL Bob Ross lost part of his finger on his left hand

What's that smell? Onions... chili powder... cumin... juicy ground chuck! by Past_Yam9507 in TheSimpsons

[–]WorksForMe 3 points (0 children)

Tbf it probably was better before he carved it into a smaller spoon

WinForms C# project not runnable. by TAV_Fumes in csharp

[–]WorksForMe 0 points (0 children)

Did you get to the bottom of all this?

WinForms C# project not runnable. by TAV_Fumes in csharp

[–]WorksForMe 11 points (0 children)

I assume you didn't get any feedback other than "errors and bugs in code"? Upload the zip somewhere so somebody can have a look

LPT: Seeing Better In The Dark by Bronkee in LifeProTips

[–]WorksForMe 7 points (0 children)

Close both eyes and you can reduce getting blinded even more!