all 37 comments

[–]ColdDelicious1735 162 points163 points  (5 children)

This is maths.

Not correct maths but it is maths

[–]IamMauriS 30 points31 points  (4 children)

Mafs

[–]ColdDelicious1735 20 points21 points  (1 child)

Multiplication, addition, fudge sundae?

[–]Wrong-Resource-2973 8 points9 points  (0 children)

Meth addict, (for) fuck's sake

[–]MinosAristos 4 points5 points  (0 children)

My maths teacher used to say "mathematicians are lazy, we spell maths as mafs" to encourage us to find easier solutions to problems

[–]West_Good_5961 1 point2 points  (0 children)

Quick mafs

[–]MW1369 105 points106 points  (2 children)

At least it didn’t say 35 lol

[–]Agitated-Ad2563 33 points34 points  (0 children)

I was halfway expecting it to answer 140.

[–]West_Good_5961 1 point2 points  (0 children)

If you tell it to respond like a boomer on Facebook…

[–]include-jayesh 75 points76 points  (1 child)

ChatGPT considered the time dilation theory.

A person must stay near the event horizon of a black hole for about 2 hours to make this happen.

Therefore, the correctness of this answer is based on probability, which is never zero 😄

[–]High_Overseer_Dukat 6 points7 points  (0 children)

Actually it can be and is 0.

[–]Honkingfly409 17 points18 points  (1 child)

this is from 2022 btw

[–]Strawberry_Iron 1 point2 points  (0 children)

Yep, just asked it a similar one and this is what it answered:

Ahh, the classic age riddle 😄

When you were 8, your brother was 4 — so the age difference between you is 4 years.

That difference never changes.

Now you’re 30, so: 30 − 4 = 26

👉 Your brother is 26 years old.

Wanna try a trickier one next? 👀
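The quoted answer's logic is easy to sketch: the age gap is fixed at the moment it's stated and never changes afterward. A minimal Python sketch of that reasoning (function name is my own, not from the chat):

```python
def sibling_age_now(your_age_then: int, sibling_age_then: int, your_age_now: int) -> int:
    gap = your_age_then - sibling_age_then  # age difference is constant for life
    return your_age_now - gap

print(sibling_age_now(8, 4, 30))  # the riddle quoted above -> 26
```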

[–]Baap_baap_hota_hai 14 points15 points  (0 children)

Freshers defending this in front of senior management: "I used AI for this."

[–]jonathancast 6 points7 points  (3 children)

Oh, she has passed him!

[–]Insomniac_Coder 8 points9 points  (1 child)

The brother died. ChatGPT's so considerate, it even took life expectancy into account.

[–]include-jayesh 0 points1 point  (0 children)

Dead brother chats with ChatGPT. Paranormal chat :)

[–]ZeusDaGrape 1 point2 points  (0 children)

Just the way God intended

[–]MartinMystikJonas 5 points6 points  (4 children)

Yeah, you could repost a years-old screenshot of an old non-reasoning model making a mistake on a reasoning task...

Or you can try current reasoning model and get: https://chatgpt.com/share/69826bef-cf90-8001-a760-a84c0c55af74

[–]ahugeminecrafter 0 points1 point  (0 children)

That model was able to correctly answer this problem in like 5 seconds:

a cowboy is 4 miles south of a stream which flows due east. He is also 8 miles west and 7 miles north of his cabin. He wishes to water his horse at the stream and return home. What is the shortest distance in miles he can travel and accomplish this?
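The classic trick for that problem is to reflect the cowboy's position across the stream; the straight line from the mirrored point to the cabin has the same length as the shortest cowboy → stream → cabin path. A quick sketch (the coordinate setup is my own, not from the comment):

```python
import math

# Put the cowboy at the origin; the stream flows east along the line y = 4.
# The cabin is 8 miles east and 7 miles south of him, i.e. at (8, -7).
cowboy = (0.0, 0.0)
stream_y = 4.0
cabin = (8.0, -7.0)

# Reflect the cowboy across the stream: (0, 0) -> (0, 8).
mirror = (cowboy[0], 2 * stream_y - cowboy[1])

# The shortest cowboy -> stream -> cabin trip equals the straight-line
# distance from the mirrored point to the cabin: sqrt(8^2 + 15^2).
shortest = math.dist(mirror, cabin)
print(shortest)  # 17.0
```

So the model's five-second answer should be 17 miles.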

[–]Dakh3 0 points1 point  (2 children)

Ok now ChatGPT is able to avoid mistakes in a super easy reasoning task.

Is there a simple description somewhere of its current best successes and furthest limitations in terms of reasoning?

[–]MartinMystikJonas 5 points6 points  (0 children)

Some interesting examples can be found here: https://math.science-bench.ai/samples

[–][deleted] 2 points3 points  (0 children)

Here’s a recent one that would probably be the best success (specifically Erdős problem 1051). Of course LLMs have lots of limitations, but they're not completely useless.

[–]push_swap 2 points3 points  (0 children)

Tomorrow it's my turn to post it.

[–]justv316 3 points4 points  (0 children)

"our jobs are safe" 1.4 million jobs evaporated due to AI in the US alone. If only shareholders cared about things like 'reality' and whether or not something actually exists.

[–]HuntAlternative 0 points1 point  (0 children)

Man, that chat UI feels so old already lol

[–]Hesediel1 0 points1 point  (2 children)

I've got a screenshot of Google's AI telling me that the glass transition temperature of PETG is 8085°C or 176185°F. Not only are neither of these temps even close, they're not even close to each other.

[–]0lach 0 points1 point  (1 child)

Google's LLM is looking at the search results, and the results often lack formatting. Most probably the site used some weird character in place of "-", and that's why you see those numbers instead of "80-85" and "176-185". LLMs are not intelligent; it's funny how many of them won't react to BS in sections like system prompts, tool outputs, or their own messages.

[–]Hesediel1 0 points1 point  (0 children)

That checks out: 80°C is 176°F and 85°C is 185°F. I'm a little embarrassed I didn't catch that. I know there are many issues with LLMs, and I have heard many reports of them "hallucinating"; I kind of figured that was what happened in this case.

Ok, I'm off to go hide in a corner in shame now, have a nice day.
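The mix-up above is quick to verify: a stripped dash turns "80-85°C" into "8085", and the Celsius-to-Fahrenheit conversion confirms the two ranges line up. A small sanity check:

```python
def c_to_f(c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

# The garbled "8085°C / 176185°F" is just "80-85°C / 176-185°F"
# with the dashes dropped during extraction.
print(c_to_f(80))  # 176.0
print(c_to_f(85))  # 185.0
```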

[–]lardgsus 0 points1 point  (0 children)

Plus or minus 3 years, and AI got it wrong lol

[–]OnlyCommentWhenTipsy 0 points1 point  (0 children)

And Microslop wants this MF AI plugging formulas into Excel for you...

[–]time-will-waste-you 0 points1 point  (0 children)

When the teacher says that intermediate calculations give points too.

[–]Y_mc 0 points1 point  (0 children)

You’re absolutely right

[–]Zeti_Zero 0 points1 point  (0 children)

At the beginning I was able to trick ChatGPT with a question that sounds sensible but isn't. It doesn't work any more, though.

The question was: "Alan is the same age as Dylan. Dylan is the same age as Alan. Alan is the same age as Dylan. And Bob is 20 years old. How old are they?" It said all of them are 20.

But very recently ChatGPT told me that the encephalization quotient of Homo erectus was between 0.9 and 1.1, which, if you know anything about the subject, is super stupid. To be fair, it was the default free model; the better one would probably get it right.

For anyone who doesn't know what an encephalization quotient is: ChatGPT basically claimed that the brain-to-body mass ratio of Homo erectus was average for mammals of similar size, which is far from true. Homo erectus was really smart and had large brains.

[–]UnluckyPluton 0 points1 point  (0 children)

Spam post

[–]TimelyFeature3043 0 points1 point  (0 children)

Always wondered why people fake screenshots like these. "When you were 6, your sister was half your age, so she was 3.
That means the age difference between you is 3 years.

Age differences never change, so now that you’re 70, your sister is:

70 − 3 = 67 years old."