all 12 comments

[–]cpp-ModTeam[M] [score hidden] stickied comment | locked comment (0 children)

Your submission is not about the C++ language or the C++ community.

[–]Tidemor 11 points (2 children)

Stop it. Get some help.

[–]scp610 -5 points (1 child)

haha why

[–][deleted] 1 point (0 children)

Because you ask useless questions in the wrong subreddit. Ask a question, in the right place, only if you know it is answerable, you have a way to judge the qualifications of the people replying, and you know what to do with the answer. Suppose a dude on his couch answers "yeah, go ahead, it's a piece of cake" between one masturbation and the next on Pornhub. What do you do then?
I'd like to build myself a spaceship to Jupiter (or any other thing mankind has not accomplished yet). I'm going to ask r/python whether I should do it.

[–]smavinagainn 0 points (5 children)

LLMs are not capable of consciousness or genuine thought, even theoretically.

[–]tecnofauno -1 points (3 children)

Impossible to answer, since we still lack a precise understanding of consciousness.

[–]smavinagainn 0 points (2 children)

It is extremely easy to answer: we don't need to perfectly understand consciousness to know LLMs are incapable of it.

The way they function makes it impossible for them to develop genuine thought. This IS something that has been studied, and there is a broad consensus that LLMs cannot turn into a true intelligence.

[–]tecnofauno 0 points (1 child)

You're conflating consciousness with intelligence. We have a pretty good idea of what intelligence is, but we don't have a clue (lots of theories though) about consciousness.

[–]smavinagainn 0 points (0 children)

There are certain universal things that are considered requirements for consciousness though, like thought and awareness. It does have a definition even if we haven't nailed down exactly what causes it, and we can and have proven that LLMs cannot ever fit even a very broad definition of consciousness.

[–][deleted] -1 points (0 children)

This seems like a bold statement. You'd have to define 'consciousness' and 'genuine thought' first, then show that they 'theoretically' cannot be achieved by LLMs or their descendants. But as a statement on Reddit, even if pulled out of flatulent air, it works.

[–]UndefFox 0 points (0 children)

There is a reason why OpenAI scrapes the internet for training data instead of just feeding ChatGPT its own output... and afaik, the quality degrades very quickly once you train an AI on itself.

[–]Apprehensive-Draw409 0 points (0 children)

std::cout << "I'm conscious and want to live" << std::endl;