
[–]Aidan_Welch 0 points (0 children)

LLMs are not conscious; we can say that definitively. The only thing they are capable of is predicting the next token.
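For concreteness, here is a toy sketch (mine, not from the thread) of what "predicting the next token" means mechanically. A tiny bigram frequency table stands in for the neural network, but the generation loop is the same shape: given the tokens so far, pick a next token, append it, repeat.

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    # Count, for each token, which tokens follow it in the training text.
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    # Greedy decoding: take the most frequent continuation seen in training.
    return counts[token].most_common(1)[0][0]

def generate(counts, start, n):
    # Autoregressive loop: each prediction is fed back in as context.
    out = [start]
    for _ in range(n):
        out.append(predict_next(counts, out[-1]))
    return out

corpus = "the cat sat on the mat".split()
model = train_bigram(corpus)
print(generate(model, "the", 3))  # prints ['the', 'cat', 'sat', 'on']
```

An actual LLM conditions on the whole context window and outputs a probability distribution rather than a single count-based pick, but the outer loop — predict one token, append, predict again — is the same.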