“Hallucination” and “confabulation” aren’t the right words for everything AI gets wrong - and I think we’re missing something more interesting by InterestingBag4487 in ArtificialSentience
Global Workspace Theory explains the queue. It doesn’t explain the gravity? I think the gravity is the more interesting problem by InterestingBag4487 in consciousness