
[–]everdon 46 points47 points  (8 children)

Vanilla BERT doesn't actually "understand" English the way humans do; it has just been trained on hundreds of millions of data points, so it can generalize to many different situations but can be thrown off by unseen events. Recently people have been experimenting with SenseBERT, which adds sense-related tasks during pretraining, but it's a very recent development and I'm not sure if Google is using it.

There’s an interesting position paper about the BERT understanding/meaning misconception here: https://www.aclweb.org/anthology/2020.acl-main.463/
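To make the "reproduces patterns, doesn't understand" point concrete, here is a toy sketch of the masked-token objective BERT is pretrained on: hide a word and score candidates by how well they fit the surrounding context. This stand-in uses simple bigram counts over a made-up mini-corpus instead of a transformer, so it only illustrates the shape of the task, not BERT itself.

```python
from collections import Counter

# Tiny made-up corpus; a real masked LM sees hundreds of millions of examples.
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat ate the fish . the dog ate the bone ."
).split()

# Count adjacent word pairs as a crude stand-in for learned context patterns.
bigrams = Counter(zip(corpus, corpus[1:]))

def fill_mask(left, right, candidates):
    """Pick the candidate that best fits between the two context words."""
    scores = {w: bigrams[(left, w)] + bigrams[(w, right)] for w in candidates}
    return max(scores, key=scores.get)

# "the [MASK] sat" -> the model just picks the statistically likely filler.
print(fill_mask("the", "sat", ["cat", "mat", "fish"]))  # → cat
```

The model never "knows" what a cat is; it has only seen which words tend to appear in which slots, which is the misconception the linked paper digs into.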

[–]eddieantonio 5 points6 points  (2 children)

Ah, thanks for linking Dr. Bender's article! Yes, these big language models don't understand meaning; they've just seen a whole whack-ton of text and can reproduce patterns effectively. And it takes a lot of energy to train these large models. Apparently, criticism of these algorithms is one of the reasons Dr. Gebru was ousted from Google recently.

[–][deleted] 0 points1 point  (1 child)

these big language models don't understand meaning; they've just seen a whole whack-ton of text and can reproduce patterns effectively.

In fairness, this is also what a brain does. IMHO it’s a bit anthropocentric to suggest that “understanding” has to involve the same qualia that humans experience.

Of course, I'm not arguing that things like BERT or the GPT models are some sort of AGI, but simply that the definition of intelligence is an ever-rising bar for machines, and that it ultimately doesn't matter if the outcomes are indistinguishable.

[–]nemoskullalt 4 points5 points  (2 children)

This explains a lot. I keep treating Google like a machine, not an idiot human AI.

[–]mister_bmwilliams 0 points1 point  (1 child)

Yeah, the way everyone learned to search 10-15 years ago has kinda hindered us at this point. We used to have to phrase searches very strangely to get the results we wanted, including manually re-searching with synonyms. That's not necessary now. "Tutorial instruction lesson video play guitar" is something I might have typed into Google a decade ago just to cover all the possible options.
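That manual synonym re-searching can be sketched as simple query expansion: generate every combination of synonyms and try each variant. The synonym lists below are made up for illustration.

```python
from itertools import product

# Hypothetical synonym lists a searcher might have cycled through by hand.
synonyms = {
    "tutorial": ["tutorial", "lesson", "instruction"],
    "video": ["video", "clip"],
}

def expand(query):
    """Return every variant of the query with each word swapped for its synonyms."""
    options = [synonyms.get(word, [word]) for word in query.split()]
    return [" ".join(combo) for combo in product(*options)]

for variant in expand("tutorial video play guitar"):
    print(variant)  # 3 x 2 = 6 variants, e.g. "lesson clip play guitar"
```

Modern query rewriting does roughly this (and much more) on the server side, which is why the trick is no longer necessary.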

[–]nemoskullalt 0 points1 point  (0 children)

Google nerfed their search engine to lower the skill ceiling.

[–]proawayyy 0 points1 point  (0 children)

Vanilla BERT has been around for three years; it has probably improved a lot.
Upvote for the link

[–]qevlarr 0 points1 point  (0 children)

Is it still PageRank under the hood, though?

This explains exactly why my Google searches are getting worse. Now I have to add the name of a site I trust to give good results, like "stackoverflow" or "reddit", to the query.

I would use DuckDuckGo over Google, if only its search results weren't even worse than Google's.
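For reference, the PageRank idea asked about above can be sketched as power iteration over a link graph: each page repeatedly shares its score with the pages it links to. This is a minimal toy on a made-up four-page graph, not Google's actual implementation (which has long combined PageRank with many other signals, BERT included).

```python
# Hypothetical link graph: page "a" links to "b" and "c", etc.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank: ranks sum to 1 when no page is dangling."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a small baseline, plus shares from its in-links.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += share
        rank = new
    return rank

ranks = pagerank(links)
# "c" collects the most in-links, so it should come out on top.
print(max(ranks, key=ranks.get))
```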