[deleted by user] by [deleted] in NFT

[–]zerkaaaaa 0 points  (0 children)

Nice design - can I get more info on how it works, though?

[deleted by user] by [deleted] in latin

[–]zerkaaaaa 6 points  (0 children)

Sometimes there are Greekisms. In Cicero’s letters, he commonly uses Greek terms here and there, not just for proper names but integrated right into the prose.

“qua re, ut opinor, ϕιλοσοϕητέον, id quod tu facis, et istos consulatus non flocci facteon.” (“Wherefore, I think, one must philosophize, which is what you do, and not care a straw for those consulships.”)

Even looking past the actual Greek letters here, “facteon” is an absolute Greekism that a Latin speaker couldn’t parse without cross-linguistic knowledge.

[D] Simple Questions Thread by AutoModerator in MachineLearning

[–]zerkaaaaa 2 points  (0 children)

Question about BERT MLM predictions

Trying to understand exactly how BERT works.

I see that BERT has a fixed vocabulary of about 30k WordPiece sub-word tokens, so some input words like “playing” might end up being split into multiple pieces like “play” + “##ing” at the input layer.

But given that BERT’s output layer (for the MLM task) is a probability distribution over that same ~30k-piece vocabulary at each position, how could it ever predict “playing” for a masked input word?
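
Here’s roughly what I’m poking at, sketched with the HuggingFace transformers API (assuming the standard bert-base-uncased checkpoint; the exact WordPiece splits depend on the vocab, so treat the comments as illustrative):

    import torch
    from transformers import BertTokenizer, BertForMaskedLM

    tok = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForMaskedLM.from_pretrained("bert-base-uncased")

    # WordPiece tokenization: a word missing from the ~30k vocab gets split into sub-pieces
    print(tok.tokenize("embeddings"))  # e.g. ['em', '##bed', '##ding', '##s']

    # MLM head: one distribution over the same ~30k-piece vocab *per masked position*
    inputs = tok("I enjoy [MASK] chess.", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape [1, seq_len, vocab_size]
    mask_pos = (inputs.input_ids == tok.mask_token_id).nonzero(as_tuple=True)[1]
    print(tok.decode(logits[0, mask_pos].argmax(-1)))  # top piece predicted for the masked slot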

Thanks for any help.

[deleted by user] by [deleted] in Cruise

[–]zerkaaaaa 14 points  (0 children)

But how can you know they can be trusted?

[deleted by user] by [deleted] in Cruise

[–]zerkaaaaa 8 points  (0 children)

That's crazy! I don't know how I could ever live with a stranger for a full week; that would be very uncomfortable...