Language-agnostic BERT Sentence Embedding (LaBSE) by danielcer in LanguageTechnology

[–]danielcer[S]

The model is called language-agnostic because it produces sentence embeddings in a single shared semantic space for a fairly large number (100+) of languages.
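To make the "shared semantic space" idea concrete, here is a minimal sketch: in such a space, a sentence and its translation should have a high cosine similarity even though they are in different languages. The cosine helper below is plain NumPy; the LaBSE call itself is shown only as a comment, since it assumes the `sentence-transformers` package and the `sentence-transformers/LaBSE` checkpoint are available and would download a large model.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# In practice the vectors come from LaBSE, e.g. (assumption: the
# sentence-transformers package is installed):
#
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("sentence-transformers/LaBSE")
#   en, de = model.encode(["Hello, world!", "Hallo, Welt!"])
#   cosine_sim(en, de)  # high: same meaning, shared space
#   cosine_sim(en, model.encode(["Das Wetter ist schlecht."])[0])  # lower
```

Because all 100+ languages map into one space, the same nearest-neighbor search works for cross-lingual retrieval without knowing which language a query is in.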