Offline bilingual word vectors, orthogonal transformations and the inverted softmax

02/13/2017
by Samuel L. Smith, et al.

Usually bilingual word vectors are trained "online". Mikolov et al. showed they can also be found "offline", whereby two pre-trained embeddings are aligned with a linear transformation, using dictionaries compiled from expert knowledge. In this work, we prove that the linear transformation between two spaces should be orthogonal. This transformation can be obtained using the singular value decomposition. We introduce a novel "inverted softmax" for identifying translation pairs, with which we improve the precision @1 of Mikolov's original mapping from 34% to 43%, when translating a test set composed of both common and rare English words into Italian. Orthogonal transformations are more robust to noise, enabling us to learn the transformation without expert bilingual signal by constructing a "pseudo-dictionary" from the identical character strings which appear in both languages, achieving 40% precision on the same test set. Finally, we extend our method to retrieve the true translations of English sentences from a corpus of 200k Italian sentences with a precision @1 of 68%.
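The orthogonal transformation mentioned above is the solution to the orthogonal Procrustes problem, which the singular value decomposition yields in closed form. A minimal NumPy sketch, with toy data in place of real word vectors (the matrices `X`, `Y` and the hidden rotation `R_true` are illustrative, not from the paper):

```python
import numpy as np

# Given column matrices X (source-language vectors) and Y (target-language
# vectors) for dictionary pairs, the orthogonal W minimizing ||W X - Y||_F
# is W = U V^T, where U S V^T is the SVD of Y X^T.
rng = np.random.default_rng(0)

d, n = 5, 20                      # embedding dimension, dictionary size
X = rng.standard_normal((d, n))   # toy source word vectors (columns)
R_true = np.linalg.qr(rng.standard_normal((d, d)))[0]  # hidden orthogonal map
Y = R_true @ X                    # toy target vectors = mapped source vectors

U, _, Vt = np.linalg.svd(Y @ X.T)
W = U @ Vt                        # recovered orthogonal map, W X ≈ Y

assert np.allclose(W @ X, Y, atol=1e-8)
assert np.allclose(W.T @ W, np.eye(d), atol=1e-8)  # W is orthogonal
```

In this noiseless toy setting the SVD recovers the hidden map exactly; with a noisy dictionary it returns the closest orthogonal transformation in the least-squares sense, which is the robustness property the abstract exploits.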
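The "inverted softmax" can be sketched as follows. Assuming a similarity matrix `S[i, j]` between mapped source vectors and target vectors, a standard softmax normalizes each row over candidate targets j; the inverted variant instead normalizes over source words i, penalizing "hub" targets that are close to many sources. The function name and the inverse-temperature parameter `beta` here are illustrative choices, not identifiers from the paper:

```python
import numpy as np

def inverted_softmax(S, beta=10.0):
    """Return P[i, j], a score that target word j translates source word i,
    normalized over the source words i (each column of P sums to 1)."""
    E = np.exp(beta * S)
    return E / E.sum(axis=0, keepdims=True)

# Toy similarity matrix: 3 source words, 2 target words.
S = np.array([[2.0, 0.1],
              [0.1, 2.0],
              [1.9, 1.8]])
P = inverted_softmax(S, beta=1.0)
translations = P.argmax(axis=1)   # predicted target index for each source word
```

The predicted translation of source word i is then the target j maximizing `P[i, j]`, exactly as with a forward softmax; only the normalization direction changes.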
