Linguistic regularities in sparse and explicit word representations

O Levy, Y Goldberg - Proceedings of the eighteenth conference on …, 2014 - aclanthology.org
Abstract
Recent work has shown that neural-embedded word representations capture many relational similarities, which can be recovered by means of vector arithmetic in the embedded space. We show that Mikolov et al.'s method of first adding and subtracting word vectors, and then searching for a word similar to the result, is equivalent to searching for a word that maximizes a linear combination of three pairwise word similarities. Based on this observation, we suggest an improved method of recovering relational similarities, improving the state-of-the-art results on two recent word-analogy datasets. Moreover, we demonstrate that analogy recovery is not restricted to neural word embeddings, and that a similar amount of relational similarities can be recovered from traditional distributional word representations.
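The equivalence stated in the abstract can be sketched numerically. For unit-normalized vectors, cos(x, b − a + a*) equals x · (b − a + a*) divided by a norm that does not depend on x, so the additive objective ranks candidates exactly as the linear combination cos(x, b) − cos(x, a) + cos(x, a*) does. The toy random embeddings and vocabulary below are illustrative assumptions, not the paper's trained vectors or datasets:

```python
import numpy as np

rng = np.random.default_rng(0)

def unit(v):
    # Normalize so that a dot product equals cosine similarity.
    return v / np.linalg.norm(v)

# Hypothetical toy vocabulary of unit vectors and an analogy query a : a* :: b : ?
vocab = {w: unit(rng.normal(size=50))
         for w in ["king", "queen", "man", "woman", "apple", "river"]}
a, a_star, b = vocab["man"], vocab["woman"], vocab["king"]

def additive_score(x):
    # Mikolov et al.'s objective: cosine with the offset vector b - a + a*.
    return unit(b - a + a_star) @ x

def combination_score(x):
    # Equivalent linear combination of three pairwise similarities.
    return (b @ x) - (a @ x) + (a_star @ x)

candidates = [w for w in vocab if w not in ("man", "woman", "king")]
rank_add = sorted(candidates, key=lambda w: additive_score(vocab[w]), reverse=True)
rank_comb = sorted(candidates, key=lambda w: combination_score(vocab[w]), reverse=True)

# The two objectives differ only by a positive scale factor, so the
# candidate rankings coincide.
assert rank_add == rank_comb
```

The scores themselves differ by the constant factor ||b − a + a*||, which is why the argmax (and the full ranking) is unchanged; the paper's improved method replaces this additive combination with a different aggregation of the same three similarities.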