Evaluating the Utility of Document Embedding Vector Difference for Relation Learning
Recent work has demonstrated that vector offsets obtained by subtracting one pretrained word embedding from another can predict lexical relations with surprising accuracy. Inspired by this finding, we extend the idea to the document level: we generate document-level embeddings, compute the difference vector between each pair of documents, and train a linear classifier on these differences to predict the relation between the documents. On duplicate detection and dialogue act tagging tasks, we show that document-level difference vectors have utility in assessing document-level similarity, but perform less well in multi-relational classification.
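The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes averaged word vectors as a stand-in for a proper document embedding (e.g. doc2vec), uses random vectors in place of pretrained embeddings, and invents a tiny duplicate-detection toy dataset purely for shape-checking.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
DIM = 50

def embed(tokens, table, dim=DIM):
    """Stand-in document embedding: average of the word vectors
    (the paper's setup would use a learned document embedding)."""
    vecs = [table[t] for t in tokens if t in table]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

# Hypothetical "pretrained" word vectors over a toy vocabulary.
vocab = ["refund", "order", "ship", "track", "cancel", "hello", "thanks"]
table = {w: rng.normal(size=DIM) for w in vocab}

# Toy document pairs labelled duplicate (1) / non-duplicate (0).
pairs = [
    (["refund", "order"], ["refund", "order", "thanks"], 1),
    (["ship", "track"], ["track", "ship"], 1),
    (["cancel", "order"], ["hello", "thanks"], 0),
    (["refund"], ["ship", "track"], 0),
]

# Feature for each pair: the element-wise difference of the two
# document embeddings, fed to a linear classifier.
X = np.stack([embed(a, table) - embed(b, table) for a, b, _ in pairs])
y = np.array([label for _, _, label in pairs])

clf = LogisticRegression().fit(X, y)
print(clf.predict(X))
```

Note that the raw difference is not symmetric in the two documents; for symmetric relations such as duplication, the absolute difference |e(a) - e(b)| is a common alternative feature.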