Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations
With the emerging research effort to integrate structured and unstructured knowledge, many approaches incorporate factual knowledge into pre-trained language models (PLMs) and apply the knowledge-enhanced PLMs to downstream NLP tasks. However, (1) these approaches consider only static factual knowledge, whereas knowledge graphs (KGs) also contain temporal facts, i.e., events indicating how relationships among entities evolve across timestamps; and (2) PLMs cannot be directly applied to many KG tasks, such as temporal KG completion. In this paper, we focus on enhancing temporal knowledge embeddings with contextualized language representations (ECOLA). We align structured knowledge contained in temporal knowledge graphs with textual descriptions extracted from news articles, and we propose a novel knowledge-text prediction task to inject the rich information in these descriptions into temporal knowledge embeddings. ECOLA jointly optimizes the knowledge-text prediction objective and the temporal knowledge embedding objective, thereby taking full advantage of both textual and structured knowledge. For training ECOLA, we introduce three temporal KG datasets with aligned textual descriptions. Experimental results on the temporal knowledge graph completion task show that ECOLA outperforms state-of-the-art temporal KG models by a large margin. The proposed datasets can serve as new temporal KG benchmarks and facilitate future research on structured and unstructured knowledge integration.
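To make the joint objective concrete, the sketch below combines a temporal KG embedding loss with a masked-token knowledge-text prediction (KTP) loss. This is a minimal illustration under stated assumptions, not the paper's exact architecture: the TransE-style temporal scoring function, the tiny Transformer standing in for a full PLM, the module names, and the weighting factor `lam` are all illustrative choices.

```python
# Minimal sketch of ECOLA-style joint training. Assumptions (not from the
# paper): a TransE-style temporal score, a small Transformer as the text
# encoder, and an additive loss weighted by `lam`.
import torch
import torch.nn as nn

class JointModel(nn.Module):
    def __init__(self, n_entities, n_relations, n_timestamps, vocab_size, dim=128):
        super().__init__()
        # Temporal KG embeddings: entities, relations, and timestamps.
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        self.time = nn.Embedding(n_timestamps, dim)
        # A tiny text encoder standing in for a PLM, plus a masked-token
        # head for the knowledge-text prediction (KTP) objective.
        self.tok = nn.Embedding(vocab_size, dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.mlm_head = nn.Linear(dim, vocab_size)

    def kg_score(self, s, r, o, t):
        # TransE-style temporal score: smaller distance = more plausible
        # quadruple (subject, relation, object, timestamp).
        return (self.ent(s) + self.rel(r) + self.time(t) - self.ent(o)).norm(p=1, dim=-1)

    def ktp_logits(self, token_ids):
        # Encode the aligned textual description and predict masked tokens.
        return self.mlm_head(self.encoder(self.tok(token_ids)))

def joint_loss(model, quad, neg_obj, token_ids, masked_labels, lam=1.0):
    s, r, o, t = quad
    # Margin loss on the true quadruple against a corrupted object.
    kg = torch.relu(
        1.0 + model.kg_score(s, r, o, t) - model.kg_score(s, r, neg_obj, t)
    ).mean()
    # Cross-entropy over masked positions (label -100 marks unmasked tokens).
    logits = model.ktp_logits(token_ids)
    ktp = nn.functional.cross_entropy(
        logits.view(-1, logits.size(-1)), masked_labels.view(-1), ignore_index=-100
    )
    # Joint objective: temporal KG loss plus weighted KTP loss.
    return kg + lam * ktp
```

Because both losses are backpropagated through the shared entity, relation, and timestamp embeddings, gradient signal from the aligned text flows into the temporal knowledge embeddings, which is the intuition behind the knowledge-text prediction task.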