
Infusing Finetuning with Semantic Dependencies

12/10/2020
by Zhaofeng Wu, et al.

For natural language processing systems, two kinds of evidence support the use of text representations from neural language models "pretrained" on large unannotated corpora: performance on application-inspired benchmarks (Peters et al., 2018, inter alia), and the emergence of syntactic abstractions in those representations (Tenney et al., 2019, inter alia). On the other hand, the lack of grounded supervision calls into question how well these representations can ever capture meaning (Bender and Koller, 2020). We apply novel probes to recent language models – specifically focusing on predicate-argument structure as operationalized by semantic dependencies (Ivanova et al., 2012) – and find that, unlike syntax, semantics is not brought to the surface by today's pretrained models. We then use convolutional graph encoders to explicitly incorporate semantic parses into task-specific finetuning, yielding benefits to natural language understanding (NLU) tasks in the GLUE benchmark. This approach demonstrates the potential for general-purpose (rather than task-specific) linguistic supervision, above and beyond conventional pretraining and finetuning. Several diagnostics help to localize the benefits of our approach.
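The abstract's central mechanism, running a graph encoder over a labeled semantic-dependency parse on top of a pretrained model's token representations during task-specific finetuning, can be sketched concretely. The following is a minimal illustration under stated assumptions, not the authors' implementation: the class name `RelGraphConv`, the `(predicate, argument, relation)` edge triples, and all dimensions are hypothetical stand-ins for whatever the paper's "convolutional graph encoders" actually use.

```python
import torch
import torch.nn as nn

class RelGraphConv(nn.Module):
    """One relation-aware graph-convolution layer over token vectors.
    A sketch only; the paper's exact encoder may differ."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        # One transform per semantic-dependency label, plus a self-loop.
        self.rel_weights = nn.Parameter(0.02 * torch.randn(num_relations, dim, dim))
        self.self_loop = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, edges):
        # h: (num_tokens, dim) contextual vectors from the pretrained LM.
        # edges: iterable of (pred, arg, rel) index triples, one per
        # labeled semantic-dependency edge in the parse.
        messages = torch.zeros_like(h)
        for pred, arg, rel in edges:
            # Each argument token sends a label-specific message
            # to its predicate along the dependency edge.
            messages[pred] += h[arg] @ self.rel_weights[rel]
        return torch.relu(self.self_loop(h) + messages)

# Toy usage: random vectors stand in for pretrained-LM output
# (in practice, e.g., the final layer of a pretrained encoder).
# The fused vectors would then feed the task classification head.
h = torch.randn(5, 64)          # 5 tokens, hidden size 64
edges = [(1, 0, 3), (1, 4, 7)]  # hypothetical parse edges
layer = RelGraphConv(dim=64, num_relations=16)
fused = layer(h, edges)
print(fused.shape)              # torch.Size([5, 64])
```

The design point this sketch illustrates is that the graph encoder sits on top of, rather than in place of, the pretrained representations: the semantic supervision composes with conventional pretraining and finetuning instead of replacing either.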

