Enhancing Language Representation with Constructional Information for Natural Language Understanding

06/05/2023
by   Lvxiaowei Xu, et al.

Natural language understanding (NLU) is an essential branch of natural language processing that relies on representations generated by pre-trained language models (PLMs). However, PLMs primarily capture lexico-semantic information and may fail to adequately handle the meaning of constructions. To address this issue, we introduce construction grammar (CxG), which highlights pairings of form and meaning, to enrich language representation. We adopt usage-based construction grammar as the basis of our work, since it is highly compatible with statistical models such as PLMs. We then propose the HyCxG framework, which enhances language representation through a three-stage solution. First, all constructions are extracted from sentences via a slot-constraints approach. Second, because constructions can overlap with one another, introducing redundancy and imbalance, we formulate a conditional max coverage problem to select discriminative constructions. Finally, we propose a relational hypergraph attention network that acquires representation from constructional information by capturing high-order word interactions among constructions. Extensive experiments demonstrate the superiority of the proposed model on a variety of NLU tasks.
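The construction-selection step described above can be illustrated with a standard greedy approximation for max coverage. This is a hypothetical sketch, not the authors' actual algorithm: it assumes each candidate construction is represented by the set of token positions it spans, and repeatedly picks the construction that covers the most not-yet-covered positions, which naturally discourages redundant, overlapping selections.

```python
def select_constructions(constructions, budget):
    """Greedy sketch of a max-coverage-style selection step.

    constructions: list of (name, span) pairs, where span is a set of
                   token positions the construction covers (assumed format).
    budget:        maximum number of constructions to keep.
    Returns the names of the selected constructions.
    """
    selected = []
    covered = set()
    candidates = list(constructions)
    for _ in range(budget):
        # Pick the candidate adding the most newly covered positions.
        best = max(candidates, key=lambda c: len(c[1] - covered), default=None)
        if best is None or not (best[1] - covered):
            break  # every remaining candidate is fully redundant
        selected.append(best[0])
        covered |= best[1]
        candidates.remove(best)
    return selected

# Overlapping candidates: "D" is subsumed by "A", "B" overlaps "A" at position 2.
cands = [("A", {0, 1, 2}), ("B", {2, 3}), ("C", {4, 5, 6}), ("D", {0, 1})]
print(select_constructions(cands, 2))  # → ['A', 'C']
```

The greedy heuristic gives the classic (1 - 1/e) approximation guarantee for max coverage; the paper's conditional variant would additionally constrain which combinations of overlapping constructions are admissible.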


Related research

- Construction Grammar Provides Unique Insight into Neural Language Models (02/04/2023)
- Have Attention Heads in BERT Learned Constituency Grammar? (02/16/2021)
- Neural reality of argument structure constructions (02/24/2022)
- Sparse associative memory based on contextual code learning for disambiguating word senses (11/14/2019)
- Dual Inference for Improving Language Understanding and Generation (10/08/2020)
- CxGBERT: BERT meets Construction Grammar (11/09/2020)
- Exposure and Emergence in Usage-Based Grammar: Computational Experiments in 35 Languages (11/25/2022)
