
Syntax Representation in Word Embeddings and Neural Networks – A Survey

10/02/2020
by Tomasz Limisiewicz, et al.

Neural networks trained on natural language processing tasks capture syntax even though it is not provided as a supervision signal. This indicates that syntactic analysis is essential to the understanding of language in artificial intelligence systems. This overview paper covers approaches to evaluating the amount of syntactic information included in the representations of words for different neural network architectures. We mainly summarize research on English monolingual data for language modeling tasks, and on multilingual data for neural machine translation systems and multilingual language models. We describe which pre-trained models and representations of language are best suited for transfer to syntactic tasks.
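The evaluation approaches this survey covers are commonly instantiated as probing classifiers: a small supervised model is trained on frozen word representations, and its accuracy on a syntactic task is read as a measure of how much syntactic information those representations encode. Below is a minimal illustrative sketch of such a probe, not the survey's exact protocol, assuming scikit-learn is available; embeddings and pos_tags are hypothetical placeholders standing in for real frozen model vectors and gold part-of-speech labels.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data: random vectors at chance accuracy. In practice,
# substitute per-token representations extracted from a frozen model
# and their gold syntactic labels (e.g. UD part-of-speech tags).
rng = np.random.default_rng(0)
n_tokens, dim, n_tags = 1000, 768, 17
embeddings = rng.normal(size=(n_tokens, dim))      # stand-in for frozen representations
pos_tags = rng.integers(0, n_tags, size=n_tokens)  # stand-in for gold POS labels

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, pos_tags, test_size=0.2, random_state=0
)

# A deliberately low-capacity linear probe: high accuracy suggests the
# syntactic information is linearly recoverable from the representations,
# rather than learned by the probe itself.
probe = LogisticRegression(max_iter=1000)
probe.fit(X_train, y_train)
print("probe accuracy:", accuracy_score(y_test, probe.predict(X_test)))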

Related Research
03/01/2021

Vyākarana: A Colorless Green Benchmark for Syntactic Evaluation in Indic Languages

While there has been significant progress towards developing NLU dataset...
10/23/2022

DALL-E 2 Fails to Reliably Capture Common Syntactic Processes

Machine intelligence is increasingly being linked to claims about sentie...
10/02/2014

Not All Neural Embeddings are Born Equal

Neural language models learn word representations that capture rich ling...
05/11/2018

Deep RNNs Encode Soft Hierarchical Syntax

We present a set of experiments to demonstrate that deep recurrent neura...
04/19/2022

Multilingual Syntax-aware Language Modeling through Dependency Tree Conversion

Incorporating stronger syntactic biases into neural language models (LMs...
11/11/2020

Multilingual Irony Detection with Dependency Syntax and Neural Models

This paper presents an in-depth investigation of the effectiveness of de...
10/31/2022

Emergent Linguistic Structures in Neural Networks are Fragile

Large language models (LLMs) have been reported to have strong performan...