Does Chinese BERT Encode Word Structure?

10/15/2020
by Yile Wang et al.

Contextualized representations give significantly improved results for a wide range of NLP tasks. Much work has been dedicated to analyzing the features captured by representative models such as BERT. Existing work finds that syntactic, semantic, and word sense knowledge is encoded in BERT. However, little work has investigated word features for character-based languages such as Chinese. We investigate Chinese BERT using both attention weight distribution statistics and probing tasks, finding that (1) word information is captured by BERT; (2) word-level features are mostly in the middle representation layers; (3) downstream tasks make different use of word features in BERT, with POS tagging and chunking relying the most on word features, and natural language inference relying the least on such features.
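The attention-weight-statistics idea can be illustrated with a minimal sketch. The function below is a hypothetical helper (not the paper's code): given one head's attention matrix over the characters of a Chinese sentence and a gold word segmentation, it measures how much of each character's attention mass stays inside its own word. A high intra-word share would suggest the head tracks word structure.

```python
import numpy as np

def intra_word_attention_share(attn, word_spans):
    """Average fraction of each token's attention mass that stays
    within its own word.

    attn:       (seq_len, seq_len) attention matrix; each row sums to 1.
    word_spans: list of (start, end) token index ranges (end exclusive),
                one span per word, covering all tokens.
    """
    shares = []
    for start, end in word_spans:
        for i in range(start, end):
            # Attention from token i to tokens in the same word.
            shares.append(attn[i, start:end].sum())
    return float(np.mean(shares))

# Toy example: 4 characters forming two 2-character words.
attn = np.array([
    [0.4, 0.4, 0.1, 0.1],
    [0.5, 0.3, 0.1, 0.1],
    [0.1, 0.1, 0.4, 0.4],
    [0.2, 0.2, 0.3, 0.3],
])
spans = [(0, 2), (2, 4)]
print(intra_word_attention_share(attn, spans))  # → 0.75
```

In practice one would extract `attn` per layer and head from a Chinese BERT (e.g. via a forward pass requesting attention outputs) and compare this statistic across layers; the toy matrix here only demonstrates the computation.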


