MVP-BERT: Redesigning Vocabularies for Chinese BERT and Multi-Vocab Pretraining

11/17/2020
by Wei Zhu, et al.

Although the development of pre-trained language models (PLMs) has significantly raised the performance of various Chinese natural language processing (NLP) tasks, the vocabulary of these Chinese PLMs remains the one provided by Google's Chinese BERT <cit.>, which is based on Chinese characters. In addition, the masked language model (MLM) pretraining operates over a single vocabulary, which limits downstream task performance. In this work, we first propose a novel method, seg_tok, to form the vocabulary of Chinese BERT with the help of Chinese word segmentation (CWS) and subword tokenization. Then we propose three versions of multi-vocabulary pretraining (MVP) to improve the model's expressiveness. Experiments show that: (a) compared with a character-based vocabulary, seg_tok not only improves the performance of Chinese PLMs on sentence-level tasks, but also improves efficiency; (b) MVP improves PLMs' downstream performance; in particular, it improves seg_tok's performance on sequence labeling tasks.
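The abstract describes seg_tok only at a high level: segment the text into words with a CWS tool, then split each word into subwords so that subword units never cross word boundaries. Below is a minimal sketch of that idea, not the paper's implementation; the choice of jieba for CWS, the HuggingFace tokenizers WordPiece model, and the helper name seg_tok are all assumptions made here for illustration.

```python
# A minimal sketch of the seg_tok idea: Chinese word segmentation
# first, then subword tokenization within each word. jieba and
# BertWordPieceTokenizer are assumed stand-ins, not the paper's
# exact pipeline.
import jieba  # common CWS library, used here only for illustration
from tokenizers import BertWordPieceTokenizer


def seg_tok(text: str, subword_tok: BertWordPieceTokenizer) -> list[str]:
    """Segment `text` into words, then split each word into subwords."""
    tokens: list[str] = []
    for word in jieba.cut(text):
        # Encode each word independently so subword units never cross
        # word boundaries; skip the [CLS]/[SEP] special tokens.
        encoding = subword_tok.encode(word, add_special_tokens=False)
        tokens.extend(encoding.tokens)
    return tokens


# Example usage (assumes a hypothetical vocab file trained on
# word-segmented Chinese text):
# tok = BertWordPieceTokenizer("vocab.txt")
# print(seg_tok("预训练语言模型显著提升了中文自然语言处理任务", tok))
```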


