Probing the phonetic and phonological knowledge of tones in Mandarin TTS models

12/23/2019
by Jian Zhu, et al.

This study probes the phonetic and phonological knowledge of lexical tones in TTS models through two experiments. Controlled stimuli for testing tonal coarticulation and tone sandhi in Mandarin were fed into Tacotron 2 and WaveGlow to generate speech samples, which were subjected to acoustic analysis and human evaluation. Results show that both baseline Tacotron 2 and Tacotron 2 with BERT embeddings capture surface tonal coarticulation patterns well but fail to consistently apply the Tone-3 sandhi rule to novel sentences. Incorporating pre-trained BERT embeddings into Tacotron 2 improves naturalness and prosody and yields better generalization of the Tone-3 sandhi rule to novel complex sentences, although overall accuracy for Tone-3 sandhi remains low. Given that TTS models do capture some linguistic phenomena, it is argued that they can be used to generate and validate certain linguistic hypotheses. It is further suggested that linguistically informed stimuli should be included in the training and evaluation of TTS models.
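The synthesis-and-analysis pipeline summarized above can be illustrated with a minimal Python sketch. It is not the authors' actual setup: it assumes NVIDIA's public torch.hub Tacotron 2 and WaveGlow checkpoints (which are English-trained; the study used Mandarin-trained models), a hypothetical placeholder input string, and librosa's pYIN pitch tracker standing in for whatever acoustic analysis the paper performed.

import torch
import numpy as np
import librosa

# Minimal sketch of the pipeline probed in the paper: an acoustic model
# (Tacotron 2) maps text to a mel spectrogram, a vocoder (WaveGlow) maps the
# spectrogram to a waveform, and the waveform is then analysed acoustically.
# NOTE: these torch.hub checkpoints are NVIDIA's public English models;
# reproducing the tonal experiments would require a Mandarin-trained Tacotron 2
# with an appropriate text front end. A CUDA GPU is assumed.
tacotron2 = torch.hub.load("NVIDIA/DeepLearningExamples:torchhub", "nvidia_tacotron2")
waveglow = torch.hub.load("NVIDIA/DeepLearningExamples:torchhub", "nvidia_waveglow")
utils = torch.hub.load("NVIDIA/DeepLearningExamples:torchhub", "nvidia_tts_utils")

tacotron2 = tacotron2.to("cuda").eval()
waveglow = waveglow.remove_weightnorm(waveglow).to("cuda").eval()

# A controlled stimulus; for the coarticulation/sandhi experiments this would be
# a Mandarin sentence in the representation the model was trained on.
text = "ma ma ma ma"  # hypothetical placeholder input
sequences, lengths = utils.prepare_input_sequence([text])

with torch.no_grad():
    mel, _, _ = tacotron2.infer(sequences, lengths)  # text -> mel spectrogram
    audio = waveglow.infer(mel)                      # mel -> waveform at 22.05 kHz
wav = audio[0].float().cpu().numpy()

# Acoustic analysis: track the F0 contour, the main acoustic correlate of
# lexical tone, using librosa's pYIN pitch estimator.
f0, voiced, _ = librosa.pyin(
    wav, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=22050
)
print("Mean F0 over voiced frames (Hz):", np.nanmean(f0[voiced]))

In the study itself, such F0 contours over the controlled stimuli (rather than a single mean value) are what allow the tonal coarticulation and Tone-3 sandhi patterns to be assessed.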
