We present SPECTRON, a novel approach to adapting pre-trained language models...
End-to-end (E2E) spoken language understanding (SLU) is constrained by t...
Large pretrained language models (PLMs) are often domain- or task-adapted...
Non-autoregressive models greatly improve decoding speed over typical sequence-to-sequence...
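As a rough illustration of where the speedup comes from, the PyTorch sketch below contrasts autoregressive decoding, which runs one forward pass per emitted token, with non-autoregressive decoding, which fills every position in a single pass. `ToyDecoder` is a hypothetical stand-in, not this paper's model.

```python
# Minimal sketch (not the paper's model) contrasting autoregressive and
# non-autoregressive decoding. ToyDecoder is a hypothetical stand-in.
import torch
import torch.nn as nn

VOCAB, HIDDEN, MAX_LEN = 100, 32, 10

class ToyDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.proj = nn.Linear(HIDDEN, VOCAB)

    def forward(self, tokens):  # (batch, len) -> (batch, len, vocab)
        return self.proj(self.embed(tokens))

model = ToyDecoder()

def autoregressive_decode(prefix):
    # One forward pass per emitted token: O(length) sequential steps.
    tokens = prefix
    for _ in range(MAX_LEN - tokens.size(1)):
        logits = model(tokens)[:, -1]  # predict only the next token
        tokens = torch.cat([tokens, logits.argmax(-1, keepdim=True)], dim=1)
    return tokens

def non_autoregressive_decode(placeholders):
    # A single forward pass predicts all positions in parallel.
    return model(placeholders).argmax(-1)

prefix = torch.zeros(1, 1, dtype=torch.long)
placeholders = torch.zeros(1, MAX_LEN, dtype=torch.long)  # e.g. all-mask input
print(autoregressive_decode(prefix).shape)        # torch.Size([1, 10])
print(non_autoregressive_decode(placeholders).shape)  # torch.Size([1, 10])
```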
We describe an unsupervised method to create pseudo-parallel corpora for...
Pre-trained multilingual contextual embeddings have demonstrated state-of-the-art...
We discuss the problem of echographic transcription in autoregressive sequence-to-sequence...
We propose a novel approach to semi-supervised automatic speech recognit...
We rerank with scores from pretrained masked language models like BERT t...
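A common way to realize masked-LM reranking is pseudo-log-likelihood scoring: mask each token in turn and sum the model's log-probabilities for the held-out tokens. The sketch below uses the Hugging Face transformers API with bert-base-uncased as an assumed model; it illustrates the general technique, not necessarily this paper's exact implementation.

```python
# Hedged sketch of pseudo-log-likelihood (PLL) scoring with a masked LM,
# usable for reranking n-best hypotheses.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
mlm = AutoModelForMaskedLM.from_pretrained("bert-base-uncased").eval()

@torch.no_grad()
def pseudo_log_likelihood(sentence: str) -> float:
    ids = tok(sentence, return_tensors="pt").input_ids[0]
    total = 0.0
    # Mask each non-special token in turn; accumulate its log-probability.
    for i in range(1, ids.size(0) - 1):
        masked = ids.clone()
        masked[i] = tok.mask_token_id
        logits = mlm(masked.unsqueeze(0)).logits[0, i]
        total += torch.log_softmax(logits, dim=-1)[ids[i]].item()
    return total

# Rerank: keep the hypothesis the MLM finds most probable.
hyps = ["the cat sat on the mat", "the cat sat on the matte"]
print(max(hyps, key=pseudo_log_likelihood))
```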
We evaluate three simple, normalization-centric changes to improve Transformer...
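The truncated text does not spell out the three changes, but two normalization-centric Transformer variants commonly studied in this line of work are pre-norm residual connections and a scaled l2 normalization (ScaleNorm). The PyTorch sketch below illustrates both as assumptions, with the standard post-norm block for contrast.

```python
# Illustrative sketch of two normalization-centric Transformer variants:
# pre-norm residual connections and ScaleNorm (g * x / ||x||).
# The truncated abstract does not name its changes; these are assumptions.
import torch
import torch.nn as nn

class ScaleNorm(nn.Module):
    """Normalize features to a single learned scale g."""
    def __init__(self, dim: int, eps: float = 1e-5):
        super().__init__()
        self.g = nn.Parameter(torch.tensor(dim ** 0.5))
        self.eps = eps

    def forward(self, x):
        return self.g * x / x.norm(dim=-1, keepdim=True).clamp(min=self.eps)

def post_norm_block(x, sublayer, norm):
    # Original Transformer: normalize after adding the residual.
    return norm(x + sublayer(x))

def pre_norm_block(x, sublayer, norm):
    # Pre-norm: normalize the sublayer input; the residual path stays
    # unnormalized, which tends to stabilize training without warmup.
    return x + sublayer(norm(x))

dim = 64
x = torch.randn(2, 5, dim)
ffn = nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
print(pre_norm_block(x, ffn, ScaleNorm(dim)).shape)      # torch.Size([2, 5, 64])
print(post_norm_block(x, ffn, nn.LayerNorm(dim)).shape)  # torch.Size([2, 5, 64])
```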
Pretrained contextual word representations in NLP have greatly improved...
Self-attention has demonstrated great success in sequence-to-sequence ta...
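For reference, the core operation behind these sequence-to-sequence successes is scaled dot-product self-attention, where every position attends over all positions of the same sequence. A minimal single-head sketch, with hypothetical projection matrices wq, wk, wv passed in as plain tensors:

```python
# Minimal scaled dot-product self-attention (single head).
import math
import torch

def self_attention(x: torch.Tensor, wq, wk, wv) -> torch.Tensor:
    """x: (batch, length, dim); wq/wk/wv: (dim, dim) projections."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # (b, len, len)
    weights = scores.softmax(dim=-1)  # each position attends over all others
    return weights @ v

dim = 16
x = torch.randn(2, 7, dim)
out = self_attention(x, *(torch.randn(dim, dim) for _ in range(3)))
print(out.shape)  # torch.Size([2, 7, 16])
```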