A text autoencoder from transformer for fast encoding language representation

11/04/2021
by Tan Huang, et al.

In recent years, BERT has shown clear advantages and great potential in natural language processing tasks. However, both training and applying BERT require intensive time and computing resources to obtain contextual language representations, which hinders its universality and applicability. To overcome this bottleneck, we propose a deep bidirectional language model that uses a window masking mechanism at the attention layer. The model computes contextual language representations without the random masking used in BERT, while maintaining BERT's deep bidirectional architecture. To compute the same sentence representation, our method has O(n) complexity, compared with the O(n^2) of other transformer-based models. To further demonstrate this advantage, we compute contextual language representations in a CPU-only environment; using the embeddings from the proposed method, logistic regression achieves much higher accuracy on SMS classification. Moreover, the proposed method also achieves significantly higher performance on semantic similarity tasks.
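The abstract does not spell out the exact form of the window masking, so the following is only a minimal NumPy sketch of the general idea it describes: restricting each token's attention to a fixed-size window around its position, so that per-token cost depends on the window size rather than the full sequence length. The function names (window_attention_mask, masked_attention) and the window size w are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def window_attention_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask where token i may attend only to tokens j with |i - j| <= window."""
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= window

def masked_attention(q: np.ndarray, k: np.ndarray, v: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Scaled dot-product attention with out-of-window positions suppressed before softmax."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                    # (n, n) attention logits
    scores = np.where(mask, scores, -np.inf)         # block positions outside the window
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax over allowed positions
    return weights @ v                               # (n, d) contextual representations

# Example: 8 tokens, hidden size 4, attention window of +/- 2 tokens
n, d, w = 8, 4, 2
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
out = masked_attention(q, k, v, window_attention_mask(n, w))
print(out.shape)  # (8, 4)
```

With a fixed window of size w, each row of the score matrix has at most 2w + 1 unmasked entries, which is what allows the cost to grow linearly with sequence length; the dense sketch above still materializes the full n-by-n matrix for clarity, whereas a linear-time implementation would compute only the banded entries.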


