Fast and Accurate FSA System Using ELBERT: An Efficient and Lightweight BERT

11/16/2022
by Siyuan Lu, et al.

As an application of Natural Language Processing (NLP) techniques, financial sentiment analysis (FSA) has become an invaluable tool for investors: both its speed and its accuracy can significantly affect the returns of trading strategies. With the development of deep learning and Transformer-based pre-trained models such as BERT, the accuracy of FSA has improved considerably, but these large models are computationally expensive and slow down processing. To boost the processing speed of the FSA system while preserving high accuracy, we first propose an efficient and lightweight BERT (ELBERT) along with a novel confidence-window-based (CWB) early exit mechanism. Building on ELBERT, we develop a method to accelerate text processing on GPUs, solving the difficult problem of making the early exit mechanism work effectively at large input batch sizes. We then build a fast, high-accuracy FSA system. Experimental results show that the proposed CWB early exit mechanism achieves significantly higher accuracy than existing early exit methods on BERT at the same computation cost. Moreover, with this acceleration method our FSA system can process over 1,000 texts per second with sufficient accuracy, nearly twice as fast as FastBERT. This system can therefore enable modern trading systems to process financial text data quickly and accurately.
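The abstract does not spell out the CWB mechanism or the batched GPU method, but the sketch below illustrates how the two ideas might fit together: each layer of an ALBERT-style shared stack feeds a lightweight classifier, a sample exits once its prediction stays confident for several consecutive layers (the "confidence window"), and exited rows are dropped from the live batch so deeper layers process fewer samples. Everything here is illustrative, not the paper's exact design: `ToyElbert`, `window`, and `conf_threshold` are assumed names, and the exit rule (max softmax probability above a threshold at `window` consecutive layers) is an assumption about what "confidence window" means.

```python
# Hedged sketch of confidence-window-based (CWB) early exit with batch
# shrinking, written against PyTorch. Not the paper's implementation.
import torch
import torch.nn as nn

class ToyElbert(nn.Module):
    def __init__(self, hidden=128, num_layers=12, num_classes=2):
        super().__init__()
        # ALBERT-style parameter sharing: one layer reused num_layers times.
        self.layer = nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True)
        self.num_layers = num_layers
        self.classifier = nn.Linear(hidden, num_classes)  # shared exit head

    def classify(self, h):
        # Pool the first ([CLS]-style) token, then produce class probabilities.
        return self.classifier(h[:, 0]).softmax(dim=-1)

@torch.no_grad()
def cwb_early_exit_batch(model, x, window=2, conf_threshold=0.9):
    """Batched inference where samples leave the batch early.

    A sample exits when its max class probability exceeds `conf_threshold`
    at `window` consecutive layers. Finished rows are removed from the
    live batch, so deeper layers run on progressively fewer samples.
    """
    n = x.size(0)
    preds = torch.full((n,), -1, dtype=torch.long, device=x.device)
    alive = torch.arange(n, device=x.device)   # indices into the original batch
    streak = torch.zeros(n, dtype=torch.long, device=x.device)
    h = x
    for depth in range(model.num_layers):
        h = model.layer(h)
        probs = model.classify(h)
        conf, labels = probs.max(dim=-1)
        confident = conf >= conf_threshold
        # Count consecutive confident layers; reset the streak otherwise.
        streak = torch.where(confident, streak + 1, torch.zeros_like(streak))
        done = streak >= window
        if depth == model.num_layers - 1:       # last layer: everyone exits
            done = torch.ones_like(done)
        if done.any():
            preds[alive[done]] = labels[done]   # record final predictions
            keep = ~done                        # shrink the live batch
            h, alive, streak = h[keep], alive[keep], streak[keep]
            if alive.numel() == 0:
                break
    return preds

# Usage with dummy embeddings of shape (batch, seq_len, hidden); an
# untrained model will mostly exit at the last layer, but the loop runs.
model = ToyElbert().eval()
batch = torch.randn(32, 16, 128)
print(cwb_early_exit_batch(model, batch))
```

Shrinking the live batch is what lets early exit pay off at large batch sizes on a GPU: without it, the whole batch must run to the depth of its slowest sample.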


