When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing

02/17/2022
by Chao-Han Huck Yang, et al.

The rapid development of quantum computing has demonstrated many unique characteristics of quantum advantages, such as richer feature representation and stronger protection of model parameters. This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification. In particular, our proposed hybrid classical-quantum model consists of a novel random quantum temporal convolution (QTC) learning framework that replaces some layers in the BERT-based decoder. Our experiments on intent classification show that the proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken language datasets. Notably, BERT-QTC boosts the performance of the existing quantum circuit-based language model on two text classification datasets by a 1.57% relative improvement. Furthermore, BERT-QTC can be deployed on both existing commercially accessible quantum computation hardware and a CPU-based interface, ensuring data isolation.

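The abstract does not include code, so the following is only a rough illustration of what a quantum temporal convolution block could look like as a hybrid classical-quantum layer, written with PennyLane and PyTorch. It is a minimal sketch, not the authors' implementation: the qubit count, circuit depth, classical down-projection, and the use of qml.RandomLayers to realize the "random" circuit structure are all assumptions.

```python
# A minimal sketch (not the authors' released code) of a quantum temporal
# convolution (QTC) block using PennyLane + PyTorch. A small variational
# circuit with randomly structured layers is applied at each time step of
# the BERT hidden-state sequence, like a stride-1 convolution kernel.
import torch
import torch.nn as nn
import pennylane as qml

N_QUBITS = 4   # qubits per circuit window -- an assumption, not from the paper
N_LAYERS = 2   # depth of the random variational circuit -- also an assumption

dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev, interface="torch")
def qtc_kernel(inputs, weights):
    # Encode the (down-projected) BERT features as single-qubit rotation angles.
    qml.AngleEmbedding(inputs, wires=range(N_QUBITS))
    # Randomly structured rotation/entangling layers stand in for the
    # "random quantum temporal convolution" filter; the seed fixes the layout.
    qml.RandomLayers(weights, wires=range(N_QUBITS), seed=42)
    # Measure Pauli-Z expectations as the classical output features.
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

class QTCLayer(nn.Module):
    """Slides the quantum kernel over the time axis of BERT outputs."""

    def __init__(self, hidden_size=768):
        super().__init__()
        # Classical down-projection from BERT width to the qubit count.
        self.proj = nn.Linear(hidden_size, N_QUBITS)
        weight_shapes = {"weights": (N_LAYERS, N_QUBITS)}
        self.qkernel = qml.qnn.TorchLayer(qtc_kernel, weight_shapes)

    def forward(self, hidden_states):              # (batch, seq_len, hidden)
        x = torch.tanh(self.proj(hidden_states))   # bound angles to (-1, 1)
        batch, seq_len, _ = x.shape
        # Apply the same circuit at every position, then restore the sequence.
        out = self.qkernel(x.reshape(batch * seq_len, N_QUBITS))
        return out.reshape(batch, seq_len, N_QUBITS)
```

In a pipeline like the one the abstract describes, a layer of this kind would sit in the BERT-based decoder, with its measured expectation values pooled and fed to a small classical head for intent labels; because the circuit runs on a simulator here, targeting real quantum hardware would amount to swapping the PennyLane device.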