CoMER: Modeling Coverage for Transformer-based Handwritten Mathematical Expression Recognition

07/10/2022
by Wenqi Zhao, et al.

The Transformer-based encoder-decoder architecture has recently made significant advances in recognizing handwritten mathematical expressions. However, the transformer model still suffers from the lack-of-coverage problem, making its expression recognition rate (ExpRate) inferior to its RNN counterpart. Coverage information, which records the alignment information of past steps, has proven effective in RNN models. In this paper, we propose CoMER, a model that adopts coverage information in the transformer decoder. Specifically, we propose a novel Attention Refinement Module (ARM) to refine the attention weights with past alignment information without hurting its parallelism. Furthermore, we take coverage information to the extreme by proposing self-coverage and cross-coverage, which utilize the past alignment information from the current and previous layers. Experiments show that CoMER improves the ExpRate by 0.61%/2.09%/0.59% compared to the current state-of-the-art model, and reaches 59.33%/59.81%/62.97% on the CROHME 2014/2016/2019 test sets.
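To make the coverage idea concrete, the sketch below shows one simplified way an attention-refinement step of this kind could look in a parallel transformer decoder: past cross-attention weights are accumulated over decoding steps with an exclusive cumulative sum to form a coverage term, which is mapped to a penalty and subtracted from the raw attention scores before the final softmax. This is a minimal sketch under stated assumptions, not the paper's ARM implementation; the module name, the linear refinement function, and the tensor shapes are illustrative.

```python
# Minimal coverage-refined cross-attention sketch (illustrative, not CoMER's exact ARM).
import torch
import torch.nn as nn


class CoverageRefinedAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        self.scale = d_model ** -0.5
        # Maps each position's coverage value to a scalar penalty (assumed form).
        self.refine = nn.Linear(1, 1)

    def forward(self, queries: torch.Tensor, keys: torch.Tensor, values: torch.Tensor):
        # queries: (B, T_dec, d); keys/values: (B, T_src, d)
        scores = torch.einsum("btd,bsd->bts", queries, keys) * self.scale  # (B, T_dec, T_src)
        attn = scores.softmax(dim=-1)

        # Coverage: attention paid to each source position in *previous* decoding
        # steps; the exclusive cumulative sum keeps the computation parallel.
        coverage = attn.cumsum(dim=1) - attn

        # Refinement term derived from coverage, subtracted from the raw scores so
        # source positions that were already attended to are down-weighted.
        penalty = self.refine(coverage.unsqueeze(-1)).squeeze(-1)
        refined_attn = (scores - penalty).softmax(dim=-1)

        return torch.einsum("bts,bsd->btd", refined_attn, values), refined_attn


if __name__ == "__main__":
    torch.manual_seed(0)
    attn_layer = CoverageRefinedAttention(d_model=32)
    q = torch.randn(2, 5, 32)   # 5 decoding steps
    kv = torch.randn(2, 7, 32)  # 7 source (image-feature) positions
    out, weights = attn_layer(q, kv, kv)
    print(out.shape, weights.shape)  # torch.Size([2, 5, 32]) torch.Size([2, 5, 7])
```

In this sketch, reusing the same layer's current attention weights to build the coverage term corresponds roughly to the self-coverage idea, while feeding in attention weights taken from a previous decoder layer would correspond to cross-coverage.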
