Manifold Mixup improves text recognition with CTC loss

03/11/2019
by Bastien Moysset, et al.

Modern handwritten text recognition techniques employ deep recurrent neural networks. These techniques are especially effective when a large amount of annotated data is available for parameter estimation. When data is scarce, data augmentation can be used to enhance system performance. Manifold Mixup is a modern data augmentation method that blends two images, or the feature maps corresponding to these images, and fuses the targets accordingly. We propose to apply Manifold Mixup to text recognition, adapting it to work with a Connectionist Temporal Classification (CTC) cost. We show that Manifold Mixup improves text recognition results on various languages and datasets.
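The CTC adaptation described in the abstract can be sketched as follows. This is a minimal PyTorch sketch under stated assumptions, not the authors' exact implementation: the model, tensor shapes, and hyperparameters are illustrative, and for simplicity the raw inputs are mixed (the Input Mixup special case) rather than hidden feature maps at a random layer, as full Manifold Mixup would do. Since CTC label sequences cannot be interpolated directly, a common adaptation is to mix the two CTC losses with the same coefficient used to mix the inputs:

```python
import torch
import torch.nn.functional as F

def mixup_ctc_loss(model, x1, x2, targets1, targets2,
                   target_lengths1, target_lengths2, alpha=0.2):
    # Draw the mixing coefficient from a Beta distribution, as in Mixup.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()

    # Interpolate the two input batches. (Manifold Mixup would instead
    # interpolate feature maps at a randomly chosen hidden layer.)
    x_mixed = lam * x1 + (1.0 - lam) * x2

    # One forward pass on the mixed batch; output is assumed to be
    # (T, N, C) log-probabilities, as expected by F.ctc_loss.
    log_probs = model(x_mixed)

    T, N, _ = log_probs.shape
    input_lengths = torch.full((N,), T, dtype=torch.long)

    # CTC targets cannot be interpolated, so mix the two losses instead.
    loss1 = F.ctc_loss(log_probs, targets1, input_lengths, target_lengths1)
    loss2 = F.ctc_loss(log_probs, targets2, input_lengths, target_lengths2)
    return lam * loss1 + (1.0 - lam) * loss2
```

In practice `x1`/`x2` are two halves of a shuffled batch, so the augmentation adds only one extra CTC evaluation per step.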

Related research

- 12/14/2021: Handwritten text generation and strikethrough characters augmentation. We introduce two data augmentation techniques, which, used with a Resnet...
- 07/27/2023: TextManiA: Enriching Visual Feature by Text-driven Manifold Augmentation. Recent label mix-based augmentation methods have shown their effectivene...
- 08/25/2022: Image augmentation improves few-shot classification performance in plant disease recognition. With the world population projected to near 10 billion by 2050, minimizi...
- 03/05/2023: A Study of Augmentation Methods for Handwritten Stenography Recognition. One of the factors limiting the performance of handwritten text recognit...
- 03/14/2020: Learn to Augment: Joint Data Augmentation and Network Optimization for Text Recognition. Handwritten text and scene text suffer from various shapes and distorted...
- 05/31/2023: MSMix: An Interpolation-Based Text Data Augmentation Method Manifold Swap Mixup. To solve the problem of poor performance of deep neural network models d...
- 05/14/2021: Out-of-Manifold Regularization in Contextual Embedding Space for Text Classification. Recent studies on neural networks with pre-trained weights (i.e., BERT)...
