Neural Machine Transliteration: Preliminary Results

09/14/2016
by Amir H. Jadidinejad, et al.

Machine transliteration is the process of automatically transforming the script of a word from a source language to a target language while preserving its pronunciation. Sequence-to-sequence learning has recently emerged as a new paradigm in supervised learning. In this paper, we propose a character-based encoder-decoder model consisting of two recurrent neural networks. The encoder is a bidirectional recurrent neural network that encodes a sequence of symbols into a fixed-length vector representation, and the decoder generates the target sequence using an attention-based recurrent neural network. The encoder, the decoder, and the attention mechanism are jointly trained to maximize the conditional probability of a target sequence given a source sequence. Our experiments on several datasets show that the proposed encoder-decoder model achieves significantly higher transliteration quality than traditional statistical models.
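The architecture described above — a bidirectional encoder producing per-character annotations, plus an attention mechanism that pools them into a context vector for the decoder — can be sketched in a few lines of numpy. This is a minimal illustration with toy dimensions and randomly initialized weights, not the authors' implementation; all function names and sizes here are assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(x, h, Wx, Wh, b):
    """One vanilla RNN step: h' = tanh(Wx x + Wh h + b)."""
    return np.tanh(Wx @ x + Wh @ h + b)

def encode(chars, Ex, params):
    """Bidirectional encoder: run the RNN in both directions and
    concatenate the forward and backward states per character."""
    Wx, Wh, b = params
    H = Wh.shape[0]
    fwd, bwd = [], []
    h = np.zeros(H)
    for c in chars:                      # left-to-right pass
        h = rnn_step(Ex[c], h, Wx, Wh, b)
        fwd.append(h)
    h = np.zeros(H)
    for c in reversed(chars):            # right-to-left pass
        h = rnn_step(Ex[c], h, Wx, Wh, b)
        bwd.append(h)
    bwd.reverse()
    return np.stack([np.concatenate([f, bk]) for f, bk in zip(fwd, bwd)])

def attention(query, annotations):
    """Dot-product attention (a simplification of the additive form):
    softmax-normalized scores weight the encoder annotations."""
    scores = annotations @ query
    w = np.exp(scores - scores.max())    # stable softmax
    w /= w.sum()
    return w @ annotations, w            # context vector, weights

# Toy setup: 5 characters in the vocabulary, embedding size 3, hidden size 4.
V, E, H = 5, 3, 4
Ex = rng.normal(size=(V, E))                     # character embeddings
enc_params = (rng.normal(size=(H, E)),           # Wx
              rng.normal(size=(H, H)),           # Wh
              np.zeros(H))                       # b

source = [0, 3, 1, 4]                            # a source "word" as char ids
ann = encode(source, Ex, enc_params)             # (len, 2H) annotations
query = rng.normal(size=2 * H)                   # stand-in decoder state
context, weights = attention(query, ann)
```

At each decoding step, the context vector would be fed (together with the previous decoder state and the last emitted character) into the decoder RNN; training would backpropagate the target-sequence log-likelihood through all three components jointly, as the abstract states.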


