Solving Historical Dictionary Codes with a Neural Language Model

10/09/2020
by   Christopher Chu, et al.

We solve difficult word-based substitution codes by constructing a decoding lattice and searching that lattice with a neural language model. We apply our method to a set of enciphered letters exchanged between US Army General James Wilkinson and agents of the Spanish Crown in the late 1700s and early 1800s, obtained from the US Library of Congress. We are able to decipher 75.1% of the cipher-word tokens correctly.
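The lattice-plus-language-model decoding described above can be sketched as a beam search: each cipher token expands to a few candidate plaintext words with channel scores, and hypotheses are ranked by combining those scores with language-model scores. The sketch below is illustrative only, assuming a toy hard-coded bigram table in place of the paper's neural language model; the lattice contents, probabilities, and function names are invented for the example.

```python
import math

# lattice[i] = list of (candidate_word, channel_log_prob) for cipher token i.
# These candidates and scores are made up for illustration.
lattice = [
    [("the", math.log(0.6)), ("a", math.log(0.4))],
    [("king", math.log(0.5)), ("ring", math.log(0.5))],
    [("of", math.log(0.7)), ("if", math.log(0.3))],
    [("spain", math.log(0.8)), ("stain", math.log(0.2))],
]

# Stand-in language model: log P(word | previous word), with a floor
# for unseen bigrams. A neural LM would supply these scores instead.
BIGRAMS = {
    ("<s>", "the"): math.log(0.5),
    ("the", "king"): math.log(0.4),
    ("king", "of"): math.log(0.6),
    ("of", "spain"): math.log(0.3),
}
FLOOR = math.log(1e-4)

def lm_score(prev, word):
    return BIGRAMS.get((prev, word), FLOOR)

def beam_search(lattice, beam_size=4):
    # Each hypothesis is (total_log_prob, words_so_far).
    beams = [(0.0, ["<s>"])]
    for candidates in lattice:
        expanded = []
        for score, words in beams:
            for word, channel in candidates:
                new_score = score + channel + lm_score(words[-1], word)
                expanded.append((new_score, words + [word]))
        # Keep only the highest-scoring hypotheses.
        beams = sorted(expanded, key=lambda h: h[0], reverse=True)[:beam_size]
    best_score, best_words = beams[0]
    return best_words[1:], best_score

words, score = beam_search(lattice)
print(words)  # → ['the', 'king', 'of', 'spain']
```

In this toy lattice, the language-model scores break the tie between "king" and "ring" and steer the search to the fluent path, which is the role the neural language model plays at much larger scale in the paper.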


