Sequence stacking using dual encoder Seq2Seq recurrent networks
A widely studied nondeterministic polynomial time (NP) hard problem lies in finding a route between two nodes of a graph. Often, metaheuristic algorithms such as A^* are employed on graphs with a large number of nodes. Here, we propose a deep recurrent neural network architecture based on the Sequence-to-Sequence (Seq2Seq) model, widely used, for instance, in text translation. In particular, we illustrate that utilising a context vector learned from two different recurrent networks improves accuracy in learning the shortest route of a graph. Additionally, we show that one can boost the performance of the Seq2Seq network by smoothing the loss function using a homotopy continuation of the decoder's loss function.
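The dual-encoder idea can be sketched as follows: two independent recurrent encoders process the same input sequence, and their final hidden states are stacked into a single context vector that conditions the decoder. The toy dimensions, random weights, and the plain tanh RNN cell below are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_encode(inputs, Wx, Wh):
    """Run a simple tanh RNN over a sequence; return the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for x in inputs:
        h = np.tanh(Wx @ x + Wh @ h)
    return h

# Hypothetical sizes: node embeddings of size 4, hidden size 8.
emb, hid = 4, 8
seq = rng.normal(size=(5, emb))        # an input path of 5 node embeddings

# Two independent encoders, as in the dual-encoder setup.
Wx1, Wh1 = rng.normal(size=(hid, emb)), rng.normal(size=(hid, hid))
Wx2, Wh2 = rng.normal(size=(hid, emb)), rng.normal(size=(hid, hid))

c1 = rnn_encode(seq, Wx1, Wh1)
c2 = rnn_encode(seq, Wx2, Wh2)

# Stack the two context vectors into one context for the decoder.
context = np.concatenate([c1, c2])     # shape (16,)

# One (untrained) decoder step conditioned on the stacked context:
# project the context to logits over the node vocabulary and pick greedily.
Wd = rng.normal(size=(emb, 2 * hid))
logits = Wd @ context
next_node = int(np.argmax(logits))
print(context.shape, next_node)
```

In a trained model the decoder would unroll step by step, feeding back its own predictions; the point here is only that the decoder sees information from both encoders through the stacked context.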
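The homotopy continuation of the loss can be illustrated on a toy objective: optimise a convex surrogate first, then continuously deform it into the true (nonconvex) loss while tracking the minimiser. The specific losses, learning rate, and annealing schedule below are assumptions for illustration, not the paper's training setup:

```python
import numpy as np

def target_loss(w):
    # A nonconvex toy loss standing in for the decoder's true loss (assumption).
    return np.sin(5 * w) + w ** 2

def smooth_loss(w):
    # A convex surrogate that is easy to optimise.
    return w ** 2

def homotopy_loss(w, lam):
    # Continuation: pure surrogate at lam=0, true loss at lam=1.
    return (1 - lam) * smooth_loss(w) + lam * target_loss(w)

def grad(f, w, eps=1e-5):
    # Central-difference gradient, to keep the sketch dependency-free.
    return (f(w + eps) - f(w - eps)) / (2 * eps)

w = 2.0
for lam in np.linspace(0.0, 1.0, 20):    # anneal the homotopy parameter
    for _ in range(50):                   # gradient steps at each lam
        w -= 0.05 * grad(lambda v: homotopy_loss(v, lam), w)
print(round(w, 3))
```

Descending directly on `target_loss` from the same start can stall in a poor local minimum; annealing `lam` lets the optimiser follow the smooth surrogate's minimiser into a good basin of the true loss.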