Optimizing Recurrent Neural Networks Architectures under Time Constraints

08/29/2016
by Junqi Jin, et al.

The architecture of a recurrent neural network (RNN) is a key factor influencing its performance. We propose algorithms to optimize hidden layer sizes under a running time constraint. We cast this discrete optimization as a subset selection problem and, through novel transformations, make the objective function submodular and the constraint supermodular. A greedy algorithm with approximation bounds is proposed to solve the transformed problem, and we show how the transformations influence those bounds. To speed up the optimization, we introduce surrogate functions that balance exploration and exploitation. Experiments show that our algorithms find more accurate or faster models than manually tuned state-of-the-art architectures and random search. We also compare popular RNN architectures using our algorithms.
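To illustrate the kind of procedure the abstract describes, the sketch below shows a generic cost-benefit greedy for maximizing a monotone submodular objective under an additive time budget. The `gain`, `cost`, and `budget` arguments are hypothetical stand-ins; the paper's specific transformations, bounds, and surrogate functions are not reproduced here.

```python
# Minimal sketch: greedy subset selection under a running-time budget.
# `gain(S, e)` is assumed to be the marginal gain of a (transformed)
# submodular objective when adding element e to the selected set S;
# `cost(e)` is an additive per-element time cost.

def greedy_under_budget(candidates, gain, cost, budget):
    """Repeatedly add the affordable element with the best
    marginal-gain-to-cost ratio until the budget is exhausted."""
    selected = []
    remaining = set(candidates)
    spent = 0.0
    while remaining:
        best, best_ratio = None, float("-inf")
        for e in remaining:
            if spent + cost(e) > budget:
                continue  # adding e would exceed the time budget
            ratio = gain(selected, e) / cost(e)
            if ratio > best_ratio:
                best, best_ratio = e, ratio
        if best is None or best_ratio <= 0:
            break  # nothing affordable still improves the objective
        selected.append(best)
        spent += cost(best)
        remaining.remove(best)
    return selected, spent


# Toy usage: pick hidden-unit "blocks" whose gain exhibits the
# diminishing returns characteristic of a submodular objective.
blocks = range(10)
sel, t = greedy_under_budget(
    blocks,
    gain=lambda S, e: 1.0 / (len(S) + 1),  # toy diminishing-returns gain
    cost=lambda e: 0.5 + 0.1 * e,          # toy per-block time cost
    budget=3.0,
)
```

For submodular objectives this style of greedy comes with well-known approximation guarantees, which is presumably why the paper transforms the problem into this form before optimizing.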
