Hyperparameter Tuning in Echo State Networks

07/16/2022
by Filip Matzner, et al.

Echo State Networks are a type of recurrent neural network with a large, randomly generated reservoir and a small number of readout connections trained via linear regression. The most common reservoir topology is a fully connected network of up to thousands of neurons. Over the years, researchers have introduced a variety of alternative reservoir topologies, such as circular networks or a linear path of connections. When comparing the performance of different topologies or other architectural changes, it is necessary to tune the hyperparameters of each topology separately, since their properties may differ significantly. Hyperparameter tuning is usually carried out manually by selecting the best-performing set of parameters from a sparse grid of predefined combinations. Unfortunately, this approach may lead to underperforming configurations, especially for sensitive topologies. We propose an alternative approach to hyperparameter tuning based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Using this approach, we have improved multiple topology comparison results by orders of magnitude, suggesting that topology alone does not play as important a role as properly tuned hyperparameters.
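The pipeline the abstract describes (a random recurrent reservoir, a linear readout trained by regression, and evolutionary tuning of reservoir hyperparameters such as spectral radius and input scaling) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the reservoir size, the one-step-delay recall task, and the `tune` function (a plain Gaussian evolution strategy standing in for CMA-ES, to keep the sketch dependency-free) are all placeholders.

```python
import numpy as np

def run_esn(u, y, spectral_radius, input_scaling, n_res=100, ridge=1e-6, seed=0):
    """Train a tiny ESN readout on (u, y) and return the training RMSE.

    Hypothetical minimal setup: a fully connected random reservoir rescaled
    to the requested spectral radius; only the linear readout is trained,
    via ridge regression, as in standard reservoir computing.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale recurrence
    w_in = input_scaling * rng.standard_normal(n_res)      # input weights
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W @ x + w_in * u[t])  # leak-free reservoir update
        states.append(x.copy())
    X = np.array(states)
    # Ridge-regression readout: solve (X^T X + ridge*I) w = X^T y
    w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
    pred = X @ w_out
    return float(np.sqrt(np.mean((pred - y) ** 2)))

def tune(objective, x0, sigma=0.3, pop=8, iters=10, seed=1):
    """Stand-in for CMA-ES: a basic Gaussian evolution strategy.

    Unlike real CMA-ES, it does not adapt the covariance matrix; it only
    samples a population around the incumbent and keeps the best candidate.
    """
    rng = np.random.default_rng(seed)
    best_x, best_f = np.array(x0, float), objective(x0)
    for _ in range(iters):
        cands = best_x + sigma * rng.standard_normal((pop, len(x0)))
        cands = np.clip(cands, 0.05, 2.0)  # keep hyperparameters in a sane range
        scores = [objective(c) for c in cands]
        i = int(np.argmin(scores))
        if scores[i] < best_f:
            best_x, best_f = cands[i], scores[i]
    return best_x, best_f

# Toy task: recall the previous input (y[t] = u[t-1]), then tune
# spectral radius and input scaling to minimize the ESN's error.
rng = np.random.default_rng(42)
u = rng.uniform(0.0, 0.5, 300)
y = np.roll(u, 1)
score = lambda p: run_esn(u, y, spectral_radius=p[0], input_scaling=p[1])
best, err = tune(score, x0=[0.9, 0.5])
```

In a faithful reproduction one would replace `tune` with the `cma` package's optimizer and evaluate each candidate on a held-out validation set rather than the training RMSE; the point of the sketch is only that every topology gets its own automated search over reservoir hyperparameters instead of a fixed sparse grid.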

