Adaptive Structural Hyper-Parameter Configuration by Q-Learning

03/02/2020
by Haotian Zhang, et al.

Tuning hyper-parameters for evolutionary algorithms is an important issue in computational intelligence: the performance of an evolutionary algorithm depends not only on the design of its operation strategies but also on its hyper-parameters. Hyper-parameters can be categorized along two dimensions: structural versus numerical, and time-invariant versus time-variant. In existing studies, structural hyper-parameters are usually either tuned in advance when they are time-invariant, or adjusted with hand-crafted schedules when they are time-variant. In this paper, we make the first attempt to model the tuning of structural hyper-parameters as a reinforcement learning problem, and apply Q-learning to tune the structural hyper-parameter that controls computational resource allocation in the CEC 2018 winner algorithm. Experimental results on the CEC 2018 test functions show that the proposed approach compares favorably with the winner algorithm.


