Hyper-Parameter Auto-Tuning for Sparse Bayesian Learning

11/09/2022
by   Dawei Gao, et al.

Choosing the values of hyper-parameters in sparse Bayesian learning (SBL) can significantly impact performance. However, these hyper-parameters are normally tuned manually, which is often a difficult task. Recently, effective automatic hyper-parameter tuning was achieved by using an empirical auto-tuner. In this work, we address the issue of hyper-parameter auto-tuning using neural network (NN)-based learning. Inspired by the empirical auto-tuner, we design and learn an NN-based auto-tuner, and show that considerable improvement in convergence rate and recovery performance can be achieved.
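For context, a minimal sketch of the SBL iteration whose hyper-parameters such tuners target is shown below. It uses the standard EM/MacKay re-estimation of the per-coefficient prior precisions; the fixed noise precision `beta`, iteration count, and pruning cap are illustrative assumptions, not values from the paper, and the NN-based auto-tuner itself is not reproduced here.

```python
import numpy as np

def sbl_recover(Phi, y, beta=100.0, n_iter=50, prune=1e6):
    """Sketch of sparse Bayesian learning via EM: estimate a sparse x
    from y = Phi @ x + noise. beta (noise precision) and the update
    schedule are hyper-parameters of the kind an auto-tuner would set;
    the defaults here are assumptions for illustration only."""
    M, N = Phi.shape
    alpha = np.ones(N)  # per-coefficient prior precisions
    for _ in range(n_iter):
        # Posterior over x given the current hyper-parameters.
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
        mu = beta * Sigma @ Phi.T @ y
        # MacKay-style re-estimation of the precisions.
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = gamma / (mu ** 2 + 1e-12)
        # Cap precisions so pruned coefficients stay numerically stable.
        alpha = np.minimum(alpha, prune)
    return mu
```

Coefficients whose precision `alpha` grows large are effectively pruned, which is what produces the sparse estimate.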


