Multi-level CNN for lung nodule classification with Gaussian Process assisted hyperparameter optimization

01/02/2019
by Miao Zhang, et al.

This paper investigates lung nodule classification using deep neural networks (DNNs). Hyperparameter optimization for DNNs is computationally expensive: evaluating a single hyperparameter configuration may take several hours or even days. Bayesian optimization has recently been introduced to automate the search for optimal hyperparameter configurations. It fits a probabilistic surrogate model, such as a Gaussian process, to approximate the validation error as a function of the hyperparameters, which greatly reduces the computational cost. However, most existing surrogate models adopt stationary covariance functions, which measure the similarity between hyperparameter points purely by their spatial distance, regardless of where those points lie in the search space. This distance-based assumption, together with the constant smoothness it imposes across the whole search space, conflicts with the observation that configurations far from the optimum tend to perform similarly poorly even when they are far apart from one another. In this paper, a non-stationary kernel is proposed that allows the surrogate model to adapt to functions whose smoothness varies with the location of the inputs, and a multi-level convolutional neural network (ML-CNN) is built for lung nodule classification whose hyperparameter configuration is optimized with the proposed non-stationary-kernel Gaussian process surrogate. Our algorithm searches the surrogate for the optimal setting via a hyperparameter-importance-based evolutionary strategy, and experiments demonstrate that it outperforms manual tuning and well-established hyperparameter optimization methods such as random search, Gaussian processes with stationary kernels, and the recently proposed Hyperparameter Optimization via RBF and Dynamic coordinate search.
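To make the kernel idea concrete, here is a minimal, self-contained Python sketch (not the authors' implementation) of a Gaussian process surrogate whose covariance is made non-stationary by warping the inputs before applying a standard RBF kernel, combined with a simple importance-weighted mutation search over candidate configurations scored by expected improvement. The Kumaraswamy warping, the fixed length-scale, the placeholder importance weights, and the toy validation-error function are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def warp(X, a=2.0, b=0.5):
    # Kumaraswamy-CDF input warping: in the original space this makes the
    # kernel's effective smoothness depend on where a point lies, i.e. the
    # covariance becomes non-stationary. (a, b) are assumed fixed here.
    Xc = np.clip(X, 1e-9, 1.0 - 1e-9)
    return 1.0 - (1.0 - Xc ** a) ** b

def rbf(A, B, ls=0.2):
    # Standard squared-exponential kernel applied to (already warped) inputs.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1) / ls ** 2
    return np.exp(-0.5 * d2)

def gp_posterior(X, y, Xs, noise=1e-6, ls=0.2):
    # GP posterior mean/std of the validation error at candidate points Xs.
    Xw, Xsw = warp(X), warp(Xs)
    K = rbf(Xw, Xw, ls) + noise * np.eye(len(X))
    Ks = rbf(Xw, Xsw, ls)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - (v ** 2).sum(axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # Expected improvement for minimizing the validation error.
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def propose(X, y, importance, rng, n_candidates=500):
    # Mutate the incumbent configuration, perturbing dimensions with higher
    # (assumed) importance more often, then pick the candidate with best EI.
    best_x = X[np.argmin(y)]
    p = importance / importance.sum()
    cand = np.repeat(best_x[None, :], n_candidates, axis=0)
    dims = rng.choice(len(best_x), size=n_candidates, p=p)
    cand[np.arange(n_candidates), dims] = rng.random(n_candidates)
    mu, sigma = gp_posterior(X, y, cand)
    return cand[np.argmax(expected_improvement(mu, sigma, y.min()))]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy stand-in for "train the network and return its validation error";
    # hyperparameters are assumed rescaled to the unit cube.
    f = lambda x: float(((x - 0.3) ** 2).sum())
    X = rng.random((5, 3))
    y = np.array([f(x) for x in X])
    importance = np.ones(3)  # placeholder importance weights
    for _ in range(20):
        x_next = propose(X, y, importance, rng)
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    print("best validation error found:", y.min())
```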

Related research

07/28/2016  Efficient Hyperparameter Optimization of Deep Learning Algorithms Using Deterministic RBF Surrogates
09/29/2022  Dynamic Surrogate Switching: Sample-Efficient Search for Factorization Machine Configurations in Online Recommendations
09/11/2018  Efficient Global Optimization using Deep Gaussian Processes
02/25/2023  A Surrogate-Assisted Highly Cooperative Coevolutionary Algorithm for Hyperparameter Optimization in Deep Convolutional Neural Network
07/31/2016  Hyperparameter Transfer Learning through Surrogate Alignment for Efficient Deep Neural Network Training
02/20/2022  Dynamic and Efficient Gray-Box Hyperparameter Optimization for Deep Learning
11/30/2022  Learning non-stationary and discontinuous functions using clustering, classification and Gaussian process modelling
