AgEBO-Tabular: Joint Neural Architecture and Hyperparameter Search with Autotuned Data-Parallel Training for Tabular Data

10/30/2020
by Romain Egele, et al.

Developing high-performing predictive models for large tabular data sets is a challenging task. The state-of-the-art methods are based on expert-developed model ensembles from different supervised learning methods. Recently, automated machine learning (AutoML) has emerged as a promising approach to automate predictive model development. Neural architecture search (NAS) is an AutoML approach that generates and evaluates multiple neural network architectures concurrently and iteratively improves the accuracy of the generated models. A key issue in NAS, particularly for large data sets, is the long computation time required to evaluate each generated architecture. While data-parallel training is a promising approach that can address this issue, its use within NAS is difficult. For different data sets, the data-parallel training settings such as the number of parallel processes, learning rate, and batch size need to be adapted to achieve high accuracy and a reduction in training time. To that end, we have developed AgEBO-Tabular, an approach that combines aging evolution (AgE), a parallel NAS method that searches over the neural architecture space, with an asynchronous Bayesian optimization method that simultaneously tunes the hyperparameters of data-parallel training. We demonstrate the efficacy of the proposed method in generating high-performing neural network models for large tabular benchmark data sets. Furthermore, we demonstrate that the neural network models discovered automatically by our method outperform the state-of-the-art AutoML ensemble models in inference speed by two orders of magnitude while reaching similar accuracy values.
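To make the control flow concrete, the sketch below illustrates an AgEBO-style loop in Python: aging evolution keeps a fixed-size population of architectures, mutates the best of a random sample, and retires the oldest member, while each new child is trained with freshly suggested data-parallel hyperparameters (learning rate, batch size, number of parallel processes). This is a minimal, schematic illustration rather than the authors' implementation; the toy architecture encoding, the evaluate() fitness stand-in, and suggest_dp_hyperparameters() (a random sampler standing in for the asynchronous Bayesian optimizer described in the abstract) are hypothetical placeholders.

```python
# Schematic AgEBO-style loop: aging evolution over architectures, with
# placeholder suggestions for data-parallel training hyperparameters.
import random
from collections import deque

ARCH_LEN = 10          # toy architecture: a vector of discrete choices
NUM_CHOICES = 4
POPULATION_SIZE = 20
SAMPLE_SIZE = 5        # tournament size for aging evolution
NUM_EVALUATIONS = 200

def random_architecture():
    return [random.randrange(NUM_CHOICES) for _ in range(ARCH_LEN)]

def mutate(arch):
    child = list(arch)
    child[random.randrange(ARCH_LEN)] = random.randrange(NUM_CHOICES)
    return child

def suggest_dp_hyperparameters():
    # Placeholder for the asynchronous Bayesian optimizer: it would propose
    # the learning rate, batch size, and number of parallel processes.
    return {
        "lr": 10 ** random.uniform(-4, -1),
        "batch_size": random.choice([256, 512, 1024]),
        "num_workers": random.choice([1, 2, 4, 8]),
    }

def evaluate(arch, hp):
    # Placeholder fitness; in practice this would run data-parallel training
    # of the candidate network under `hp` and return validation accuracy.
    return sum(arch) / (ARCH_LEN * (NUM_CHOICES - 1)) + random.gauss(0, 0.01)

population = deque(maxlen=POPULATION_SIZE)  # FIFO: the oldest model ages out
history = []

# Warm-up: fill the population with random architectures.
for _ in range(POPULATION_SIZE):
    arch, hp = random_architecture(), suggest_dp_hyperparameters()
    entry = (arch, hp, evaluate(arch, hp))
    population.append(entry)
    history.append(entry)

# Aging evolution: mutate the best of a random sample, evaluate the child
# with newly suggested data-parallel hyperparameters, drop the oldest member.
for _ in range(NUM_EVALUATIONS):
    sample = random.sample(list(population), SAMPLE_SIZE)
    parent = max(sample, key=lambda e: e[2])
    child_arch = mutate(parent[0])
    child_hp = suggest_dp_hyperparameters()
    child = (child_arch, child_hp, evaluate(child_arch, child_hp))
    population.append(child)  # deque(maxlen=...) evicts the oldest entry
    history.append(child)

best = max(history, key=lambda e: e[2])
print("best architecture:", best[0], "with hyperparameters:", best[1])
```

In the paper's setting, the warm-up and search iterations run asynchronously across many workers, which is why the hyperparameter tuner also needs to be asynchronous; the sequential loop above only conveys the division of labor between the architecture search and the training-hyperparameter suggestions.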

