PyGlove: Symbolic Programming for Automated Machine Learning

01/21/2021
by Daiyi Peng, et al.

Neural networks are sensitive to hyper-parameter and architecture choices. Automated Machine Learning (AutoML) is a promising paradigm for automating these choices. Current ML software libraries, however, are quite limited in handling the dynamic interactions among the components of AutoML. For example, efficient NAS algorithms, such as ENAS and DARTS, typically require an implementation coupling between the search space and the search algorithm, the two key components in AutoML. Furthermore, implementing a complex search flow, such as searching architectures within a loop of searching hardware configurations, is difficult. To summarize, changing the search space, search algorithm, or search flow in current ML libraries usually requires a significant change in the program logic. In this paper, we introduce a new way of programming AutoML based on symbolic programming. Under this paradigm, ML programs are mutable and thus can be manipulated easily by another program. As a result, AutoML can be reformulated as an automated process of symbolic manipulation. With this formulation, we decouple the triangle of the search algorithm, the search space, and the child program. This decoupling makes it easy to change the search space and search algorithm (with or without weight sharing), as well as to add search capabilities to existing code and implement complex search flows. We then introduce PyGlove, a new Python library that implements this paradigm. Through case studies on ImageNet and NAS-Bench-101, we show that with PyGlove users can easily convert a static program into a search space, quickly iterate on search spaces and search algorithms, and craft complex search flows to achieve better results.
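The decoupling described above can be illustrated with a minimal sketch in plain Python. This is not PyGlove's actual API — the `OneOf`, `materialize`, and `grid_search` names are hypothetical — but it captures the idea: a static program spec becomes a search space by swapping fixed values for symbolic decision points, and a generic search algorithm manipulates that spec without knowing anything about the child program's internals.

```python
import itertools

class OneOf:
    """A symbolic decision point holding candidate values (hypothetical, not PyGlove's API)."""
    def __init__(self, candidates):
        self.candidates = list(candidates)

def search_space(spec):
    """Expose only the decision points of a flat spec to the search algorithm."""
    return {k: v.candidates for k, v in spec.items() if isinstance(v, OneOf)}

def materialize(spec, decisions):
    """Bind each decision point to a chosen value, yielding a concrete child program spec."""
    return {k: decisions[k] if isinstance(v, OneOf) else v for k, v in spec.items()}

def grid_search(spec, score_fn):
    """A search algorithm that sees the program only through its decision points."""
    space = search_space(spec)
    keys = list(space)
    best = None
    for combo in itertools.product(*(space[k] for k in keys)):
        program = materialize(spec, dict(zip(keys, combo)))
        score = score_fn(program)
        if best is None or score > best[0]:
            best = (score, program)
    return best

# A static "child program" spec turned into a search space by replacing
# two fixed values with OneOf decision points; 'activation' stays fixed.
spec = {
    "filters": OneOf([32, 64, 128]),
    "kernel": OneOf([3, 5]),
    "activation": "relu",
}

# A toy score function standing in for a real training-and-evaluation step.
score, program = grid_search(spec, lambda p: p["filters"] - 10 * (p["kernel"] != 3))
assert program == {"filters": 128, "kernel": 3, "activation": "relu"}
```

Because the algorithm touches only `search_space` and `materialize`, swapping grid search for random search or evolution requires no change to the spec, and editing the spec requires no change to the algorithm — which is the decoupling the paper argues for.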

