Learning Symbolic Expressions via Gumbel-Max Equation Learner Network

12/12/2020
by   Gang Chen, et al.

Although modern machine learning, in particular deep learning, has achieved outstanding success in scientific and engineering research, most of the neural networks (NNs) learned via these state-of-the-art techniques are black-box models. For widespread success of machine learning in science and engineering, it is important to develop new NN architectures that can effectively extract high-level mathematical knowledge from complex datasets. To meet this research demand, this paper focuses on the symbolic regression problem and develops a new NN architecture called the Gumbel-Max Equation Learner (GMEQL) network. Different from previously proposed Equation Learner (EQL) networks, GMEQL applies continuous relaxation to the network structure via the Gumbel-Max trick and introduces two types of trainable parameters: structure parameters and regression parameters. This paper also proposes a new two-stage training process and new techniques, based on an elite repository, to train structure parameters in both the online and offline settings. On 8 benchmark symbolic regression problems, GMEQL is experimentally shown to outperform several cutting-edge techniques for symbolic regression.
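The abstract's key idea is relaxing a discrete choice of network structure (e.g., which operator a node applies) into a differentiable one via the Gumbel-Max trick. The sketch below illustrates that mechanism with a Gumbel-Softmax relaxation; the operator pool and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gumbel_softmax(logits, temperature=1.0, rng=None):
    """Relaxed one-hot sample via the Gumbel-Softmax trick.

    logits: unnormalized structure parameters over candidate operators.
    Returns a vector on the probability simplex that approaches a
    one-hot choice as temperature -> 0, while remaining differentiable
    with respect to the logits.
    """
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    u = rng.uniform(low=1e-10, high=1.0, size=np.shape(logits))
    gumbel = -np.log(-np.log(u))
    y = (np.asarray(logits, dtype=float) + gumbel) / temperature
    e = np.exp(y - np.max(y))  # numerically stable softmax
    return e / e.sum()

# Hypothetical operator pool for one node; the structure parameters
# (illustrative values) weight the candidate operators.
ops = ["identity", "sin", "cos", "multiply"]
structure_params = np.array([0.5, 2.0, -1.0, 0.3])
weights = gumbel_softmax(structure_params, temperature=0.5)
chosen = ops[int(np.argmax(weights))]
```

Because the relaxed sample is a smooth function of the structure parameters, gradients can flow through the structure choice during training, which is what distinguishes this approach from hard, non-differentiable architecture selection.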
