Can we learn gradients by Hamiltonian Neural Networks?

10/31/2021
by Aleksandr Timofeev, et al.

In this work, we propose a meta-learner based on ODE neural networks that learns gradients. This approach makes the optimizer more flexible by inducing an automatic inductive bias for the given task. Using the simplest Hamiltonian Neural Network, we demonstrate that our method outperforms an LSTM-based meta-learner on an artificial task and on the MNIST dataset with ReLU activations in the optimizee. Furthermore, it surpasses classical optimization methods on the artificial task and achieves comparable results on MNIST.
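To make the idea concrete, below is a minimal sketch (not the authors' implementation) of a Hamiltonian Neural Network acting as a learned optimizer in PyTorch. The coupling of the optimizee's parameters to coordinates q and its gradients to momenta p, the explicit-Euler integrator, the step size dt, and all identifiers (HNN, meta_step) are our assumptions for illustration.

```python
import torch
import torch.nn as nn

class HNN(nn.Module):
    """Minimal Hamiltonian Neural Network: a scalar-valued network H(q, p)
    whose symplectic gradient defines the learned dynamics (sketch only)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def time_derivatives(self, q, p):
        # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq.
        qp = torch.cat([q, p], dim=-1).detach().requires_grad_(True)
        H = self.net(qp).sum()
        (grad_qp,) = torch.autograd.grad(H, qp, create_graph=True)
        dHdq, dHdp = grad_qp.chunk(2, dim=-1)
        return dHdp, -dHdq

def meta_step(hnn, params, grads, dt=0.1):
    # One explicit-Euler step of the learned dynamics: the optimizee's
    # parameters play the role of coordinates q and its current gradients
    # the role of momenta p (this q/p coupling is our assumption).
    dq, _ = hnn.time_derivatives(params, grads)
    return params + dt * dq

# Toy usage: one learned-optimizer step on a quadratic objective.
dim = 5
hnn = HNN(dim)
theta = torch.randn(1, dim, requires_grad=True)
loss = (theta ** 2).sum()
(g,) = torch.autograd.grad(loss, theta)
theta_next = meta_step(hnn, theta.detach(), g)
```

The appeal of this parameterization is that the update rule inherits the structure of a Hamiltonian flow, which can act as the automatic inductive bias mentioned above; in a full meta-training setup one would unroll several such steps on the optimizee's loss and backpropagate through them to train the HNN's weights.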


