
Born-Infeld (BI) for AI: Energy-Conserving Descent (ECD) for Optimization

by G. Bruno De Luca, et al. (Stanford University)

We introduce a novel framework for optimization based on energy-conserving Hamiltonian dynamics in a strongly mixing (chaotic) regime and establish its key properties analytically and numerically. The prototype is a discretization of Born-Infeld dynamics, with a squared relativistic speed limit depending on the objective function. This class of frictionless, energy-conserving optimizers proceeds unobstructed until slowing naturally near the minimal loss, which dominates the phase space volume of the system. Building from studies of chaotic systems such as dynamical billiards, we formulate a specific algorithm with good performance on machine learning and PDE-solving tasks, including generalization. It cannot stop at a high local minimum and cannot overshoot the global minimum, yielding an advantage in non-convex loss functions, and proceeds faster than GD+momentum in shallow valleys.
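To illustrate the core idea, here is a minimal toy sketch of frictionless, energy-conserving descent for the simple separable Hamiltonian H(x, p) = |p|²/2 + V(x), where V is the loss. This is an illustration of the general ECD principle only, not the Born-Infeld Hamiltonian or the specific BBI algorithm of the paper; the function name `ecd_toy` and all parameter choices are assumptions for this sketch. The key feature is that there is no friction: after each step the momentum is rescaled so that the total energy stays fixed at its initial value E, so the dynamics slows down only where the loss approaches the energy floor E.

```python
import numpy as np

def ecd_toy(V, grad, x0, p0, steps=2000, dt=0.01):
    """Toy energy-conserving descent for H(x, p) = |p|^2/2 + V(x).

    A frictionless kick-drift integrator; after each step the
    momentum is rescaled so that |p|^2/2 = E - V(x) holds exactly,
    enforcing conservation of the initial total energy E.  The
    trajectory therefore cannot stop at a high local minimum
    (kinetic energy there stays positive) and naturally slows as
    V(x) approaches E's floor.
    """
    x = np.asarray(x0, dtype=float).copy()
    p = np.asarray(p0, dtype=float).copy()
    E = V(x) + 0.5 * p @ p            # conserved total energy
    losses = [V(x)]
    for _ in range(steps):
        p -= dt * grad(x)             # kick: momentum update
        x += dt * p                   # drift: position update
        K = max(E - V(x), 1e-12)      # remaining kinetic-energy budget
        p *= np.sqrt(2.0 * K / (p @ p))  # rescale: restore H = E
        losses.append(V(x))
    return x, p, np.array(losses), E
```

Run on a 2D quadratic loss, the trajectory oscillates through the neighborhood of the minimum at conserved energy rather than converging with friction; in the full algorithm, chaotic mixing concentrates the time spent near the minimal loss:

```python
V = lambda x: 0.5 * x @ x
grad = lambda x: x
x, p, losses, E = ecd_toy(V, grad, [3.0, 4.0], [0.0, 1e-3])
```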



Related Research

A Dynamical View on Optimization Algorithms of Overparameterized Neural Networks

When equipped with efficient optimization algorithms, the over-parameter...

Non-convex shape optimization by dissipative Hamiltonian flows

Shape optimization with constraints given by partial differential equati...

Replica Exchange for Non-Convex Optimization

Gradient descent (GD) is known to converge quickly for convex objective ...

Hamiltonian Descent Methods

We propose a family of optimization methods that achieve linear converge...

A Novel Framework for Policy Mirror Descent with General Parametrization and Linear Convergence

Modern policy optimization methods in applied reinforcement learning, su...

Optimization via conformal Hamiltonian systems on manifolds

In this work we propose a method to perform optimization on manifolds. W...
