Differentiable Frank-Wolfe Optimization Layer

08/21/2023
by   Zixuan Liu, et al.

Differentiable optimization has received significant attention due to its foundational role in neural-network-based machine learning. Existing methods leverage optimality conditions and the implicit function theorem to obtain the Jacobian of the solution, which increases computational cost and limits the applicability of differentiable optimization. In addition, some non-differentiable constraints pose further challenges for prior differentiable optimization layers. This paper proposes a differentiable layer, named the Differentiable Frank-Wolfe Layer (DFWLayer), obtained by rolling out the Frank-Wolfe method, a well-known algorithm that solves constrained optimization problems without projections or Hessian computations, thus offering an efficient way to handle large-scale problems. Theoretically, we establish a bound on the suboptimality gap of the DFWLayer under ℓ1-norm constraints. Experiments demonstrate that the DFWLayer not only attains competitive accuracy in solutions and gradients but also consistently adheres to the constraints. Moreover, it surpasses the baselines in both forward and backward computation speed.
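For intuition only, below is a minimal sketch of a rolled-out Frank-Wolfe layer for a quadratic objective under an ℓ1-norm constraint, written in PyTorch. It is not the authors' DFWLayer: the softmax-relaxed linear minimization oracle, the quadratic objective, the step-size rule, and the function names (soft_l1_lmo, dfw_layer) are assumptions made for this illustration.

```python
import torch
import torch.nn.functional as F


def soft_l1_lmo(grad, radius, tau=1e-2):
    # Relaxed linear minimization oracle over the l1-ball of the given radius.
    # The exact LMO picks the single coordinate with the largest |grad|; the
    # softmax relaxation (temperature tau, an assumption of this sketch) keeps
    # the rollout differentiable while approaching the exact vertex as tau -> 0.
    weights = F.softmax(grad.abs() / tau, dim=-1)
    return -radius * weights * torch.sign(grad)


def dfw_layer(Q, p, radius=1.0, num_steps=50):
    # Unrolled Frank-Wolfe for min_x 0.5 * x^T Q x + p^T x  s.t. ||x||_1 <= radius.
    # No projections and no Hessians: each step only needs the gradient and the
    # LMO, and gradients w.r.t. Q and p flow through the rollout via autograd.
    x = torch.zeros(p.shape[0], dtype=p.dtype)
    for k in range(num_steps):
        grad = Q @ x + p                     # gradient of the quadratic objective
        s = soft_l1_lmo(grad, radius)        # projection-free direction from the LMO
        gamma = 2.0 / (k + 2.0)              # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s    # convex combination stays feasible
    return x


# Usage: differentiate the approximate solution w.r.t. the problem parameters.
Q = torch.eye(3, requires_grad=True)
p = torch.tensor([0.3, -1.2, 0.5], requires_grad=True)
x_star = dfw_layer(Q, p, radius=1.0)
x_star.sum().backward()  # populates Q.grad and p.grad through the unrolled iterations
```

Because each iterate is a convex combination of feasible points, the output satisfies the constraint by construction, and backpropagation simply traverses the unrolled iterations instead of invoking the implicit function theorem.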
