A Linearly Convergent Douglas-Rachford Splitting Solver for Markovian Information-Theoretic Optimization Problems

03/14/2022
by   Teng-Hui Huang, et al.

In this work, we propose solving the Information Bottleneck (IB) and Privacy Funnel (PF) problems with Douglas-Rachford splitting (DRS) methods. We study a general Markovian information-theoretic Lagrangian that formulates the IB and PF problems as a convex-weakly convex pair of functions in a unified framework. Exploiting recent non-convex convergence analyses for splitting methods, we prove linear convergence of the proposed algorithms using the Kurdyka-Łojasiewicz inequality. Moreover, our analysis goes beyond IB and PF and applies to any convex-weakly convex pair of objectives. Based on these results, we develop two types of IB solvers, one of which improves convergence over existing solvers while the other converges linearly independently of the relevance-compression trade-off, as well as a class of PF solvers that can handle both random and deterministic mappings. Empirically, we evaluate the proposed DRS solvers for both the IB and PF problems with our gradient-descent-based implementation. For IB, the proposed solvers yield solutions comparable to those obtained with the Blahut-Arimoto-based benchmark and converge for a wider range of the penalty coefficient than existing solvers. For PF, our non-greedy solvers explore the information plane better than clustering-based greedy solvers.
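For context on the splitting scheme the abstract refers to, the following is a minimal sketch of a generic Douglas-Rachford iteration for minimizing a sum f(x) + g(x) via proximal operators. The function names (douglas_rachford, prox_f, prox_g), the step and relaxation parameters, and the toy l1-plus-quadratic problem are illustrative assumptions, not the paper's Markovian IB/PF solver or its gradient-descent-based implementation.

import numpy as np

def douglas_rachford(prox_f, prox_g, z0, step=1.0, relax=1.0, n_iters=500, tol=1e-8):
    # Generic Douglas-Rachford splitting for min_x f(x) + g(x).
    # prox_f(v, t) and prox_g(v, t) must return prox_{t*f}(v) and prox_{t*g}(v).
    z = np.asarray(z0, dtype=float)
    for _ in range(n_iters):
        x = prox_f(z, step)                  # x_k = prox_{step*f}(z_k)
        y = prox_g(2.0 * x - z, step)        # reflected step through prox_{step*g}
        z_new = z + relax * (y - x)          # update of the driving sequence z_k
        if np.linalg.norm(z_new - z) < tol:  # fixed-point residual as a stopping rule
            z = z_new
            break
        z = z_new
    return prox_f(z, step)

# Toy usage (not from the paper): minimize ||x - a||_1 + 0.5 * ||x - b||^2,
# where both proximal operators have simple closed forms.
a = np.array([1.0, -2.0, 0.5])
b = np.array([0.5,  0.0, 0.0])
prox_l1   = lambda v, t: a + np.sign(v - a) * np.maximum(np.abs(v - a) - t, 0.0)
prox_quad = lambda v, t: (v + t * b) / (1.0 + t)
x_star = douglas_rachford(prox_l1, prox_quad, np.zeros(3))

In the paper's setting, the two proximal steps would instead correspond to the convex and weakly convex parts of the Markovian information-theoretic Lagrangian; the sketch above only illustrates the structure of the iteration.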
