
The Incremental Proximal Method: A Probabilistic Perspective

by Ömer Deniz Akyıldız et al.
Universidad Carlos III de Madrid

In this work, we highlight a connection between the incremental proximal method and stochastic filters. We begin by showing that the proximal operators coincide with, and hence can be realized via, Bayes updates. We give the explicit form of the updates for the linear regression problem and show that there is a one-to-one correspondence between the proximal operator of the least-squares regression and the Bayes update when the prior and the likelihood are Gaussian. We then carry this observation over to a general sequential setting: we consider the incremental proximal method, an algorithm for large-scale optimization, and show that, for a linear-quadratic cost function, it can naturally be realized by the Kalman filter. We then discuss the implications of this idea for nonlinear optimization problems, where proximal operators are in general not available in closed form. In such settings, we argue that the extended Kalman filter can provide a systematic way to derive practical procedures.
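The Gaussian correspondence stated above can be checked numerically. The sketch below (with illustrative variable names; the step size `lam` plays the role of the prior variance) computes one proximal step for a scalar least-squares cost and the posterior mean of the matching Bayes update, and confirms they agree:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
a = rng.normal(size=d)   # regression features for one observation
y = 1.5                  # scalar observation
x = rng.normal(size=d)   # current iterate (doubles as the prior mean)
lam = 0.7                # proximal step size (doubles as the prior variance)

# Proximal step for f(theta) = 0.5 * (y - a @ theta)**2:
#   argmin_theta f(theta) + (1 / (2 * lam)) * ||theta - x||^2
# Setting the gradient to zero gives the linear system below.
prox = np.linalg.solve(lam * np.outer(a, a) + np.eye(d), lam * a * y + x)

# Bayes update: prior N(x, lam * I), likelihood y ~ N(a @ theta, 1).
# Written in Kalman-filter form (gain K, innovation y - a @ x).
K = lam * a / (1.0 + lam * a @ a)   # Kalman gain
kalman = x + K * (y - a @ x)        # posterior mean

assert np.allclose(prox, kalman)    # the two updates coincide
```

The equality follows from the Sherman-Morrison identity applied to the rank-one system solved by the proximal step; iterating this update over a stream of observations is exactly an incremental proximal pass in the linear-quadratic case.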




Related research:

A probabilistic incremental proximal gradient method
On-Line Learning of Linear Dynamical Systems: Exponential Forgetting in Kalman Filters
Local Kernels that Approximate Bayesian Regularization and Proximal Operators
Efficient implementation of incremental proximal-point methods
Estimation Procedures for Robust Sensor Control
The Extended Parameter Filter