Related research:

- Langevin Dynamics for Inverse Reinforcement Learning of Stochastic Gradient Algorithms
  Inverse reinforcement learning (IRL) aims to estimate the reward functio...
- Variance reduction for distributed stochastic gradient MCMC
  Stochastic gradient MCMC methods, such as stochastic gradient Langevin d...
- SVAG: Unified Convergence Results for SAG-SAGA Interpolation with Stochastic Variance Adjusted Gradient Descent
  We analyze SVAG, a variance reduced stochastic gradient method with SAG ...
- A Universally Optimal Multistage Accelerated Stochastic Gradient Method
  We study the problem of minimizing a strongly convex and smooth function...
- Efficient Implementation of Second-Order Stochastic Approximation Algorithms in High-Dimensional Problems
  Stochastic approximation (SA) algorithms have been widely applied in min...
- On the rates of convergence of Parallelized Averaged Stochastic Gradient Algorithms
  The growing interest for high dimensional and functional data analysis l...
- A fast and recursive algorithm for clustering large datasets with k-medians
  Clustering with fast algorithms large samples of high dimensional data i...

Multi-kernel Passive Stochastic Gradient Algorithms
This paper develops a novel passive stochastic gradient algorithm. In passive stochastic approximation, the stochastic gradient algorithm has no control over the locations at which noisy gradients of the cost function are evaluated. Classical passive stochastic gradient algorithms use a kernel that approximates a Dirac delta to weight the gradients according to how far they are evaluated from the desired point. In this paper we construct a multi-kernel passive stochastic gradient algorithm. The algorithm performs substantially better in high-dimensional problems and incorporates variance reduction. We analyze the weak convergence of the multi-kernel algorithm and its rate of convergence. In numerical examples, we study the multi-kernel version of the LMS algorithm and compare its performance with the classical passive version.
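To make the kernel-weighting idea concrete, below is a minimal, self-contained Python sketch of passive stochastic gradient descent on an LMS-style linear problem. It illustrates the general idea rather than the paper's exact algorithm: the Gaussian kernel, the bandwidth h, the batch size m, the simulated distribution of the exogenous evaluation points, and all function names are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5                                  # parameter dimension (illustrative)
theta_star = rng.normal(size=d)        # true parameter of the simulated model

def noisy_gradient(theta_eval):
    """Noisy gradient of the LMS cost 0.5*E[(y - psi'theta)^2] at theta_eval."""
    psi = rng.normal(size=d)                     # random regressor
    y = psi @ theta_star + 0.1 * rng.normal()    # noisy observation
    return -psi * (y - psi @ theta_eval)

def gaussian_kernel(u, h):
    """Gaussian kernel; approximates a Dirac delta as the bandwidth h -> 0."""
    return np.exp(-(u @ u) / (2.0 * h ** 2))

def passive_sg(steps=20_000, step=0.05, h=0.5):
    """Classical passive SG: one exogenous evaluation point per update,
    down-weighted by its kernel distance from the current iterate."""
    theta = np.zeros(d)
    for _ in range(steps):
        x = rng.normal(size=d)           # evaluation point not chosen by us
        theta -= step * gaussian_kernel(x - theta, h) * noisy_gradient(x)
    return theta

def multi_kernel_passive_sg(steps=20_000, step=0.05, h=0.5, m=16):
    """Multi-kernel variant (sketch): average a batch of passive gradients
    with normalized kernel weights, so the weights always sum to one."""
    theta = np.zeros(d)
    for _ in range(steps):
        xs = rng.normal(size=(m, d))     # m exogenous evaluation points
        grads = np.stack([noisy_gradient(x) for x in xs])
        w = np.array([gaussian_kernel(x - theta, h) for x in xs])
        w /= w.sum() + 1e-12             # normalize to reduce variance
        theta -= step * (w @ grads)
    return theta

print("single kernel:", np.linalg.norm(passive_sg() - theta_star))
print("multi kernel :", np.linalg.norm(multi_kernel_passive_sg() - theta_star))
```

Running the sketch shows the intuition behind the abstract's high-dimensional claim: with a fixed bandwidth, the single-kernel weight is nearly zero whenever the exogenous point falls far from the iterate, which happens almost always as the dimension grows, so the iterate barely moves. The normalized multi-kernel weights sum to one by construction, so every batch produces a meaningful, lower-variance update.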