On a convergence property of a geometrical algorithm for statistical manifolds

09/27/2019
by Shotaro Akaho, et al.

In this paper, we examine a geometrical projection algorithm for statistical inference. The algorithm is based on the Pythagorean relation and is both derivative-free and representation-free, which makes it useful in nonparametric settings. We derive a bound on the learning rate that guarantees local convergence. For the special cases of m-mixture and e-mixture estimation problems, we compute explicit forms of the bound that are easy to use in practice.
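Only the abstract is available here, so the following is a minimal illustrative sketch, not the paper's algorithm: a derivative-free, damped fixed-point estimation of m-mixture weights, in which the damping factor eta plays the role of the learning rate whose admissible range the paper bounds. The function name m_mixture_weights and the values of eta, n_iter, and tol are hypothetical choices for this example.

```python
import numpy as np
from scipy.stats import norm

def m_mixture_weights(samples, components, eta=0.5, n_iter=500, tol=1e-9):
    """Estimate mixing weights of an m-mixture q(x) = sum_i w_i p_i(x) over
    fixed component densities p_i via a damped EM-style fixed-point update
    with learning rate eta (no derivatives of the likelihood are computed).
    Illustrative sketch only; parameter values are placeholder choices."""
    P = np.stack([p(samples) for p in components], axis=1)  # (N, K) component densities
    K = P.shape[1]
    w = np.full(K, 1.0 / K)                                  # start from uniform weights
    for _ in range(n_iter):
        q = P @ w                                            # mixture density at each sample
        resp = (P * w) / q[:, None]                          # posterior responsibilities
        w_new = (1.0 - eta) * w + eta * resp.mean(axis=0)    # damped fixed-point step
        if np.max(np.abs(w_new - w)) < tol:
            return w_new
        w = w_new
    return w

# Toy usage: recover the mixing proportions of two known Gaussian components.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 7000), rng.normal(2.0, 1.0, 3000)])
comps = [lambda s: norm.pdf(s, -2.0, 1.0), lambda s: norm.pdf(s, 2.0, 1.0)]
print(m_mixture_weights(x, comps))  # approximately [0.7, 0.3]
```

Because the update is a convex combination of the current weights and the EM fixed-point target, it stays on the probability simplex; larger eta gives faster but potentially less stable progress, which is the kind of trade-off a learning-rate bound like the one in the paper is meant to control.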


Related research

02/24/2022 · An optimal scheduled learning rate for a randomized Kaczmarz algorithm
We study how the learning rate affects the performance of a relaxed rand...

04/17/2014 · Geometric Inference for General High-Dimensional Linear Inverse Problems
This paper presents a unified geometric framework for the statistical an...

04/15/2021 · Polynomial methods in statistical inference: theory and practice
This survey provides an exposition of a suite of techniques based on the...

11/03/2022 · Statistical Inference for Scale Mixture Models via Mellin Transform Approach
This paper deals with statistical inference for the scale mixture models...

08/12/2016 · Student's t Distribution based Estimation of Distribution Algorithms for Derivative-free Global Optimization
In this paper, we are concerned with a branch of evolutionary algorithms...

06/28/2019 · Large-scale inference with block structure
The detection of weak and rare effects in large amounts of data arises i...

10/21/2022 · Clip-Tuning: Towards Derivative-free Prompt Learning with a Mixture of Rewards
Derivative-free prompt learning has emerged as a lightweight alternative...
