Statistical Analysis of Multi-Relational Network Recovery

08/29/2020
by Zhi Wang, et al.

In this paper, we develop asymptotic theories for a class of latent variable models for large-scale multi-relational networks. In particular, we establish consistency results and asymptotic error bounds for the (penalized) maximum likelihood estimators when the size of the network tends to infinity. The basic technique is to develop a non-asymptotic error bound for the maximum likelihood estimators through large deviations analysis of random fields. We also show that these estimators are nearly optimal in terms of minimax risk.
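To make the setting concrete, the following sketch (not from the paper) fits a simple bilinear latent-variable model to a binary multi-relational adjacency tensor by gradient ascent on an L2-penalized Bernoulli log-likelihood. The model form (logit of an edge under relation k as Z[i] @ R[k] @ Z[j]), the parameter names, and the hyperparameters are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def sigmoid(x):
    # Clip to avoid overflow warnings in np.exp for large |x|.
    return 1.0 / (1.0 + np.exp(-np.clip(x, -30.0, 30.0)))

def penalized_mle(Y, d=2, lam=0.05, lr=0.1, n_iter=3000, seed=0):
    """Penalized MLE for a bilinear latent-variable model of a
    multi-relational network (an illustrative stand-in for the class
    of models studied in the paper).

    Y is a binary tensor of shape (K, n, n): Y[k, i, j] = 1 if edge
    (i, j) is present under relation k. The model posits
    logit P(Y[k, i, j] = 1) = Z[i] @ R[k] @ Z[j], with latent node
    positions Z (n x d) and relation-specific mixing matrices R[k]
    (d x d). Parameters are fit by full-batch gradient ascent on the
    Bernoulli log-likelihood minus an L2 (ridge) penalty lam.
    """
    rng = np.random.default_rng(seed)
    K, n, _ = Y.shape
    Z = 0.5 * rng.standard_normal((n, d))
    R = 0.5 * rng.standard_normal((K, d, d))
    for _ in range(n_iter):
        theta = np.einsum('id,kde,je->kij', Z, R, Z)  # logits
        E = Y - sigmoid(theta)  # d(log-likelihood)/d(theta)
        # Chain rule through theta[k,i,j] = sum_{d,e} Z[i,d] R[k,d,e] Z[j,e]:
        # node i enters as the row factor and node j as the column factor.
        gZ = (np.einsum('kij,kde,je->id', E, R, Z)
              + np.einsum('kji,je,ked->id', E, Z, R)
              - lam * Z)
        gR = np.einsum('kij,id,je->kde', E, Z, Z) - lam * R
        Z += lr * gZ / n
        R += lr * gR / n
    return Z, R
```

In the asymptotic regime the abstract describes, the estimation error of such a (penalized) maximum likelihood estimator shrinks as the network size n grows; the ridge term lam here plays the role of the penalty in the penalized likelihood.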


Related research

07/20/2022 · On minimax density estimation via measure transport
We study the convergence properties, in Hellinger and related distances, ...

12/29/2017 · Finite-sample risk bounds for maximum likelihood estimation with arbitrary penalties
The MDL two-part coding index of resolvability provides a finite-sampl...

06/05/2023 · A unified analysis of likelihood-based estimators in the Plackett–Luce model
The Plackett–Luce model is a popular approach for ranking data analysis, ...

05/22/2020 · Asymptotic accuracy of the saddlepoint approximation for maximum likelihood estimation
The saddlepoint approximation gives an approximation to the density of a...

01/11/2018 · Exact Calculation of Normalized Maximum Likelihood Code Length Using Fourier Analysis
The normalized maximum likelihood code length has been widely used in mo...

03/30/2021 · A Tensor-EM Method for Large-Scale Latent Class Analysis with Clustering Consistency
Latent class models are powerful statistical modeling tools widely used ...

07/27/2023 · Learning cross-layer dependence structure in multilayer networks
Multilayer networks are a network data structure in which elements in a ...
