Differential Network Learning Beyond Data Samples

by Arshdeep Sekhon et al.

Learning the change in statistical dependencies between random variables is an essential task in many real-life applications, most of which operate in the high-dimensional, low-sample regime. In this paper, we propose a novel differential parameter estimator that, in contrast to current methods, simultaneously (a) allows flexible integration of multiple sources of information (data samples, variable groupings, extra pairwise evidence, etc.), (b) scales to a large number of variables, and (c) achieves a sharp asymptotic convergence rate. Our experiments, on more than 100 simulated and two real-world datasets, validate the flexibility of our approach and highlight the benefits of integrating spatial and anatomical information for brain connectome change discovery and epigenetic network identification.
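The abstract does not spell out the estimator itself. As a minimal illustrative baseline only (a naive sketch of the general differential-network idea, not the method proposed in the paper), the change in conditional dependencies between two Gaussian graphical models can be approximated by differencing ridge-regularized inverse sample covariances and soft-thresholding the result to encourage sparsity; the function name and parameters below are hypothetical:

```python
import numpy as np


def differential_network_naive(X1, X2, ridge=0.1, threshold=0.05):
    """Toy differential network estimate.

    Computes soft(Omega2 - Omega1), where each Omega is a
    ridge-regularized inverse sample covariance. This is a
    hypothetical baseline, NOT the paper's estimator.
    """
    def precision(X):
        # Sample covariance over columns (variables), with a ridge
        # term so the inverse exists in the low-sample regime.
        S = np.cov(X, rowvar=False)
        return np.linalg.inv(S + ridge * np.eye(S.shape[0]))

    delta = precision(X2) - precision(X1)
    # Soft-threshold entrywise to sparsify the estimated change.
    return np.sign(delta) * np.maximum(np.abs(delta) - threshold, 0.0)


# Usage: two samples over the same 5 variables.
rng = np.random.default_rng(0)
X1 = rng.standard_normal((50, 5))
X2 = rng.standard_normal((50, 5))
delta = differential_network_naive(X1, X2)  # 5x5 symmetric change estimate
```

Note that this baseline requires inverting each covariance separately, which is exactly what sharper differential estimators avoid when only the *difference* of the two precision matrices is sparse.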

