Analysis of multiple data sequences with different distributions: defining common principal component axes by ergodic sequence generation and multiple reweighting composition

04/16/2021
by Ikuo Fukuda, et al.

Principal component analysis (PCA) defines a reduced space, described by PC axes, that captures the variations of a given multidimensional data sequence. In practice, we often need multiple data sequences, each accurately obeying its own probability distribution, and for a fair comparison of the sequences we need PC axes that are common to all of them yet properly capture these multiple distributions. To meet these requirements, we present individual ergodic samplings for the sequences and provide a special reweighting scheme for recovering the target distributions.
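The idea of common PC axes over reweighted sequences can be illustrated with a minimal sketch: pool several data sequences, attach per-sample weights (standing in for the paper's reweighting factors that recover each target distribution), and diagonalize the pooled weighted covariance. The function name, the uniform toy weights, and the Gaussian toy data are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def common_pc_axes(sequences, weights):
    """Common PC axes from multiple weighted data sequences.

    sequences: list of (n_i, d) arrays with possibly different distributions.
    weights:   list of (n_i,) arrays of per-sample weights (e.g., reweighting
               factors recovering each target distribution; hypothetical here).
    Returns (eigenvalues, axes), axes as columns sorted by decreasing variance.
    """
    X = np.vstack(sequences)            # pool all sequences
    w = np.concatenate(weights)
    w = w / w.sum()                     # normalize weights over the pool
    mu = w @ X                          # weighted mean
    Xc = X - mu
    cov = (Xc * w[:, None]).T @ Xc      # weighted covariance matrix
    evals, evecs = np.linalg.eigh(cov)  # symmetric eigendecomposition
    order = np.argsort(evals)[::-1]     # sort by variance captured
    return evals[order], evecs[:, order]

# Toy usage: two sequences drawn from different distributions,
# here with uniform weights for simplicity.
rng = np.random.default_rng(0)
A = rng.normal(0.0, [3.0, 1.0], size=(500, 2))
B = rng.normal(1.0, [1.0, 2.0], size=(400, 2))
vals, axes = common_pc_axes([A, B], [np.ones(500), np.ones(400)])
```

Each sequence can then be projected onto the same `axes`, so their reduced-space representations are directly comparable.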

