General Probabilistic Surface Optimization and Log Density Estimation
In this paper we contribute a novel algorithm family, which generalizes many unsupervised techniques, including unnormalized and energy models, and allows inferring different statistical modalities (e.g. data likelihood and the ratio between densities) from data samples. The proposed unsupervised technique, Probabilistic Surface Optimization (PSO), views a neural network (NN) as a flexible surface which can be pushed according to loss-specific virtual stochastic forces, where a dynamical equilibrium is achieved when the point-wise forces on the surface become equal. Concretely, the surface is pushed up and down at points sampled from two different distributions, with the overall up and down forces becoming functions of these two distribution densities and of force intensity magnitudes defined by the loss of a particular PSO instance. The eventual force equilibrium upon convergence enforces the NN to equal various statistical functions, such as the data density, depending on the magnitude functions used. Furthermore, this dynamical-statistical equilibrium is extremely intuitive and useful, providing many implications and possible usages in probabilistic inference. We also provide new PSO-based approaches as a demonstration of PSO's exceptional usability. Further, we analyze PSO convergence and optimization stability, relate them to the gradient similarity function over the NN input space, and propose new ways to improve this stability. Finally, we present new instances of PSO, termed PSO-LDE, for data density estimation on the logarithmic scale, and also provide a new NN block-diagonal architecture for increased surface flexibility, which significantly improves estimation accuracy. Both PSO-LDE and the new architecture are combined together into a new density estimation technique. In our experiments we demonstrate that this technique produces highly accurate density estimates for 20D data.
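The force-equilibrium idea can be illustrated with a minimal sketch (ours, not the paper's NN implementation): replace the neural surface with a 1D grid of values, push it up at samples from the target density P with unit magnitude, and down at points of a known auxiliary density Q with magnitude exp(f). At equilibrium the per-bin masses balance, p = q·exp(f), so f = log(p/q), from which the log density of P is recovered — one simple PSO instance in the spirit of PSO-LDE. The uniform choice of Q, the grid discretization, and the specific magnitude functions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Up" distribution P: samples whose density we want to estimate (standard normal).
# "Down" distribution Q: a known auxiliary density, here uniform on [-4, 4].
n = 200_000
x_up = rng.standard_normal(n)
lo, hi, nbins = -4.0, 4.0, 80
width = (hi - lo) / nbins
q_density = 1.0 / (hi - lo)

# Discretized "surface" f: one value per bin (stands in for the NN).
p_mass, _ = np.histogram(x_up, bins=nbins, range=(lo, hi))
p_mass = p_mass / n                           # empirical per-bin mass under P
q_mass = np.full(nbins, width * q_density)    # exact per-bin mass under Q

f = np.zeros(nbins)
lr = 2.0
for _ in range(2000):
    # Up force with magnitude 1 at P samples, down force with magnitude exp(f)
    # at Q points; the fixed point p_mass = q_mass * exp(f) gives f = log(p/q).
    f += lr * (p_mass - q_mass * np.exp(f))

# Recover log p(x) from the converged log-ratio.
log_density = f + np.log(q_density)
center = nbins // 2                           # bin covering x in [0, 0.1)
x_mid = lo + (center + 0.5) * width
true_log_pdf = -0.5 * x_mid**2 - 0.5 * np.log(2 * np.pi)
print(abs(log_density[center] - true_log_pdf))  # small estimation error near 0
```

The same balancing holds pointwise for a flexible NN trained on the corresponding stochastic loss; other magnitude pairs yield other statistical functions (e.g. density ratios) at equilibrium.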