A Bayesian Approach for De-duplication in the Presence of Relational Data

09/14/2019
by   Juan Sosa, et al.

In this paper we study the impact of combining profile and network data in a de-duplication setting. We also assess the influence of a range of prior distributions on the linkage structure, including our own proposal, which makes it straightforward to specify prior beliefs and naturally enforces the microclustering property. Furthermore, we explore stochastic gradient Hamiltonian Monte Carlo methods as a faster alternative for obtaining samples of the network parameters. Our methodology is evaluated on RLdata500, a popular dataset in the record linkage literature.
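The abstract mentions stochastic gradient Hamiltonian Monte Carlo (SGHMC) as a faster way to sample the network parameters. As a rough illustration only (not the authors' implementation), here is a minimal sketch of the standard SGHMC update, where `grad_log_post` stands in for a stochastic gradient of the log-posterior, e.g. estimated from a minibatch of network dyads; the step size and friction values are placeholder assumptions:

```python
import numpy as np

def sghmc_sample(grad_log_post, theta0, n_iter=1000, eps=1e-2,
                 friction=0.1, rng=None):
    """Minimal SGHMC sketch: Euler discretization with momentum,
    friction, and injected Gaussian noise.

    grad_log_post: callable returning a (possibly stochastic) estimate
    of the gradient of the log-posterior at theta.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    r = np.zeros_like(theta)  # momentum variable
    samples = []
    for _ in range(n_iter):
        # position update
        theta = theta + eps * r
        # momentum update: stochastic gradient, friction, and noise
        noise = rng.normal(0.0, np.sqrt(2.0 * friction * eps),
                           size=theta.shape)
        r = r + eps * grad_log_post(theta) - friction * eps * r + noise
        samples.append(theta.copy())
    return np.array(samples)
```

With an exact gradient of a standard normal log-density (`lambda th: -th`), the chain explores a neighborhood of the mode; in practice one would tune `eps` and `friction` and use minibatch gradients of the actual network likelihood.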


Related Research

07/01/2020: Decentralized Stochastic Gradient Langevin Dynamics and Hamiltonian Monte Carlo
Stochastic gradient Langevin dynamics (SGLD) and stochastic gradient Ham...

08/14/2018: A Record Linkage Model Incorporating Relational Data
In this paper we introduce a novel Bayesian approach for linking multipl...

02/17/2014: Stochastic Gradient Hamiltonian Monte Carlo
Hamiltonian Monte Carlo (HMC) sampling methods provide a mechanism for d...

12/04/2018: Parallel-tempered Stochastic Gradient Hamiltonian Monte Carlo for Approximate Multimodal Posterior Sampling
We propose a new sampler that integrates the protocol of parallel temper...

12/04/2019: Quantum-Inspired Hamiltonian Monte Carlo for Bayesian Sampling
Hamiltonian Monte Carlo (HMC) is an efficient Bayesian sampling method t...

11/03/2022: Log-density gradient covariance and automatic metric tensors for Riemann manifold Monte Carlo methods
A metric tensor for Riemann manifold Monte Carlo particularly suited for...

09/21/2021: Flexible and efficient Bayesian pharmacometrics modeling using Stan and Torsten, Part I
Stan is an open-source probabilistic programming language, primarily desi...
