Generalized Stable Weights via Neural Gibbs Density

11/14/2022
by Yoshiaki Kitazawa, et al.

We present a generalized balancing-weight method for estimating causal effects under an arbitrary mixture of discrete and continuous interventions. The weights are trainable through back-propagation, and we give a neural-network algorithm for estimating them. We also provide a way to measure the performance of the weights by estimating the mutual information of the balanced distribution. Our method is easy to implement with any existing deep-learning library, and the resulting weights can be used in most state-of-the-art supervised algorithms.
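The abstract does not spell out the estimator, but one common way to realize such weights is the density-ratio trick: the stable weight w(a, x) = p(a)p(x) / p(a, x) can be read off a probabilistic classifier that separates joint samples (a, x) from product-of-marginals samples. The sketch below is a minimal PyTorch illustration of that idea; the network, the shuffling construction, and all names are assumptions for exposition, not the paper's actual method.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: back-propagation-trained balancing weights
# w(a, x) ~ p(a) p(x) / p(a, x) via a neural density-ratio classifier.
# Architecture and training loop are illustrative assumptions.

class RatioNet(nn.Module):
    def __init__(self, dim_a, dim_x, hidden=64):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(dim_a + dim_x, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, a, x):
        return self.f(torch.cat([a, x], dim=-1)).squeeze(-1)

def train_weights(a, x, epochs=200, lr=1e-3):
    """Train a classifier to tell joint samples (a, x) apart from
    product-of-marginals samples (shuffled a paired with x); at the
    optimum its logit estimates log p(a, x) / (p(a) p(x))."""
    net = RatioNet(a.shape[1], x.shape[1])
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        perm = torch.randperm(a.shape[0])
        logit_joint = net(a, x)        # label 1: samples from the joint
        logit_marg = net(a[perm], x)   # label 0: marginals (shuffled a)
        loss = (bce(logit_joint, torch.ones_like(logit_joint))
                + bce(logit_marg, torch.zeros_like(logit_marg)))
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        # w = p(a)p(x)/p(a,x) = exp(-logit), normalized to mean one.
        w = torch.exp(-net(a, x))
    return w / w.mean()
```

Under exact balancing, a and x are independent in the weighted sample, so an estimate of their mutual information under the weighted distribution that is close to zero would indicate the weights are working; this mirrors the mutual-information diagnostic described in the abstract.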


Related research

01/13/2021 - Estimating Conditional Mutual Information for Discrete-Continuous Mixtures using Multi-Dimensional Adaptive Histograms
Estimating conditional mutual information (CMI) is an essential yet chal...

11/17/2020 - On Integer Balancing of Digraphs
A weighted digraph is balanced if the sums of the weights of the incomin...

07/15/2021 - Independence weights for causal inference with continuous exposures
Studying causal effects of continuous exposures is important for gaining...

10/26/2019 - Kernel Optimal Orthogonality Weighting: A Balancing Approach to Estimating Effects of Continuous Treatments
Many scientific questions require estimating the effects of continuous t...

03/30/2021 - Continuous Weight Balancing
We propose a simple method by which to choose sample weights for problem...

02/27/2021 - Spline parameterization of neural network controls for deep learning
Based on the continuous interpretation of deep learning cast as an optim...

12/06/2018 - Distributed Weight Balancing in Directed Topologies
This doctoral thesis concerns novel distributed algorithms for weight ba...
