Probabilistic Conformal Prediction Using Conditional Random Samples

06/14/2022
by   Zhendong Wang, et al.

This paper proposes probabilistic conformal prediction (PCP), a predictive inference algorithm that estimates a target variable with a possibly discontinuous predictive set. Given an input, PCP constructs the predictive set from random samples drawn from an estimated generative model. It is efficient and compatible with both explicit and implicit conditional generative models. Theoretically, we show that PCP guarantees correct marginal coverage with finite samples. Empirically, we study PCP on a variety of simulated and real datasets. Compared to existing methods for conformal inference, PCP provides sharper predictive sets.
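The abstract's recipe (sample from a conditional generative model, calibrate a radius on held-out data, return a union of balls around test-time samples) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `sample_fn` is a hypothetical sampler for the estimated conditional model p(y | x), and the score and set construction shown are one natural reading of the method for scalar targets.

```python
import numpy as np

def pcp_predictive_set(sample_fn, X_calib, y_calib, x_test, K=50, alpha=0.1):
    """Sketch of probabilistic conformal prediction (PCP) for a scalar target.

    sample_fn(x, K) is assumed to draw K samples from an estimated
    conditional generative model p(y | x); the name is illustrative.
    """
    # Nonconformity score on calibration data: distance from the true y
    # to its nearest generated sample.
    scores = []
    for x, y in zip(X_calib, y_calib):
        samples = sample_fn(x, K)
        scores.append(np.min(np.abs(samples - y)))

    # Conformal quantile with the usual finite-sample correction,
    # giving marginal coverage of at least 1 - alpha.
    n = len(scores)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    radius = np.quantile(scores, q_level)

    # Predictive set: a union of balls of the calibrated radius around
    # fresh samples at x_test -- possibly disjoint, hence "discontinuous".
    return [(s - radius, s + radius) for s in sample_fn(x_test, K)]
```

Because the set is a union of intervals around samples, it can concentrate on multiple modes of p(y | x), which is where the sharpness gain over single-interval conformal methods comes from.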

Related research

- 03/12/2019: The limits of distribution-free conditional predictive inference
- 11/23/2021: Sensitivity Analysis of Individual Treatment Effects: A Robust Conformal Inference Approach
- 03/22/2023: Adaptive Conformal Prediction by Reweighting Nonconformity Score
- 07/31/2019: Conditional independence testing: a predictive perspective
- 05/19/2021: Latent Gaussian Model Boosting
- 11/02/2017: A Universal Marginalizer for Amortized Inference in Generative Models
- 04/11/2018: CoT: Cooperative Training for Generative Modeling
