Sparse Choice Models

11/19/2010
by Vivek F. Farias, et al.

Choice models, which capture popular preferences over objects of interest, play a key role in making decisions whose eventual outcome is impacted by human choice behavior. In most scenarios, the choice model, which can effectively be viewed as a distribution over permutations, must be learned from observed data. The observed data, in turn, may frequently be viewed as (partial, noisy) information about the marginals of this distribution over permutations. As such, the search for an appropriate choice model boils down to learning a distribution over permutations that is (near-)consistent with observed information about this distribution. In this work, we pursue a non-parametric approach which seeks to learn a choice model (i.e., a distribution over permutations) with the sparsest possible support that is consistent with the observed data. We assume that the observed data consists of noisy information pertaining to the marginals of the choice model we seek to learn. We establish that any choice model admits a 'very' sparse approximation, in the sense that there exists a choice model whose support is small relative to the dimension of the observed data and whose marginals approximately agree with the observed marginal information. We further show that, under what we dub 'signature' conditions, such a sparse approximation can be found in a computationally efficient fashion relative to a brute-force approach. An empirical study using the American Psychological Association election data set suggests that our approach manages to unearth useful structural properties of the underlying choice model from the sparse approximation found. Our results further suggest that the signature condition is a potential alternative to the recently popularized Restricted Null Space condition for efficient recovery of sparse models.
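
To make the sparsity claim concrete, the following is a minimal illustrative sketch, not the paper's recovery algorithm, assuming numpy and scipy are available. It enumerates all permutations of a small universe, builds the first-order marginal operator, and asks an LP solver for a feasible distribution consistent with observed marginals; a basic feasible solution of that LP has support bounded by the rank of the marginal operator, (n-1)^2 + 1, which is tiny compared to the n! candidate permutations.

from itertools import permutations
import numpy as np
from scipy.optimize import linprog

n = 4
perms = list(permutations(range(n)))   # all n! = 24 candidate rankings

# A maps a distribution over permutations to its first-order marginals:
# entry (i*n + j, k) is 1 when permutation k places item i at position j.
A = np.zeros((n * n, len(perms)))
for k, sigma in enumerate(perms):
    for pos, item in enumerate(sigma):
        A[item * n + pos, k] = 1.0

# Hypothetical ground truth: a sparse choice model on 3 of the 24 permutations.
truth = np.zeros(len(perms))
truth[[0, 7, 15]] = [0.5, 0.3, 0.2]
m = A @ truth                          # observed first-order marginal data
# Each permutation contributes exactly n ones per column, so A x = m already
# forces sum(x) = 1; no explicit simplex constraint is needed.

# Feasibility LP: minimize 0 subject to A x = m, x >= 0. A basic feasible
# solution (which the HiGHS simplex solver typically returns) has at most
# rank(A) = (n-1)^2 + 1 nonzeros, far fewer than the n! variables.
res = linprog(c=np.zeros(len(perms)), A_eq=A, b_eq=m,
              bounds=(0, None), method="highs")

support = np.flatnonzero(res.x > 1e-9)
print("support size:", support.size, "of", len(perms), "permutations")
for k in support:
    print(perms[k], round(res.x[k], 3))

For realistic problem sizes one cannot enumerate n! permutations explicitly; the paper's 'signature' conditions are aimed at finding such a sparse consistent model efficiently relative to brute-force search.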

