Deep discriminative to kernel generative modeling

by Jayanta Dey et al.

The debate between discriminative and generative modeling runs deep, in the study of both artificial and natural intelligence. In our view, the two camps have complementary value, so we sought to combine them synergistically. Here, we propose a methodology for converting deep discriminative networks into kernel generative networks. We leverage the fact that deep models with piecewise-affine activation functions, including both random forests and deep networks, learn internal representations that partition the input space into unions of polytopes, and thus conceptualize both as generalized partitioning rules. From that perspective, we use foundational results on the relationship between histogram rules and kernel density estimators to obtain class-conditional kernel density estimators from the deep models. We then study the trade-offs of this strategy in low-dimensional settings, both theoretically and empirically, as a first step toward a fuller understanding. Theoretically, we show conditions under which our generative models are more efficient than the corresponding discriminative approaches. Empirically, when sample sizes are relatively large, the discriminative models tend to perform as well as or better on discriminative metrics, such as classification rates and posterior calibration. However, when sample sizes are relatively small, the generative models outperform the discriminative ones even on discriminative metrics. Moreover, the generative models can also sample from the learned distribution, yield smoother posteriors, and extrapolate beyond the convex hull of the training data, handling out-of-distribution (OOD) inputs more reasonably. Via human experiments, we illustrate that our kernel generative networks (Kragen) behave more like humans than deep discriminative networks do. We believe this approach may be an important step toward unifying thinking and methods across the discriminative-generative divide.
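The core conversion can be illustrated with a toy sketch. The code below is a hypothetical minimal illustration (not the authors' released implementation): a decision tree's leaves stand in for the learned polytope partition, and the histogram-style constant density in each cell is replaced by a Gaussian kernel centered on that cell's class-conditional mean, giving class-conditional kernel density estimators and, via Bayes' rule, smooth posteriors. The bandwidth, leaf count, and weighting scheme here are illustrative choices, not the paper's.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.tree import DecisionTreeClassifier

# Toy 2-D data; the fitted tree's leaves play the role of the polytope cells.
X, y = make_blobs(n_samples=200, centers=2, random_state=0)
tree = DecisionTreeClassifier(max_leaf_nodes=8, random_state=0).fit(X, y)

def fit_leaf_kernels(X, y, tree):
    """One Gaussian kernel per (leaf, class): centered on the leaf's
    class-conditional mean, weighted by the fraction of that class's
    points falling in the leaf (replacing the histogram constant)."""
    leaves = tree.apply(X)  # leaf index of every training point
    kernels = {}
    for leaf in np.unique(leaves):
        for c in np.unique(y):
            mask = (leaves == leaf) & (y == c)
            if mask.any():
                center = X[mask].mean(axis=0)
                weight = mask.sum() / (y == c).sum()
                kernels[(leaf, c)] = (center, weight)
    return kernels

def class_density(x, c, kernels, bandwidth=1.0, d=2):
    """Class-conditional KDE: weighted sum of class c's leaf kernels."""
    norm = (2 * np.pi * bandwidth ** 2) ** (d / 2)
    return sum(
        w * np.exp(-0.5 * np.sum((x - center) ** 2) / bandwidth ** 2) / norm
        for (leaf, cls), (center, w) in kernels.items()
        if cls == c
    )

def posterior(x, kernels):
    """Bayes' rule with empirical class priors."""
    priors = np.bincount(y) / len(y)
    joint = np.array([priors[c] * class_density(x, c, kernels) for c in (0, 1)])
    return joint / joint.sum()

kernels = fit_leaf_kernels(X, y, tree)
probs = posterior(X[0], kernels)  # a length-2 probability vector summing to 1
```

Because the density is a mixture of Gaussians, the same fitted `kernels` can also be sampled from (pick a leaf kernel by weight, then draw from its Gaussian), which is what distinguishes this generative reading of the tree from its usual discriminative use.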




