End-to-end learning potentials for structured attribute prediction

08/06/2017
by Kota Yamaguchi, et al.

We present a structured inference approach for multiple attribute prediction with deep neural networks. A common approach to attribute prediction is to learn independent classifiers on top of a good feature representation. However, such classifiers assume that attributes are conditionally independent given the features and do not explicitly model the dependencies between attributes during inference. We propose to formulate attribute prediction as marginal inference in a conditional random field. We model the potential functions with deep neural networks and apply the sum-product algorithm to compute approximate marginal distributions in a feed-forward network. Our message-passing layer implements sparse pairwise potentials with a softplus-linear function, which is equivalent to a higher-order classifier, and all model parameters are learned by end-to-end backpropagation. Experimental results on the SUN Attributes and CelebA datasets suggest that structured inference improves attribute prediction performance and can uncover hidden relationships between attributes.
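
To illustrate the overall idea, here is a minimal PyTorch sketch (not the authors' code) of a sum-product message-passing layer for a fully connected pairwise CRF over binary attributes. Unary and pairwise log-potentials are produced from an image feature by linear layers, with the pairwise term passed through a softplus as a rough stand-in for the paper's softplus-linear potentials; the class name AttributeCRF, the feature dimension, and the number of message-passing iterations are illustrative assumptions rather than details from the paper.

```python
# Minimal sketch, assuming binary attributes and a fully connected pairwise CRF.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttributeCRF(nn.Module):
    def __init__(self, feat_dim, n_attr, n_iter=3):
        super().__init__()
        self.n_attr = n_attr
        self.n_iter = n_iter
        # Unary log-potentials: a 2-way score per attribute.
        self.unary = nn.Linear(feat_dim, n_attr * 2)
        # Pairwise log-potentials: a 2x2 table per attribute pair, from a
        # softplus of a linear map of the feature (illustrative stand-in for
        # the paper's softplus-linear parameterization).
        self.pairwise = nn.Linear(feat_dim, n_attr * n_attr * 4)

    def forward(self, feat):
        B, N = feat.size(0), self.n_attr
        unary = self.unary(feat).view(B, N, 2)                      # log phi_i(x_i)
        pair = F.softplus(self.pairwise(feat)).view(B, N, N, 2, 2)  # log psi_ij(x_i, x_j)
        off_diag = 1.0 - torch.eye(N, device=feat.device).view(1, N, N, 1, 1)
        pair = pair * off_diag                                      # no self-interactions

        # Loopy sum-product in the log domain.
        # msg[b, i, j, x_j] = log m_{i->j}(x_j), initialized to log 1 = 0.
        msg = torch.zeros(B, N, N, 2, device=feat.device)
        mask = 1.0 - torch.eye(N, device=feat.device).view(1, N, N, 1)
        for _ in range(self.n_iter):
            incoming = msg.sum(dim=1)                               # sum_k log m_{k->i}(x_i)
            # Exclude the message coming back from the target node j.
            pre = unary.unsqueeze(2) + incoming.unsqueeze(2) - msg.transpose(1, 2)
            # New message: logsumexp over x_i of (pre + pairwise potential).
            msg = torch.logsumexp(pre.unsqueeze(-1) + pair, dim=3)
            msg = msg - msg.logsumexp(dim=-1, keepdim=True)         # normalize for stability
            msg = msg * mask                                        # keep diagonal at log 1

        beliefs = unary + msg.sum(dim=1)
        return F.softmax(beliefs, dim=-1)[..., 1]                   # P(attribute present)


# Usage: because message passing is built from ordinary differentiable tensor
# operations, the potentials train end to end with a loss on the marginals.
model = AttributeCRF(feat_dim=2048, n_attr=40)
feat = torch.randn(8, 2048)                        # e.g. CNN features for 8 images
probs = model(feat)                                # (8, 40) per-attribute marginals
target = torch.randint(0, 2, (8, 40)).float()
loss = F.binary_cross_entropy(probs, target)
loss.backward()
```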
