ADM for grid CRF loss in CNN segmentation

09/07/2018
by Dmitrii Marin et al.

Variants of gradient descent (GD) dominate CNN loss minimization in computer vision. Yet, as we show, some powerful loss functions are practically useless solely because GD optimizes them poorly. In the context of weakly-supervised CNN segmentation, we present a general ADM approach to regularized losses inspired by well-known MRF/CRF models in "shallow" segmentation. While GD fails on the popular nearest-neighbor Potts loss, ADM splitting with an α-expansion solver significantly improves optimization of such grid CRF losses, yielding state-of-the-art training quality. Denser CRF losses become amenable to basic GD, but they produce lower-quality object boundaries, consistent with the known noisiness of dense CRF inference in shallow segmentation.
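To make the splitting concrete, here is a minimal PyTorch-style sketch of one ADM iteration as the abstract describes it: alternate between a discrete grid-CRF subproblem solved by α-expansion and an ordinary GD subproblem on the network. The names `net`, `optimizer`, `seeds`, and `alpha_expansion` are hypothetical placeholders (the last standing in for any grid-CRF solver, such as one built on the gco library); this is an illustration of the alternation, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def adm_train_step(net, optimizer, images, seeds, alpha_expansion,
                   potts_weight=1.0, inner_iters=10):
    """One ADM iteration: CRF subproblem, then network subproblem."""
    # ---- Subproblem 1: latent labeling ----
    # With the network fixed, find a discrete labeling that trades off
    # agreement with the current soft predictions (unary term) against the
    # nearest-neighbor Potts regularizer (pairwise term). On a pixel grid
    # this is a classic MRF/CRF energy, minimized here by alpha-expansion.
    with torch.no_grad():
        unary = -F.log_softmax(net(images), dim=1)   # (B, C, H, W) unaries
    # `alpha_expansion` is an assumed solver returning (B, H, W) int labels.
    latent = alpha_expansion(unary.cpu().numpy(), potts_weight)

    # ---- Subproblem 2: network parameters ----
    # With the labeling fixed, fit the network to it (plus the partial
    # scribble/seed supervision) by ordinary cross-entropy and GD.
    latent = torch.as_tensor(latent, dtype=torch.long, device=images.device)
    for _ in range(inner_iters):
        optimizer.zero_grad()
        logits = net(images)
        loss = F.cross_entropy(logits, latent)
        # `seeds` marks unlabeled pixels with -100, so only scribbles count.
        loss = loss + F.cross_entropy(logits, seeds, ignore_index=-100)
        loss.backward()
        optimizer.step()
    return loss.item()
```

The point of the split is that the non-smooth Potts term never has to be differentiated: it is handled exactly by the combinatorial solver, while GD only ever sees a smooth cross-entropy fit to the latent labeling.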

