Unsupervised Domain Adaptation for 3D Keypoint Prediction from a Single Depth Scan

12/15/2017
by Xingyi Zhou, et al.

In this paper, we introduce a novel unsupervised domain adaptation technique for the task of 3D keypoint prediction from a single depth scan/image. Our key idea is to utilize the fact that predictions from different views of the same or similar objects should be consistent with each other. Such view consistency provides effective regularization for keypoint prediction on unlabeled instances. In addition, we introduce a geometric alignment term to regularize predictions in the target domain. The resulting loss function can be effectively optimized via alternating minimization. We demonstrate the effectiveness of our approach on real datasets, where it outperforms state-of-the-art general-purpose domain adaptation techniques.
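
As a rough illustration of the view-consistency idea, the PyTorch sketch below penalizes disagreement between keypoints predicted from two views of the same object after mapping both predictions into a common camera frame, and combines that term with a supervised loss on labeled source data. All names (`model`, `R_ab`, `t_ab`, `lam`, and the batch layout) are illustrative assumptions, not the paper's actual implementation, which also includes a geometric alignment term and optimizes via alternating minimization.

```python
import torch

def view_consistency_loss(kp_a, kp_b, R_ab, t_ab):
    """Penalize disagreement between keypoints predicted from two views.

    kp_a: (B, K, 3) keypoints predicted from view A, in A's camera frame.
    kp_b: (B, K, 3) keypoints predicted from view B, in B's camera frame.
    R_ab: (B, 3, 3) rotation taking A's camera frame to B's.
    t_ab: (B, 3)    translation taking A's camera frame to B's.
    """
    # Transform view-A predictions into view B's frame: R_ab @ p + t_ab.
    kp_a_in_b = torch.einsum('bij,bkj->bki', R_ab, kp_a) + t_ab.unsqueeze(1)
    # Mean squared disagreement over keypoints and batch.
    return ((kp_a_in_b - kp_b) ** 2).sum(dim=-1).mean()

def adaptation_step(model, optimizer, src_batch, tgt_batch, lam=1.0):
    """One training step: supervised source loss + unlabeled target
    view-consistency regularization (hypothetical batch layout)."""
    depth_s, kp_gt = src_batch                 # labeled source depth image
    depth_a, depth_b, R_ab, t_ab = tgt_batch   # two unlabeled target views

    kp_s = model(depth_s)
    sup = ((kp_s - kp_gt) ** 2).sum(dim=-1).mean()
    cons = view_consistency_loss(model(depth_a), model(depth_b), R_ab, t_ab)

    loss = sup + lam * cons
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the consistency term needs no target-domain labels, only the relative pose between views; that is what lets unlabeled target instances regularize the predictor.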


Related research

08/04/2020 · Shape Consistent 2D Keypoint Estimation under Domain Shift
Recent unsupervised domain adaptation methods based on deep architecture...

09/14/2020 · Unsupervised Domain Adaptation by Uncertain Feature Alignment
Unsupervised domain adaptation (UDA) deals with the adaptation of models...

09/09/2019 · Unsupervised Domain Adaptation for Depth Prediction from Images
State-of-the-art approaches to infer dense depth measurements from image...

03/07/2019 · Unsupervised Domain Adaptation using Feature-Whitening and Consensus Loss
A classifier trained on a dataset seldom works on other datasets obtaine...

06/11/2019 · SALT: Subspace Alignment as an Auxiliary Learning Task for Domain Adaptation
Unsupervised domain adaptation aims to transfer and adapt knowledge lear...

11/17/2021 · See Eye to Eye: A Lidar-Agnostic 3D Detection Framework for Unsupervised Multi-Target Domain Adaptation
Sampling discrepancies between different manufacturers and models of lid...
