Improving task-specific representation via 1M unlabelled images without any extra knowledge

06/24/2020
by Aayush Bansal, et al.

We present a case study on improving a task-specific representation by leveraging a million unlabelled images without any extra knowledge. We propose an exceedingly simple method of conditioning an existing representation on a diverse data distribution, and observe that a model trained on diverse examples acts as a better initialization. We extensively study our findings for the tasks of surface normal estimation and semantic segmentation from a single image. We improve surface normal estimation on the NYU-v2 depth dataset and semantic segmentation on PASCAL VOC by 4% over the base model, without using any task-specific knowledge or auxiliary tasks, without changing hyper-parameters, and without any modification to the underlying neural network architecture.
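The abstract does not spell out the conditioning mechanism, so the sketch below shows one plausible reading in PyTorch: the existing task model pseudo-labels the large unlabelled pool, a copy is trained on those pseudo-labels, and the result serves as the better initialization for task-specific fine-tuning. The self-training choice and every name here (condition_on_diverse_data, base_model, unlabelled_loader) are illustrative assumptions, not the paper's stated method.

```python
# Hypothetical sketch, NOT the paper's confirmed procedure: condition an
# existing task model on a diverse unlabelled pool via self-training.
import copy

import torch
import torch.nn as nn
from torch.utils.data import DataLoader


def condition_on_diverse_data(base_model: nn.Module,
                              unlabelled_loader: DataLoader,
                              epochs: int = 1,
                              lr: float = 1e-4) -> nn.Module:
    """Train a copy of `base_model` on its own predictions over a large,
    diverse set of unlabelled images; return it as an initialization."""
    teacher = base_model.eval()                  # frozen pseudo-labeller
    student = copy.deepcopy(base_model).train()  # model being conditioned
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    # Cross-entropy works per-pixel for dense tasks like segmentation:
    # logits (N, C, H, W) against integer targets (N, H, W).
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(epochs):
        for images in unlabelled_loader:
            with torch.no_grad():
                # Pseudo-labels: argmax over the teacher's class scores.
                pseudo = teacher(images).argmax(dim=1)
            opt.zero_grad()
            loss = loss_fn(student(images), pseudo)
            loss.backward()
            opt.step()

    # Fine-tune the returned model on the labelled task data as usual.
    return student
```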

research · 05/17/2021
Learning to Relate Depth and Semantics for Unsupervised Domain Adaptation
We present an approach for encoding visual task relationships to improve...

research · 06/08/2019
Pattern-Affinitive Propagation across Depth, Surface Normal and Semantic Segmentation
In this paper, we propose a novel Pattern-Affinitive Propagation (PAP) f...

research · 04/26/2019
Representation Similarity Analysis for Efficient Task taxonomy & Transfer Learning
Transfer learning is widely used in deep neural network models when ther...

research · 04/03/2023
Knowledge Accumulation in Continually Learned Representations and the Issue of Feature Forgetting
By default, neural networks learn on all training data at once. When suc...

research · 02/21/2017
PixelNet: Representation of the pixels, by the pixels, and for the pixels
We explore design principles for general pixel-level prediction problems...

research · 01/26/2023
Learning Good Features to Transfer Across Tasks and Domains
Availability of labelled data is the major obstacle to the deployment of...

research · 10/17/2021
Quantifying the Task-Specific Information in Text-Based Classifications
Recently, neural natural language models have attained state-of-the-art ...
