Beyond Flatland: Pre-training with a Strong 3D Inductive Bias

11/30/2021
by Shubhaankar Gupta, et al.

Pre-training on large-scale databases of natural images and then fine-tuning the resulting models for the application at hand, i.e. transfer learning, is a popular strategy in computer vision. However, Kataoka et al., 2020 introduced a technique to eliminate the need for natural images in supervised deep learning by proposing a novel synthetic, formula-based method that generates 2D fractals as the training corpus. Using one synthetically generated fractal per class, they achieved transfer-learning results comparable to models pre-trained on natural images. In this project, we take inspiration from their work and build on this idea using 3D procedural object renders. Since the image-formation process in the natural world is grounded in its 3D structure, we expect pre-training with 3D mesh renders to provide an implicit bias that leads to better generalization in a transfer-learning setting, and that invariances to 3D rotation and illumination are easier to learn from 3D data. As in the previous work, our training corpus will be fully synthetic and derived from simple procedural strategies; we will go beyond classic data augmentation and also vary illumination and pose, which are controllable in our setting, and study their effect on transfer-learning capabilities relative to prior work. In addition, we will compare the 2D fractal and 3D procedural object networks to human and non-human primate brain data to learn more about the 2D vs. 3D nature of biological vision.
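To make the formula-driven idea concrete, here is a minimal sketch of how a 2D fractal "class" can be synthesized from a randomly sampled iterated function system (IFS) and rendered with the chaos game. This is an illustrative assumption of the general approach, not the exact parameter ranges or filtering pipeline of Kataoka et al.; all function names are hypothetical.

```python
import numpy as np

def random_affine_maps(n_maps, rng):
    """Sample the affine transforms (A, b) of an iterated function system.

    Coefficient ranges are illustrative, not those of the original work.
    """
    maps = []
    for _ in range(n_maps):
        A = rng.uniform(-1, 1, (2, 2))
        # Rescale so each map is a contraction (spectral norm < 1), which
        # guarantees the iteration converges to a bounded attractor.
        A *= rng.uniform(0.3, 0.9) / np.linalg.norm(A, 2)
        b = rng.uniform(-1, 1, 2)
        maps.append((A, b))
    return maps

def render_fractal(maps, n_points=10000, size=64, rng=None):
    """Render one fractal class as a binary image via the chaos game:
    repeatedly apply a randomly chosen affine map to a point, then
    rasterize the visited locations."""
    rng = rng or np.random.default_rng(0)
    pts = np.empty((n_points, 2))
    x = np.zeros(2)
    for i in range(n_points):
        A, b = maps[rng.integers(len(maps))]
        x = A @ x + b
        pts[i] = x
    # Normalize the point cloud into [0, size) and rasterize.
    pts -= pts.min(axis=0)
    span = pts.max(axis=0)
    span[span == 0] = 1.0  # avoid division by zero for degenerate clouds
    pts = (pts / span * (size - 1)).astype(int)
    img = np.zeros((size, size), dtype=np.uint8)
    img[pts[:, 1], pts[:, 0]] = 1
    return img

rng = np.random.default_rng(42)
image = render_fractal(random_affine_maps(3, rng), rng=rng)
```

Each distinct set of sampled maps defines one pre-training category, so an arbitrarily large labeled corpus can be generated without any natural images; the 3D analogue in this project replaces the IFS with procedural mesh generation plus controllable pose and illumination.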

Related research

- 03/24/2021: Can Vision Transformers Learn without Natural Images?
- 01/21/2021: Pre-training without Natural Images
- 07/27/2023: Pre-training Vision Transformers with Very Limited Synthesized Images
- 11/16/2018: Domain Adaptive Transfer Learning with Specialist Models
- 05/31/2021: Effect of large-scale pre-training on full and few-shot transfer learning for natural and medical images
- 06/04/2019: Transfer Learning with intelligent training data selection for prediction of Alzheimer's Disease
- 04/24/2020: How Much Off-The-Shelf Knowledge Is Transferable From Natural Images To Pathology Images?
