Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification

10/25/2020
by Ayush Bhardwaj, et al.

Real-world scenarios pose several challenges to deep learning based computer vision techniques despite their tremendous success in research. Deeper models provide better performance but are challenging to deploy, and knowledge distillation allows us to train smaller models with minimal loss in performance. A deployed model must also handle open set samples from classes outside those it was trained on, identifying them as unknown while classifying the known ones correctly. Finally, most existing image recognition research focuses only on two-dimensional snapshots of real-world three-dimensional objects. In this work, we aim to bridge these three research fields, which have been developed independently until now despite being deeply interrelated. We propose a joint knowledge distillation and open set recognition training methodology for three-dimensional object recognition. Through various experiments, we demonstrate that the proposed method yields a much smaller model that takes only a minimal hit in performance while remaining capable of open set recognition on 3D point cloud data.
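The abstract does not spell out the training objective, so the following is only a minimal sketch of how joint distillation and open set rejection is commonly set up: a standard Hinton-style distillation loss (softened teacher logits matched via KL divergence plus a cross-entropy term) for training the small student, and a simple maximum-softmax-probability threshold to flag unknown samples at inference. The teacher, student, temperature T, weight alpha, and threshold tau below are illustrative placeholders, not the authors' actual components or hyperparameters.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of the soft (teacher-matching) and hard (label) losses."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients to account for the temperature
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_step(student, teacher, points, labels, optimizer):
    """One distillation step on a batch of point clouds (B x N x 3)."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(points)  # teacher is frozen
    student_logits = student(points)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def predict_open_set(student, points, tau=0.5):
    """Classify known samples; reject low-confidence ones as unknown (-1)."""
    student.eval()
    with torch.no_grad():
        probs = F.softmax(student(points), dim=1)
    conf, pred = probs.max(dim=1)
    pred[conf < tau] = -1  # -1 marks an open set (unknown) sample
    return pred

In practice the rejection rule could equally be based on calibrated logits or a learned "unknown" score rather than a fixed softmax threshold; the sketch only illustrates how the two training goals can sit side by side on 3D point cloud input.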


Related research

06/27/2021: Learning without Forgetting for 3D Point Cloud Objects
When we fine-tune a well-trained deep learning model for a new set of cl...

12/17/2022: 3D Point Cloud Pre-training with Knowledge Distillation from 2D Images
The recent success of pre-trained 2D vision models is mostly attributabl...

07/26/2021: Text is Text, No Matter What: Unifying Text Recognition using Knowledge Distillation
Text recognition remains a fundamental and extensively researched topic ...

06/25/2023: Feature Adversarial Distillation for Point Cloud Classification
Due to the point cloud's irregular and unordered geometry structure, con...

05/04/2023: Smaller3d: Smaller Models for 3D Semantic Segmentation Using Minkowski Engine and Knowledge Distillation Methods
There are various optimization techniques in the realm of 3D, including ...

04/28/2023: Multi-to-Single Knowledge Distillation for Point Cloud Semantic Segmentation
3D point cloud semantic segmentation is one of the fundamental tasks for...

10/02/2020: Neighbourhood Distillation: On the benefits of non end-to-end distillation
End-to-end training with back propagation is the standard method for tra...
