Smaller3d: Smaller Models for 3D Semantic Segmentation Using Minkowski Engine and Knowledge Distillation Methods

05/04/2023
by Alen Adamyan, et al.

There are various optimization techniques in the realm of 3D, including point cloud-based approaches that use meshes, textures, and voxels to optimize how data is stored and how computations are performed in 3D. These techniques employ methods such as feed-forward networks, 3D convolutions, graph neural networks, transformers, and sparse tensors. However, 3D remains one of the most computationally expensive fields, and these methods have yet to reach their full potential due to their large capacity, complexity, and computational limits. This paper proposes applying knowledge distillation techniques, especially for sparse tensors in 3D deep learning, to reduce model sizes while maintaining performance. We analyze and propose different loss functions, including standard methods and combinations of various losses, to approximate the performance of state-of-the-art sparse convolutional networks. Our experiments are conducted on the standard ScanNet V2 dataset; we achieve a gap of around 2.6% mIoU with a 4-times-smaller model and around 8% with a 16-times-smaller model, relative to the latest state-of-the-art spatio-temporal ConvNet-based models.
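The paper's exact loss combinations are not reproduced on this page. As a minimal sketch of the standard soft-target distillation term that such approaches typically build on (a weighted KL divergence between temperature-softened teacher and student distributions plus hard-label cross-entropy), the following PyTorch function is illustrative: the name distillation_loss and the temperature and alpha parameters are assumptions, not the paper's definitions, and the per-point logits would correspond to the features (the .F attribute) of a MinkowskiEngine SparseTensor output.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 4.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Weighted sum of a soft-target KL term and hard-label cross-entropy.

    Both logit tensors have shape (N, num_classes), e.g. per-point class
    scores taken from a sparse segmentation network's output features.
    All names and defaults here are illustrative assumptions.
    """
    # Temperature-softened distributions; student in log space for kl_div.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scaling by T^2 keeps the soft term's gradient magnitude comparable
    # across temperatures (as in Hinton et al.'s distillation setup).
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In a typical training loop the teacher would run in eval mode under torch.no_grad(), so only the smaller student's parameters are updated against this combined loss.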


