Continual learning on 3D point clouds with random compressed rehearsal

by Maciej Zamorski, et al.

Contemporary deep neural networks offer state-of-the-art results when applied to visual reasoning, e.g., in the context of 3D point cloud data. Point clouds are an important data type for the precise modeling of three-dimensional environments, but effective processing of this type of data proves to be challenging. In the world of large, heavily parameterized network architectures and continuously streamed data, there is an increasing need for machine learning models that can be trained on additional data. Unfortunately, currently available models cannot fully leverage training on additional data without losing their past knowledge. Combating this phenomenon, called catastrophic forgetting, is one of the main objectives of continual learning. Continual learning for deep neural networks has been an active field of research, primarily in 2D computer vision, natural language processing, reinforcement learning, and robotics. However, in 3D computer vision there are hardly any continual learning solutions specifically designed to take advantage of point cloud structure. This work proposes a novel neural network architecture capable of continual learning on 3D point cloud data. We exploit the structural properties of point clouds to preserve a heavily compressed set of past data. By using rehearsal and reconstruction as regularization methods for the learning process, our approach achieves a significant decrease in catastrophic forgetting compared to existing solutions on several of the most popular point cloud datasets. We consider two continual learning settings: one in which the task is known beforehand, and the more challenging scenario in which task information is unknown to the model.
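The core idea summarized above can be illustrated with a small sketch. Because point clouds are unordered sets of points, a past example can be kept in heavily compressed form simply by storing a small random subset of its points, and these compressed samples can later be mixed into training batches as rehearsal data. The buffer capacity, the number of points kept per cloud, and the replacement policy below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def compress(cloud, keep=64):
    """Keep a random subset of points as a compressed rehearsal sample.

    Illustrative assumption: keeping 64 of the original points; the paper's
    actual compression scheme and ratio may differ.
    """
    idx = np.random.choice(len(cloud), size=keep, replace=False)
    return cloud[idx]

class RehearsalBuffer:
    """Fixed-capacity memory of compressed point clouds from past tasks."""

    def __init__(self, capacity=200):
        self.capacity = capacity
        self.samples = []  # list of (compressed_cloud, label) pairs

    def add(self, cloud, label, keep=64):
        if len(self.samples) < self.capacity:
            self.samples.append((compress(cloud, keep), label))
        else:
            # Random replacement keeps the buffer at a fixed size.
            j = np.random.randint(0, self.capacity)
            self.samples[j] = (compress(cloud, keep), label)

    def sample(self, n):
        """Draw n stored samples to mix into the current training batch."""
        idx = np.random.choice(len(self.samples),
                               size=min(n, len(self.samples)),
                               replace=False)
        return [self.samples[i] for i in idx]

# Usage: while training on a new task, replay compressed samples from
# earlier tasks so the network is regularized against forgetting.
buf = RehearsalBuffer(capacity=100)
for _ in range(500):
    cloud = np.random.randn(2048, 3)  # a dummy 2048-point cloud
    buf.add(cloud, label=0)
replay = buf.sample(8)
```

In a full training loop, the replayed samples would contribute an extra loss term (e.g., classification or reconstruction loss on the compressed clouds) alongside the loss on the current task's data.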




