
Collaborative Method for Incremental Learning on Classification and Generation

by Byungju Kim, et al.

Although well-trained deep neural networks have shown remarkable performance on numerous tasks, they rapidly forget what they have learned as soon as they begin training on additional data once the previous data are no longer provided. In this paper, we introduce a novel algorithm, Incremental Class Learning with Attribute Sharing (ICLAS), for incremental class learning with deep neural networks. As one of its components, we also introduce a generative model, incGAN, which can generate images with greater variety than the training data. Under the challenging condition of data deficiency, ICLAS incrementally trains both the classification and generation networks. Because ICLAS trains both networks, the algorithm can perform multiple rounds of incremental class learning. Experiments on the MNIST dataset demonstrate the advantages of our algorithm.
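The abstract does not spell out the internals of ICLAS or incGAN, but the core idea it describes, retraining a classifier together with a generative model so that old classes can be replayed when only new-class data arrive, can be illustrated with a toy sketch. Everything below is a hypothetical stand-in: `ToyGenerator` (a per-class Gaussian sampler) plays the role of incGAN, `CentroidClassifier` plays the role of the classification network, and `incremental_phase` is an assumed name for one incremental learning step; none of these come from the paper.

```python
import random

random.seed(0)

class ToyGenerator:
    """Stand-in for incGAN: models each class as an independent Gaussian
    per feature dimension and samples replay data from it."""
    def __init__(self):
        self.stats = {}  # label -> (per-dim means, per-dim stds)

    def fit(self, data):
        by_label = {}
        for x, y in data:
            by_label.setdefault(y, []).append(x)
        for y, xs in by_label.items():
            dims = len(xs[0])
            means = [sum(x[d] for x in xs) / len(xs) for d in range(dims)]
            stds = [max(1e-6, (sum((x[d] - means[d]) ** 2 for x in xs) / len(xs)) ** 0.5)
                    for d in range(dims)]
            self.stats[y] = (means, stds)

    def sample(self, label, n):
        means, stds = self.stats[label]
        return [([random.gauss(m, s) for m, s in zip(means, stds)], label)
                for _ in range(n)]

class CentroidClassifier:
    """Stand-in for the classification network: nearest-centroid rule."""
    def __init__(self):
        self.centroids = {}

    def fit(self, data):
        by_label = {}
        for x, y in data:
            by_label.setdefault(y, []).append(x)
        self.centroids = {
            y: [sum(x[d] for x in xs) / len(xs) for d in range(len(xs[0]))]
            for y, xs in by_label.items()
        }

    def predict(self, x):
        return min(self.centroids,
                   key=lambda y: sum((a - b) ** 2
                                     for a, b in zip(x, self.centroids[y])))

def incremental_phase(classifier, generator, new_data, replay_per_class=50):
    """One incremental step: replay previously learned classes from the
    generator, then retrain BOTH models on replayed + new data, so further
    incremental phases remain possible."""
    replay = []
    for y in list(generator.stats):
        replay += generator.sample(y, replay_per_class)
    combined = new_data + replay
    classifier.fit(combined)
    generator.fit(combined)

def blob(center, label, n=50, spread=0.3):
    """Synthetic class data clustered around `center`."""
    return [([random.gauss(c, spread) for c in center], label) for _ in range(n)]

clf, gen = CentroidClassifier(), ToyGenerator()
# Phase 1: classes 0 and 1 arrive with real data.
incremental_phase(clf, gen, blob([0.0, 0.0], 0) + blob([5.0, 5.0], 1))
# Phase 2: only class 2 data arrive; classes 0 and 1 survive via replay.
incremental_phase(clf, gen, blob([10.0, 10.0], 2))

print(clf.predict([0.0, 0.0]), clf.predict([5.0, 5.0]), clf.predict([10.0, 10.0]))
# → 0 1 2
```

The point of the sketch is the training loop, not the models: without the replay step in `incremental_phase`, refitting on phase-2 data alone would erase classes 0 and 1, which is exactly the forgetting problem the abstract describes.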


Transfer Learning with Sparse Associative Memories

In this paper, we introduce a novel layer designed to be used as the out...

Bayesian Incremental Learning for Deep Neural Networks

In industrial machine learning pipelines, data often arrive in parts. Pa...

Incremental Sequence Learning

Deep learning research over the past years has shown that by increasing ...

Learning Representations from Temporally Smooth Data

Events in the real world are correlated across nearby points in time, an...

Incremental Deep Neural Network Learning using Classification Confidence Thresholding

Most modern neural networks for classification fail to take into account...

Extending Pretrained Segmentation Networks with Additional Anatomical Structures

Comprehensive surgical planning requires complex patient-specific anatomi...