Parallel Weight Consolidation: A Brain Segmentation Case Study

05/28/2018
by Patrick McClure, et al.

Collecting the large datasets needed to train deep neural networks can be very difficult, particularly for the many applications for which sharing and pooling data is complicated by practical, ethical, or legal concerns. However, it may be the case that derivative datasets or predictive models developed within individual sites can be shared and combined with fewer restrictions. Training on distributed datasets and combining the resulting networks is often framed as continual learning, but existing continual learning methods require networks to be trained sequentially. In this paper, we introduce parallel weight consolidation (PWC), a continual learning method for consolidating the weights of neural networks trained in parallel on independent datasets. We perform a brain segmentation case study using PWC to consolidate several dilated convolutional neural networks trained in parallel on independent structural magnetic resonance imaging (sMRI) datasets from different sites. We found that PWC led to increased performance on held-out test sets from the different sites, as well as on a very large and completely independent multi-site dataset. This demonstrates the feasibility of PWC for combining the knowledge learned by networks trained on different datasets.
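The abstract does not spell out the consolidation step itself. As a rough sketch under one plausible reading, the example below assumes each site's network provides an approximate Gaussian posterior over its weights (per-weight means and variances, e.g. from a variational or Laplace approximation) and consolidates the sites by precision-weighted averaging, i.e. a product of the per-site Gaussians. The function name consolidate_gaussians and the toy two-site values are illustrative, not taken from the paper.

```python
import numpy as np

def consolidate_gaussians(means, variances):
    """Combine per-site Gaussian weight posteriors into one posterior.

    Each site supplies a per-weight mean and variance. Treating the sites'
    factors as independent, their product is a Gaussian whose precision is
    the sum of the per-site precisions and whose mean is the
    precision-weighted average of the per-site means.
    """
    precisions = [1.0 / v for v in variances]
    total_precision = sum(precisions)
    consolidated_mean = sum(p * m for p, m in zip(precisions, means)) / total_precision
    consolidated_var = 1.0 / total_precision
    return consolidated_mean, consolidated_var

# Toy example: the same two weights learned at two sites with different certainty.
site_means = [np.array([0.8, -1.2]), np.array([1.0, -0.9])]
site_vars = [np.array([0.05, 0.20]), np.array([0.10, 0.10])]

mean, var = consolidate_gaussians(site_means, site_vars)
print(mean, var)  # each weight is pulled toward the more certain (lower-variance) site
```

Because the sites are combined in a single symmetric step rather than one after another, no site has to wait for another's trained network, which is the sense in which the consolidation is parallel rather than sequential.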


Related research

11/06/2018: Towards continual learning in medical imaging
This work investigates continual learning of two segmentation tasks in b...

07/19/2021: Adversarial Continual Learning for Multi-Domain Hippocampal Segmentation
Deep learning for medical imaging suffers from temporal and privacy-rela...

06/02/2023: Overcoming the Stability Gap in Continual Learning
In many real-world applications, deep neural networks are retrained from...

03/11/2019: Distributed deep learning for robust multi-site segmentation of CT imaging after traumatic brain injury
Machine learning models are becoming commonplace in the domain of medica...

10/09/2020: Continual learning using hash-routed convolutional neural networks
Continual learning could shift the machine learning paradigm from data c...

06/07/2016: Active Long Term Memory Networks
Continual Learning in artificial neural networks suffers from interferen...
