Multi-task pre-training of deep neural networks for digital pathology

05/05/2020
by Romain Mormont, et al.

In this work, we investigate multi-task learning as a way of pre-training models for classification tasks in digital pathology. It is motivated by the fact that many small and medium-sized datasets have been released by the community over the years, whereas the domain has no large-scale dataset comparable to ImageNet. We first assemble and transform many digital pathology datasets into a pool of 22 classification tasks and almost 900k images. Then, we propose a simple architecture and training scheme for creating a transferable model, together with a robust evaluation and selection protocol to evaluate our method. Depending on the target task, we show that our models used as feature extractors either improve significantly over ImageNet pre-trained models or provide comparable performance. Fine-tuning improves performance over feature extraction and compensates for the lack of specificity of ImageNet features, as both pre-training sources then yield comparable performance.
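The abstract does not spell out the architecture, but the usual design for this kind of multi-task pre-training is a single shared backbone with one classification head per task. The sketch below is a hypothetical PyTorch illustration, not the paper's implementation: the small CNN stands in for whatever shared encoder is actually used, and the task names are invented.

```python
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    """Shared feature extractor with one classification head per task.

    Hypothetical sketch of shared-backbone multi-task pre-training;
    the backbone here is a toy CNN standing in for a real trunk
    (e.g. a ResNet), and the task/head layout is assumed.
    """

    def __init__(self, task_num_classes, feat_dim=64):
        super().__init__()
        # Shared backbone: the part that is transferred to target tasks.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # One linear head per pre-training task; heads are discarded
        # after pre-training, keeping only the backbone.
        self.heads = nn.ModuleDict(
            {name: nn.Linear(feat_dim, n) for name, n in task_num_classes.items()}
        )

    def forward(self, x, task):
        # Route a batch belonging to `task` through its own head.
        return self.heads[task](self.backbone(x))


# Usage: each training step draws a batch from one task and applies
# cross-entropy through that task's head (task names are made up).
model = MultiTaskNet({"mitosis_detection": 2, "tissue_type": 8})
logits = model(torch.randn(4, 3, 64, 64), task="tissue_type")
```

After pre-training, `model.backbone` can serve as a frozen feature extractor or as the initialization for fine-tuning on a target pathology task.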


