Novel transfer learning schemes based on Siamese networks and synthetic data

11/21/2022
by   Dominik Stallmann, et al.

Transfer learning schemes based on deep networks that have been trained on huge image corpora offer state-of-the-art technologies in computer vision. Here, supervised and semi-supervised approaches constitute efficient technologies that work well with comparably small data sets. Yet, such applications are currently restricted to domains where suitable deep-network models are readily available. In this contribution, we address an important application area in the domain of biotechnology, the automatic analysis of CHO-K1 suspension growth in microfluidic single-cell cultivation, where data characteristics are very dissimilar to existing domains and trained deep networks cannot easily be adapted by classical transfer learning. We propose a novel transfer learning scheme that expands a recently introduced Twin-VAE architecture, trained on real and synthetic data, and we adapt its specialized training procedure to the transfer learning setting. In this specific domain, often few or no labels exist and annotations are costly. We investigate a novel transfer learning strategy that incorporates simultaneous retraining on natural and synthetic data using an invariant shared representation as well as suitable target variables, while learning to handle unseen data from a different microscopy technology. We show the superiority of this variation of our Twin-VAE architecture over the state-of-the-art transfer learning methodology in image processing as well as classical image processing techniques; this advantage persists even with strongly shortened training times and leads to satisfactory results in this domain. The source code is available at https://github.com/dstallmann/transfer_learning_twinvae; it works cross-platform and is open-source, free (MIT licensed) software. We make the data sets available at https://pub.uni-bielefeld.de/record/2960030.
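The core architectural idea — one shared encoder producing a domain-invariant latent representation, with separate decoders for natural and synthetic images trained simultaneously — can be sketched structurally. This is not the authors' implementation (see the linked repository for that); it is a minimal NumPy illustration with toy linear layers, hypothetical dimensions, and no gradient updates, intended only to show how both domains pass through the same encoder while each uses its own decoder and loss term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (the real model uses convolutional layers).
D_IN, D_LAT = 64, 8

# Shared encoder weights: the invariant representation used by both domains.
W_enc = rng.normal(scale=0.1, size=(D_IN, 2 * D_LAT))  # outputs mean and log-variance

# Separate decoders, one per domain ("twin" reconstruction heads).
W_dec = {
    "natural": rng.normal(scale=0.1, size=(D_LAT, D_IN)),
    "synthetic": rng.normal(scale=0.1, size=(D_LAT, D_IN)),
}

def encode(x):
    """Shared encoder: maps images from either domain into one latent space."""
    h = x @ W_enc
    return h[:, :D_LAT], h[:, D_LAT:]  # mu, log_var

def reparameterize(mu, log_var):
    """Standard VAE reparameterization trick."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def decode(z, domain):
    """Domain-specific decoder reconstructs from the shared latent code."""
    return z @ W_dec[domain]

def vae_loss(x, domain):
    """Reconstruction (MSE) plus KL divergence to a standard normal prior."""
    mu, log_var = encode(x)
    z = reparameterize(mu, log_var)
    x_hat = decode(z, domain)
    recon = np.mean((x - x_hat) ** 2)
    kl = -0.5 * np.mean(1.0 + log_var - mu**2 - np.exp(log_var))
    return recon + kl

# One "simultaneous" training step evaluates both domains through the
# same encoder, so gradients on W_enc would mix natural and synthetic data.
x_natural = rng.normal(size=(4, D_IN))
x_synthetic = rng.normal(size=(4, D_IN))
total_loss = vae_loss(x_natural, "natural") + vae_loss(x_synthetic, "synthetic")
print(round(total_loss, 3))
```

In a real training loop, the combined loss would be backpropagated through both decoders and the shared encoder at once, which is what couples the two domains into one invariant representation.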


