Learngene: Inheriting Condensed Knowledge from the Ancestry Model to Descendant Models

05/03/2023
by Qiufeng Wang, et al.

During the continuous evolution of one organism's ancestry, its genes accumulate extensive experiences and knowledge, enabling newborn descendants to rapidly adapt to their specific environments. Motivated by this observation, we propose a novel machine learning paradigm, Learngene, which enables learning models to incorporate three key characteristics of genes. (i) Accumulating: knowledge is accumulated during the continuous learning of an ancestry model. (ii) Condensing: the extensive accumulated knowledge is condensed into a much more compact information piece, the learngene. (iii) Inheriting: the condensed learngene is inherited so that descendant models can adapt to new environments more easily. Since accumulating has already been studied in well-developed paradigms such as large-scale pre-training and lifelong learning, we focus on condensing and inheriting, which raise three key issues for which we provide preliminary solutions in this paper: (i) Learngene Form: the learngene is set to a few integral layers that preserve the most commonality. (ii) Learngene Condensing: we identify which layers of the ancestry model are most similar to those of a pseudo descendant model. (iii) Learngene Inheriting: to construct distinct descendant models for specific downstream tasks, we stack randomly initialized layers on top of the learngene layers. Extensive experiments under various settings, including different network architectures such as Vision Transformer (ViT) and Convolutional Neural Networks (CNNs) on different datasets, confirm five advantages and two characteristics of Learngene.
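As a rough illustration of the condensing and inheriting steps described above, the following minimal PyTorch sketch (not taken from the paper) selects the ancestry layers whose activations agree most with a pseudo descendant model and then stacks randomly initialized layers on top of them. The toy MLP blocks, the cosine-similarity criterion, and the helper names (`condense_learngene`, `inherit`) are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of Learngene-style condensing and inheriting, assuming a toy MLP
# ancestry model and cosine similarity between layer activations as the selection
# criterion (the paper's networks and similarity measure may differ).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


def layer_similarity(ancestry_layers, pseudo_layers, probe_x):
    """Per-layer cosine similarity between ancestry and pseudo-descendant activations."""
    sims = []
    a, p = probe_x, probe_x
    with torch.no_grad():
        for la, lp in zip(ancestry_layers, pseudo_layers):
            a, p = la(a), lp(p)
            sims.append(F.cosine_similarity(a.flatten(1), p.flatten(1), dim=1).mean().item())
    return sims


def condense_learngene(ancestry_layers, pseudo_layers, probe_x, k=2):
    """Condensing: keep the k ancestry layers most similar to the pseudo descendant."""
    sims = layer_similarity(ancestry_layers, pseudo_layers, probe_x)
    keep = sorted(range(len(sims)), key=lambda i: sims[i], reverse=True)[:k]
    return nn.ModuleList(copy.deepcopy(ancestry_layers[i]) for i in sorted(keep))


def inherit(learngene, hidden_dim, num_new_layers, num_classes):
    """Inheriting: stack randomly initialized layers on top of the learngene layers."""
    new_layers = [nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
                  for _ in range(num_new_layers)]
    head = nn.Linear(hidden_dim, num_classes)
    return nn.Sequential(*learngene, *new_layers, head)


if __name__ == "__main__":
    dim = 32
    make_block = lambda: nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
    ancestry = nn.ModuleList(make_block() for _ in range(4))  # continually trained model
    pseudo = nn.ModuleList(make_block() for _ in range(4))    # pseudo descendant model
    probe = torch.randn(16, dim)

    learngene = condense_learngene(ancestry, pseudo, probe, k=2)
    descendant = inherit(learngene, dim, num_new_layers=2, num_classes=10)
    print(descendant(probe).shape)  # torch.Size([16, 10])
```

In this sketch the descendant model for a downstream task would be obtained by fine-tuning only the newly added layers (or the whole stack) on that task, with the inherited learngene layers providing the condensed common knowledge.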
