Deep Virtual Networks for Memory Efficient Inference of Multiple Tasks

04/09/2019
by Eunwoo Kim et al.

Deep networks consume a large amount of memory by their nature. A natural question arises: can we reduce that memory requirement while maintaining performance? In particular, in this work we address the problem of memory-efficient learning for multiple tasks. To this end, we propose a novel network architecture that produces multiple networks of different configurations, termed deep virtual networks (DVNs), for different tasks. Each DVN is specialized for a single task and structured hierarchically. The hierarchical structure, which contains multiple levels of hierarchy corresponding to different numbers of parameters, enables multiple inference for different memory budgets. The building block of a deep virtual network is a disjoint collection of parameters of a network, which we call a unit. The lowest level of hierarchy in a deep virtual network is a single unit, and each higher level contains the lower levels' units plus additional units. Given a budget on the number of parameters, a different level of a deep virtual network can be chosen to perform the task. A unit can be shared by different DVNs, allowing multiple DVNs to coexist in a single network. In addition, shared units assist the target task with knowledge learned from other tasks. This cooperative configuration of DVNs makes it possible to handle different tasks in a memory-aware manner. Our experiments show that the proposed method outperforms existing approaches for multiple tasks. Notably, ours is more efficient than others, as it allows memory-aware inference for all tasks.
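The hierarchy described in the abstract can be sketched as a simple data structure. The following is a minimal illustration (hypothetical names, not the authors' code), assuming a backbone whose parameters are partitioned into disjoint units, where level k of a task's DVN contains all units of levels 0..k-1 plus new ones, and a parameter budget selects the deepest level that fits:

```python
# Sketch of the deep-virtual-network idea: a backbone's parameters are
# partitioned into disjoint "units"; each task-specific DVN is a nested
# hierarchy of levels, and units may be shared across DVNs so several
# tasks coexist in one network. All class/method names are illustrative.

class Unit:
    """A disjoint slice of the backbone's parameters."""
    def __init__(self, name, num_params):
        self.name = name
        self.num_params = num_params

class DeepVirtualNetwork:
    """One task-specific virtual network built from (possibly shared) units."""
    def __init__(self, task, levels):
        # levels: list of lists of Units; hierarchy level k uses the
        # flattened union of levels[0..k] (higher levels are supersets).
        self.task = task
        self.levels = levels

    def units_at_level(self, k):
        # Level k contains all lower levels' units plus its own new units.
        return [u for lvl in self.levels[:k + 1] for u in lvl]

    def params_at_level(self, k):
        return sum(u.num_params for u in self.units_at_level(k))

    def level_for_budget(self, budget):
        # Pick the deepest hierarchy level whose parameter count fits
        # the memory budget; None if even the lowest level is too large.
        best = None
        for k in range(len(self.levels)):
            if self.params_at_level(k) <= budget:
                best = k
        return best

# Units shared across DVNs let multiple tasks share one set of parameters.
u0, u1, u2 = Unit("u0", 100), Unit("u1", 200), Unit("u2", 400)
dvn_a = DeepVirtualNetwork("task_a", [[u0], [u1], [u2]])
dvn_b = DeepVirtualNetwork("task_b", [[u1], [u0, u2]])  # reuses all units

print(dvn_a.params_at_level(1))      # level 1 of task A: u0 + u1 = 300
print(dvn_a.level_for_budget(350))   # deepest level within 350 params: 1
```

The nesting means a single trained network supports inference at several memory budgets per task, simply by truncating the hierarchy at the chosen level.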


