Multi-task hypernetworks

02/27/2019
by Sylwester Klocek et al.

The hypernetwork mechanism allows one neural network (the hypernetwork) to generate and train another neural network (the target network). In this paper, we extend this idea and show that hypernetworks can generate target networks customized to serve different purposes. In particular, we apply this mechanism to create a continuous functional representation of images: the hypernetwork takes an image and, at test time, produces the weights of a target network that approximates the image's RGB pixel intensities. Owing to the continuity of this representation, we can view the image at different scales or fill in missing regions. Second, we demonstrate how to design a hypernetwork that produces a generative model for a new data set at test time. Experimental results demonstrate that the proposed mechanism can be successfully applied to super-resolution and 2D object modeling.
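The core idea can be sketched in a few lines: a hypernetwork maps an image (here stood in for by an embedding vector) to the flat weight vector of a small target MLP, and that target MLP maps continuous (x, y) coordinates to RGB values, so it can be queried on any grid, including an upsampled one. All sizes and names below (the embedding dimension, the 2-16-3 target architecture, the linear stand-in hypernetwork) are illustrative assumptions, not the architecture used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target-network architecture (assumed for illustration): 2 -> 16 -> 3,
# i.e. (x, y) coordinates in, (R, G, B) intensities out.
D_IN, D_HID, D_OUT = 2, 16, 3
N_WEIGHTS = D_IN * D_HID + D_HID + D_HID * D_OUT + D_OUT  # 32+16+48+3 = 99

def target_forward(coords, flat_w):
    """Run the target MLP whose weights are supplied externally as one flat vector."""
    i = 0
    W1 = flat_w[i:i + D_IN * D_HID].reshape(D_IN, D_HID); i += D_IN * D_HID
    b1 = flat_w[i:i + D_HID];                             i += D_HID
    W2 = flat_w[i:i + D_HID * D_OUT].reshape(D_HID, D_OUT); i += D_HID * D_OUT
    b2 = flat_w[i:i + D_OUT]
    h = np.tanh(coords @ W1 + b1)
    return h @ W2 + b2

# Stand-in hypernetwork: a fixed linear map from an 8-dim image embedding
# to the 99 target weights. In the paper this is a trained network that
# consumes the image itself; the linear map here is a hypothetical placeholder.
EMB = 8
H = rng.normal(scale=0.1, size=(EMB, N_WEIGHTS))

def hypernetwork(image_embedding):
    return image_embedding @ H

emb = rng.normal(size=EMB)   # pretend embedding of one input image
w = hypernetwork(emb)        # per-image weights for the target network

# Because the representation is continuous, we may query arbitrary
# coordinates, e.g. a denser grid than the original pixel lattice
# (the basis for super-resolution and inpainting mentioned above).
xs, ys = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
xy = np.stack([xs, ys], axis=-1).reshape(-1, 2)
rgb = target_forward(xy, w)
print(rgb.shape)  # (64, 3): one RGB triple per queried coordinate
```

The key design point visible even in this sketch is that the target network has no trained parameters of its own; everything it knows about a particular image arrives through the weight vector emitted by the hypernetwork.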


Related research

11/30/2018  Super-Resolution based on Image-Adapted CNN Denoisers: Incorporating Generalization of Training Data and Internal Learning in Test Time
    While deep neural networks exhibit state-of-the-art results in the task ...

04/08/2020  Image super-resolution reconstruction based on attention mechanism and feature fusion
    Aiming at the problems that the convolutional neural networks neglect to...

12/30/2019  Self-Supervised Fine-tuning for Image Enhancement of Super-Resolution Deep Neural Networks
    While Deep Neural Networks (DNNs) trained for image and video super-reso...

06/14/2019  Fixing the train-test resolution discrepancy
    Data-augmentation is key to the training of neural networks for image cl...

10/23/2022  Single Image Super-Resolution via a Dual Interactive Implicit Neural Network
    In this paper, we introduce a novel implicit neural network for the task...

03/05/2023  Super-Resolution Neural Operator
    We propose Super-resolution Neural Operator (SRNO), a deep operator lear...

02/11/2022  The Dual Form of Neural Networks Revisited: Connecting Test Time Predictions to Training Patterns via Spotlights of Attention
    Linear layers in neural networks (NNs) trained by gradient descent can b...
