Meta-Neighborhoods

09/18/2019
by Siyuan Shan, et al.

Traditional methods for training neural networks use the training data only once: it is discarded after training. In this work, we instead also leverage the training data at test time to adapt the network and gain expressivity. Our approach, named Meta-Neighborhoods, is developed under a multi-task learning framework and generalizes k-nearest-neighbors methods. It flexibly adapts the network parameters to each query using information from that query's local neighborhood. This local information is learned and stored in a dictionary of learnable neighbors, rather than retrieved directly from the training set, for greater flexibility and performance. The network parameters and the dictionary are optimized end-to-end via meta-learning. Extensive experiments demonstrate that Meta-Neighborhoods consistently improves classification and regression performance across various network architectures and datasets, and yields larger gains than other state-of-the-art meta-learning methods designed to improve supervised learning.
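To make the idea concrete, the following is a minimal NumPy sketch of per-query adaptation against a dictionary of neighbors, in the spirit described above. It is not the paper's implementation: the dictionary entries (`K`, `V`), the linear regressor, the squared loss, and all hyperparameters here are illustrative assumptions, and the dictionary is fixed rather than meta-learned.

```python
import numpy as np

# Illustrative setup (not the paper's code): a small dictionary of
# "learnable neighbors" with keys K (inputs) and values V (targets).
# In Meta-Neighborhoods these would be meta-learned end-to-end with
# the model; here they are fixed for demonstration.
rng = np.random.default_rng(0)

d, n_dict = 4, 16                          # feature dim, dictionary size
K = rng.normal(size=(n_dict, d))           # neighbor keys
V = K @ np.array([1.0, -2.0, 0.5, 3.0])    # neighbor values (a linear task)

w0 = np.zeros(d)                           # initial weights of a linear regressor
lr, k = 0.05, 5                            # inner-loop step size, neighbors retrieved


def adapt_and_predict(x, w, K, V, k=k, lr=lr, steps=10):
    """Retrieve the k nearest dictionary neighbors to query x, fine-tune the
    linear model on them with a few gradient steps, then predict for x."""
    idx = np.argsort(np.linalg.norm(K - x, axis=1))[:k]
    Kn, Vn = K[idx], V[idx]                # the query's local neighborhood
    w = w.copy()
    for _ in range(steps):
        # Gradient of the mean squared loss on the retrieved neighbors.
        grad = 2.0 * Kn.T @ (Kn @ w - Vn) / k
        w -= lr * grad
    return x @ w, w


x = rng.normal(size=d)
pred, w_adapted = adapt_and_predict(x, w0, K, V)
print(pred)
```

The inner loop plays the role of the per-query adaptation: each test point gets its own briefly fine-tuned parameters, driven by its nearest learned neighbors rather than by raw training examples.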
