On Deep Set Learning and the Choice of Aggregations

03/18/2019
by Maximilian Soelch et al.

Recently, it has been shown that many functions on sets can be represented by sum decompositions. These decompositions lend themselves readily to neural approximation, extending the applicability of neural networks to set-valued inputs: Deep Set learning. This work investigates a core component of Deep Set architectures: the aggregation function. We suggest and examine alternatives to the commonly used aggregation functions, including learnable recurrent aggregation functions. Empirically, we show that Deep Set networks are highly sensitive to the choice of aggregation function: beyond improved performance, we find that learnable aggregations reduce hyper-parameter sensitivity and generalize better to out-of-distribution input sizes.
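For concreteness, the sketch below shows a minimal sum-decomposition (Deep Set) model with a swappable aggregation step, written in PyTorch. The class name, layer sizes, and the use of an LSTM as a stand-in for a learnable recurrent aggregation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DeepSet(nn.Module):
    """Sum-decomposition model: rho(aggregate({phi(x) for x in X})).

    `agg` selects the aggregation function. Names and sizes here are
    illustrative; the paper's learnable recurrent aggregation may be
    constructed differently.
    """
    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, agg: str = "sum"):
        super().__init__()
        self.phi = nn.Sequential(                     # per-element encoder
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.rho = nn.Sequential(                     # decoder on the aggregate
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )
        self.agg = agg
        if agg == "lstm":
            # Recurrent stand-in for a learnable aggregation: run an LSTM
            # over the encoded elements and keep the final hidden state.
            self.rnn = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)

    def aggregate(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, set_size, hidden_dim) -> (batch, hidden_dim)
        if self.agg == "sum":
            return h.sum(dim=1)
        if self.agg == "mean":
            return h.mean(dim=1)
        if self.agg == "max":
            return h.max(dim=1).values
        if self.agg == "lstm":
            _, (h_n, _) = self.rnn(h)
            return h_n[-1]                            # final hidden state
        raise ValueError(f"unknown aggregation: {self.agg}")

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, set_size, in_dim)
        return self.rho(self.aggregate(self.phi(x)))

model = DeepSet(in_dim=2, hidden_dim=64, out_dim=1, agg="mean")
out = model(torch.randn(8, 10, 2))                    # 8 sets of 10 points each
```

Sum, mean, and max commute with permutations of the set elements, so those variants are permutation-invariant by construction; the recurrent stand-in depends on the element ordering, which is part of what makes learnable aggregations a non-trivial design point.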

