Neural Set Function Extensions: Learning with Discrete Functions in High Dimensions

08/08/2022
by   Nikolaos Karalias, et al.

Integrating functions on discrete domains into neural networks is key to developing their capability to reason about discrete objects. However, discrete domains are (1) not naturally amenable to gradient-based optimization, and (2) incompatible with deep learning architectures that rely on representations in high-dimensional vector spaces. In this work, we address both difficulties for set functions, which capture many important discrete problems. First, we develop a framework for extending set functions onto low-dimensional continuous domains, where many extensions are naturally defined. Our framework subsumes many well-known extensions as special cases. Second, to avoid undesirable low-dimensional neural network bottlenecks, we convert low-dimensional extensions into representations in high-dimensional spaces, taking inspiration from the success of semidefinite programs for combinatorial optimization. Empirically, we observe benefits of our extensions for unsupervised neural combinatorial optimization, in particular with high-dimensional representations.
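To make the notion of a continuous extension concrete, a canonical example (not the paper's own construction, which generalizes it) is the Lovász extension: given a set function f on subsets of {1, …, n} with f(∅) = 0, it interpolates f at a fractional point x ∈ [0, 1]^n by sorting the coordinates and accumulating marginal gains. A minimal sketch, where `f` is assumed to accept a frozenset of indices:

```python
import numpy as np

def lovasz_extension(f, x):
    """Evaluate the Lovasz extension of a set function f at x in [0,1]^n.

    f: callable mapping a frozenset of indices to a float, with f(frozenset()) == 0.
    x: 1-D numpy array of fractional coordinates.
    """
    order = np.argsort(-x)          # coordinates in decreasing order
    S = set()
    prev_value = f(frozenset())      # starts at f(empty set) = 0
    total = 0.0
    for i in order:
        S.add(int(i))
        cur_value = f(frozenset(S))
        total += x[i] * (cur_value - prev_value)  # weight the marginal gain
        prev_value = cur_value
    return total
```

On indicator vectors of sets, this recovers f exactly; for example, with the cardinality function f(S) = |S| the extension reduces to the sum of the coordinates, and with f(S) = 1 for nonempty S it reduces to the maximum coordinate.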


