Logical Activation Functions: Logit-space equivalents of Boolean Operators

10/22/2021
by Scott C. Lowe, et al.

Neuronal representations within artificial neural networks are commonly understood as logits, representing the log-odds score of the presence (versus absence) of features within the stimulus. Under this interpretation, we can derive from their logits the probability P(x_0 ∧ x_1) that a pair of independent features are both present in the stimulus. By converting the resulting probability back into a logit, we obtain a logit-space equivalent of the AND operation. However, since this function involves taking multiple exponents and logarithms, it is not well suited for direct use within neural networks. We therefore constructed an efficient approximation named AND_AIL (the AND operator Approximate for Independent Logits), which uses only comparison and addition operations and can be deployed as an activation function in neural networks. Like MaxOut, AND_AIL is a generalization of ReLU to two dimensions. Additionally, we constructed efficient approximations of the logit-space equivalents of the OR and XNOR operators. We deployed these new activation functions, both in isolation and in conjunction, and demonstrated their effectiveness on a variety of tasks including image classification, transfer learning, abstract reasoning, and compositional zero-shot learning.
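To make the construction concrete, here is a minimal NumPy sketch. The exact logit-space AND follows directly from the abstract: with p_i = sigmoid(x_i), the probability that two independent features are both present is p_0 * p_1, and its logit is log(p_0 p_1) - log(1 - p_0 p_1). The comparison-and-addition approximation shown alongside it is a hedged reconstruction that matches the exact operator's asymptotic behaviour (roughly x_0 + x_1 when both logits are negative, roughly min(x_0, x_1) otherwise); the function names and the XNOR form are illustrative assumptions, not necessarily the paper's exact AND_AIL/XNOR_AIL definitions.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

def logit_and_exact(x0, x1):
    """Exact logit-space AND for independent features.

    With p_i = sigmoid(x_i), the probability that both features are
    present is p0 * p1; we return its log-odds. This involves exp/log
    evaluations, hence the motivation for a cheaper approximation.
    """
    p = expit(x0) * expit(x1)
    return np.log(p) - np.log1p(-p)

def and_ail_sketch(x0, x1):
    """Comparison-and-addition approximation of logit_and_exact.

    A hedged reconstruction from the asymptotics, not necessarily the
    paper's exact AND_AIL: behaves like x0 + x1 when both logits are
    negative, and like min(x0, x1) otherwise.
    """
    return np.minimum(x0, x1) + np.minimum(np.maximum(x0, x1), 0.0)

def xnor_ail_sketch(x0, x1):
    """Illustrative logit-space XNOR: strongly positive when the two
    logits agree in sign, strongly negative when they disagree.
    An assumed form, not confirmed from the source.
    """
    return np.sign(x0 * x1) * np.minimum(np.abs(x0), np.abs(x1))

# Quick sanity check: the approximation tracks the exact operator.
x0 = np.array([-4.0, -4.0, 3.0])
x1 = np.array([-2.0,  5.0, 4.0])
print(logit_and_exact(x0, x1))  # ~[-6.14, -4.01,  2.67]
print(and_ail_sketch(x0, x1))   #  [-6.  , -4.  ,  3.  ]
```

The appeal of the approximation is that it trades the exp/log evaluations of the exact operator for the same kind of elementwise min/max primitives that ReLU and MaxOut use, which is what makes it cheap enough to serve as an activation function.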


