Embed Me If You Can: A Geometric Perceptron
Solving geometric tasks using machine learning is a challenging problem. Standard feed-forward neural networks combine linear or, if the bias parameter is included, affine layers with activation functions. Their capacity for geometric modeling is limited, which is why we introduce an alternative model, the multilayer geometric perceptron (MLGP), whose units are geometric neurons, i.e., combinations of hypersphere neurons. The hypersphere neuron is obtained by applying a conformal embedding of Euclidean space and, by virtue of Clifford algebra, can be implemented as a Cartesian dot product. We validate our method on the public 3D Tetris dataset, which consists of coordinates of geometric shapes, and show that it generalizes over geometric transformations. Our model outperforms the vanilla multilayer perceptron (MLP) while having fewer parameters and no activation function in the hidden layers other than the embedding. In the presence of noise in the data, it is also superior to the multilayer hypersphere perceptron (MLHP) proposed in prior work. In contrast to the latter, our method reflects the 3D geometry and provides a topological interpretation of the learned coefficients in the geometric neurons.
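To illustrate the idea behind the hypersphere neuron, here is a minimal sketch (function names are hypothetical, not from the paper's code) of the standard conformal embedding: a point x ∈ R^n maps to (x, -1, -||x||²/2) and a hypersphere with center c and radius r maps to (c, (||c||² - r²)/2, 1), so that their Cartesian dot product equals -(||x - c||² - r²)/2 and its sign classifies x as inside or outside the sphere.

```python
import numpy as np

def embed_point(x):
    # Conformal embedding of a Euclidean point x in R^n into R^(n+2):
    # X = (x, -1, -||x||^2 / 2)
    return np.concatenate([x, [-1.0, -0.5 * np.dot(x, x)]])

def embed_sphere(c, r):
    # A hypersphere with center c and radius r is represented as
    # S = (c, (||c||^2 - r^2) / 2, 1)
    return np.concatenate([c, [0.5 * (np.dot(c, c) - r**2), 1.0]])

def hypersphere_neuron(x, c, r):
    # The dot product of the two embeddings equals -(||x - c||^2 - r^2) / 2:
    # positive if x lies inside the sphere, negative if outside, zero on it.
    return embed_point(x) @ embed_sphere(c, r)

# Example: unit sphere at the origin in R^3
print(hypersphere_neuron(np.array([0.0, 0.0, 0.0]), np.zeros(3), 1.0))  # 0.5 (inside)
print(hypersphere_neuron(np.array([2.0, 0.0, 0.0]), np.zeros(3), 1.0))  # -1.5 (outside)
```

Because the sphere parameters enter only through an ordinary dot product, such a unit can be trained like a linear layer while its weights retain a geometric (spherical decision surface) interpretation.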