Hyperbolic Attention Networks

by Caglar Gulcehre et al.

We introduce hyperbolic attention networks to endow neural networks with enough capacity to match the complexity of data with hierarchical and power-law structure. A few recent approaches have successfully demonstrated the benefits of imposing hyperbolic geometry on the parameters of shallow networks. We extend this line of work by imposing hyperbolic geometry on the activations of neural networks, which lets us exploit hyperbolic geometry to reason about embeddings produced by deep networks. We achieve this by re-expressing the ubiquitous mechanism of soft attention in terms of operations defined on the hyperboloid and Klein models. Our method improves generalization on neural machine translation, learning on graphs, and visual question answering tasks while keeping the neural representations compact.
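The abstract's recipe, computing attention scores from hyperbolic (hyperboloid-model) distances and aggregating values via the Einstein midpoint in the Klein model, can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names (`expmap0`, `hyperbolic_attention`) are ours, and the scalars `beta` and `c`, which are learnable in the paper, are fixed here.

```python
import numpy as np

def minkowski_dot(u, v):
    # Lorentzian inner product <u,v>_L = -u0*v0 + sum_i ui*vi
    return -u[..., 0] * v[..., 0] + np.sum(u[..., 1:] * v[..., 1:], axis=-1)

def expmap0(v):
    # Map a Euclidean tangent vector at the hyperboloid's origin to a point
    # (cosh|v|, sinh|v| * v/|v|) satisfying <x,x>_L = -1.
    n = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), 1e-9)
    return np.concatenate([np.cosh(n), np.sinh(n) * v / n], axis=-1)

def hyperbolic_attention(queries, keys, values, beta=1.0, c=0.0):
    # Matching: score = -beta * d_H(q, k) - c, with the hyperboloid distance
    # d_H(q, k) = arccosh(-<q,k>_L).
    d = np.arccosh(np.clip(-minkowski_dot(queries[:, None, :], keys[None, :, :]),
                           1.0, None))
    logits = -beta * d - c
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)            # softmax over keys
    # Aggregation: project values to Klein coordinates and take the
    # Einstein midpoint  m = sum_i w_i * gamma_i * k_i / sum_i w_i * gamma_i.
    k = values[..., 1:] / values[..., :1]         # Klein-model coordinates
    gamma = 1.0 / np.sqrt(np.clip(1.0 - np.sum(k**2, axis=-1), 1e-9, None))
    wg = w * gamma[None, :]
    return (wg[:, :, None] * k[None, :, :]).sum(axis=1) / wg.sum(axis=1, keepdims=True)
```

Because the Einstein midpoint is a weighted gyromidpoint of points in the open unit ball, the aggregated outputs stay inside the ball, i.e. remain valid Klein-model points.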




Related papers:

- Hyperbolic Neural Networks
- Hyperbolic Convolution via Kernel Point Aggregation
- Hyperbolic Deep Neural Networks: A Survey
- The signature and cusp geometry of hyperbolic knots
- Fully Hyperbolic Neural Networks
- Hyperbolic Neural Networks for Molecular Generation
- Coneheads: Hierarchy Aware Attention
