SHAPE: Shifted Absolute Position Embedding for Transformers

09/13/2021
by Shun Kiyono, et al.

Position representation is crucial for building position-aware representations in Transformers. Existing position representations either fail to generalize to test data with unseen lengths or incur high computational cost. We investigate shifted absolute position embedding (SHAPE) to address both issues. The basic idea of SHAPE is to achieve shift invariance, a key property of recent successful position representations, by randomly shifting absolute positions during training. We demonstrate that SHAPE is empirically comparable to its counterpart while being simpler and faster.
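To make the core idea concrete, here is a minimal sketch of how random position shifting could be applied to learned absolute position embeddings. It assumes a standard PyTorch setup; the hyperparameter names (e.g. `max_shift`, `max_len`) and their values are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn


class ShiftedAbsolutePositionEmbedding(nn.Module):
    """Sketch of SHAPE-style position embedding: during training, all
    position indices in a sequence are offset by one random shift, so the
    model cannot rely on absolute positions and instead learns behaviour
    that is invariant to shifts (relative distances are unchanged)."""

    def __init__(self, d_model: int, max_len: int = 512, max_shift: int = 100):
        super().__init__()
        # The table must cover the largest possible shifted index.
        self.embed = nn.Embedding(max_len + max_shift, d_model)
        self.max_shift = max_shift

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        batch, seq_len, _ = token_embeddings.shape
        positions = torch.arange(seq_len, device=token_embeddings.device)
        if self.training:
            # One random offset per sequence, applied to every token,
            # so relative positions within the sequence are preserved.
            shift = torch.randint(0, self.max_shift + 1, (batch, 1),
                                  device=token_embeddings.device)
            positions = positions.unsqueeze(0) + shift  # (batch, seq_len)
        else:
            # At inference time, plain absolute positions are used.
            positions = positions.unsqueeze(0).expand(batch, -1)
        return token_embeddings + self.embed(positions)
```

Because the shift is drawn once per sequence rather than per token, the distances between tokens stay intact; only the absolute offset changes, which is what encourages shift invariance at negligible extra cost.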
