Swipe dynamics as a means of authentication: results from a Bayesian unsupervised approach
The field of behavioural biometrics stands as an appealing alternative to more traditional biometric systems, owing to its ease of use from the user's perspective and its potential robustness to presentation attacks. Because of the nature of the characteristic features being modelled, a person's behaviour can be measured in a myriad of ways, a number that grows with the evolution of embedded sensor technologies. This paper focuses on a specific type of behavioural biometric utilising swipe dynamics, also often referred to as touch gestures. A defining characteristic of swipe authentication, and of new behavioural biometrics in general, is the scarcity of data available to train and validate models, which makes unsupervised models particularly suited to the task. There is also a strong usability requirement to enrol a user with as few attempts as possible. From a machine learning perspective, this presents the classic curse-of-dimensionality problem: a model must be trained on a high-dimensional feature space with only a few observations. The problem of modelling behavioural biometrics in this setting is framed as one of learning probability distribution functions, viewed through the lens of Bayesian unsupervised models, which are well suited to the low-data regime. This paper presents results from a set of experiments consisting of 38 sessions with labelled victim data as well as blind and over-the-shoulder presentation attacks. Three models are compared on this dataset: two single-mode models, a shrunk-covariance Gaussian and a Bayesian Gaussian, and a Bayesian non-parametric infinite mixture of Gaussians, modelled as a Dirichlet Process (DP). Equal Error Rates (EER) for the three models are compared, with particular attention to how the EER of the two single-mode models varies at low numbers of enrolment samples.
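Two of the model families named above (the shrunk-covariance Gaussian and the DP mixture of Gaussians) can be sketched in a few lines with scikit-learn. This is a minimal illustration, not the paper's implementation: the swipe features, their dimensionality, and the enrolment/attack data below are all hypothetical placeholders, and the Bayesian single-mode Gaussian is not reproduced here.

```python
import numpy as np
from sklearn.covariance import LedoitWolf
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Simulated enrolment set: 20 swipes, each reduced to a 4-d feature vector
# (e.g. duration, path length, mean velocity, pressure) -- hypothetical
# features standing in for the low-data enrolment regime described above.
enrol = rng.normal(loc=0.0, scale=1.0, size=(20, 4))

# Single-mode model: Gaussian with a Ledoit-Wolf shrunk covariance estimate.
lw = LedoitWolf().fit(enrol)

def shrunk_score(x):
    # Negative squared Mahalanobis distance: higher means more genuine-looking.
    return -lw.mahalanobis(x)

# Multi-mode model: truncated Dirichlet-process mixture of Gaussians.
dpgmm = BayesianGaussianMixture(
    n_components=5,  # truncation level approximating the infinite mixture
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="diag",
    random_state=0,
).fit(enrol)

def dp_score(x):
    # Log predictive density of new swipes under the fitted mixture.
    return dpgmm.score_samples(x)

genuine = rng.normal(0.0, 1.0, size=(5, 4))  # same distribution as enrolment
attack = rng.normal(3.0, 1.0, size=(5, 4))   # shifted "impostor" swipes

print(shrunk_score(genuine).mean() > shrunk_score(attack).mean())  # True
print(dp_score(genuine).mean() > dp_score(attack).mean())          # True
```

In a real system, each score would be compared against a per-user threshold, and sweeping that threshold over genuine and attack scores is what yields the EER reported in the paper.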