Emotion Recognition From Gait Analyses: Current Research and Future Directions

03/13/2020 · by Shihao Xu, et al. · IEEE, Uppsala universitet, NetEase, Inc.

Human gait refers to a daily motion that not only reflects mobility but can also be used to identify the walker, by either human observers or computers. Recent studies reveal that gait even conveys information about the walker's emotion: individuals in different emotional states may show different gait patterns. The mapping between various emotions and gait patterns provides a new source for automated emotion recognition. Compared to traditional emotion-detection biometrics, such as facial expression, speech, and physiological parameters, gait is remotely observable, more difficult to imitate, and requires less cooperation from the subject. These advantages make gait a promising source for emotion detection. This article reviews current research on gait-based emotion detection, particularly how gait parameters are affected by different emotional states and how those states can be recognized through distinct gait patterns. We focus on the detailed methods and techniques applied in the whole process of emotion recognition: data collection, preprocessing, and classification. Finally, we discuss possible future developments of efficient and effective gait-based emotion recognition using state-of-the-art techniques in intelligent computation and big data.






I Introduction

Human gait is the manner of walking of an individual. It describes a common but important daily motion from which observers can learn much useful information about the walker. In clinical terms, apart from the detection of movement abnormalities, observation of gait patterns also provides early diagnostic clues for multiple neurological disorders such as cerebral palsy, Parkinson's disease, and Rett syndrome [22, 32, 30]. Clinical gait analysis therefore plays an increasingly important role in medical care and may prevent patients from permanent damage. It has become a well-developed tool for the quantitative assessment of gait disturbance, applicable to functional diagnosis, treatment planning, and monitoring of disease progression [3]. In addition, gait provides useful social knowledge to observers. Research has revealed that human observers are able to recognize themselves and people they are familiar with even from point-light or otherwise impoverished depictions of gait [15, 45], indicating that gait is unique. This identity information is embedded in the gait signature, which has been treated as a biometric identification tool. With the advancement of computer vision and big data analysis, gait recognition has been widely employed in various security applications [33, 8].

Furthermore, it has been suggested that emotion expression is embedded in body language, including gait and postural features [10, 50, 14, 78]. Indeed, people in different emotional states show distinct gait kinematics [26]. For instance, studies found that depressed individuals show different gait patterns than controls, including slower gait, smaller stride size, and shorter double limb support and cycle duration [48]. Previous studies also show that human observers are able to identify emotions from gait alone [50]. These findings indicate that gait can be considered a potentially informative source for emotion perception and recognition.

Human emotion is heavily involved in cognitive processes and social interaction. In recent years, automated emotion recognition has become a growing field along with the development of computing techniques. It supports numerous applications in inter-personal and human-computer interaction, such as customer service, interactive gaming, and e-learning.

However, current research on emotion recognition has mainly focused on facial expression [66, 21], speech (including linguistic and acoustic features) [20], and physiological signals (e.g., electroencephalography, electromyography, heart rate, blood volume pulse, etc.) [37], while relatively few studies have investigated the association between emotion and full-body movement. Gait analysis shows clear advantages over these prevailing modalities, making it a promising tool for emotion recognition. We summarize these advantages as follows.

  • Human bodies are relatively large and have multiple degrees of freedom. Unlike facial expression data, which must be captured by cameras at close range, gait patterns can be monitored with the subject relatively far from the camera. The practical distance can be up to several meters, at which most other biometrics are no longer observable or are captured at very low resolution [27, 77].

  • Walking usually requires less consciousness and intention from the walker, so it is less vulnerable to active manipulation and imitation. Ekman [18] pointed out that the head expresses the nature of emotions while the body conveys information about emotional intensity.

  • The data collection process for gait patterns requires less cooperation from the subject, so the subject's behaviour is less disturbed and closer to its normal state in real life.

With the goal of exploring the opportunities in and facilitating the development of this emerging field, we systematically review the literature related to gait-based emotion detection. A recent survey [2] discussed how emotion affects gait for patients with Parkinson's disease; different from that survey, our study focuses on physically and mentally healthy individuals. Another survey, by Stephens-Fripp et al. [70], considered emotion detection based on gait and posture together and emphasized the posture part, whereas our paper focuses entirely on gait and details both how gait is affected by emotion and the process of gait-based emotion detection. Moreover, we also introduce the emotion models of current emotion theory and the dynamics of gait. This background is not only informative but also essential for computer scientists who wish to integrate theory into the design of automated emotion recognition systems, since this field is highly interdisciplinary.

This survey is organized as follows: Sections II and III give background on models of emotion and on gait initiation and the gait cycle. Section IV discusses the impact of emotion on gait initiation and the gait cycle. Section V presents the details of gait-based emotion recognition, including data collection, preprocessing, and classification. In Section VI, we propose future directions for gait-based emotion recognition with machine learning and data analysis. Finally, Section VII concludes.

II Models of Emotion

Human emotion is diverse and complex. Its expression ranges from clear displays of happiness, such as laughter and smiles, to extreme sadness with tears. Emotion is hard to define comprehensively since it is identified in context-dependent situations.

There are three common models for describing and measuring emotional states in scientific analysis: 1) distinct categories; 2) the pleasure-arousal-dominance (PAD) model; and 3) the appraisal approach. The model based on distinct categories is simple and intuitive; it consists of six discrete emotions, namely happiness, sadness, fear, anger, surprise, and disgust, which are often linked to human facial expression [19]. The PAD model is a continuous space divided by three orthogonal axes, shown in Fig. 1. The pleasure dimension identifies the positivity or negativity of the emotional state, the arousal dimension evaluates the degree of physical activity and mental alertness, and the dominance dimension indicates to what extent the emotional state is under or lacking control [84]. Each type of emotion, including the aforementioned distinct emotions, has its place in this three-dimensional space. For instance, sadness is defined as negative pleasure, positive arousal, and negative dominance according to the mapping rules between distinct emotions and the PAD model [49, 52]. This model has been used to study nonverbal communication, such as body language, in psychology [47]. The third model is the appraisal model, which describes how emotions develop, influence, and are influenced by interaction with the circumstances [63, 68]. Ortony et al. [58] applied this appraisal theory to their models and found that parameters of the environment, such as events or objects, may affect emotion strength. Because of the complexity of individuals' evaluations (appraisals or estimates) of the events that cause specific reactions in themselves, the appraisal approach is less often applied to the recognition of emotional states than the two former models.
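To make the PAD mapping concrete, a continuous PAD point can be assigned to its nearest discrete emotion by Euclidean distance. This is a minimal sketch; the prototype coordinates below are illustrative assumptions, not the calibrated means of the mapping rules in [49, 52]:

```python
import math

# Hypothetical PAD coordinates (pleasure, arousal, dominance) on a -1..1 scale;
# illustrative values only, not the calibrated means from the literature.
PAD_PROTOTYPES = {
    "happiness": (0.8, 0.5, 0.4),
    "sadness":   (-0.6, 0.3, -0.3),  # negative pleasure, positive arousal, negative dominance
    "anger":     (-0.5, 0.6, 0.3),
    "fear":      (-0.6, 0.6, -0.4),
}

def nearest_emotion(p, a, d):
    """Map a continuous PAD point to the closest discrete emotion label."""
    return min(PAD_PROTOTYPES,
               key=lambda e: math.dist((p, a, d), PAD_PROTOTYPES[e]))

print(nearest_emotion(-0.55, 0.35, -0.25))  # close to the sadness prototype
```

In a full system the prototypes would instead be the empirical means (and the ellipsoid radii of Fig. 1 the standard deviations) reported for each emotional state.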

Fig. 1: The pleasure-arousal-dominance space for emotions [84, 44]. The center of each ellipsoid is the mean and the radius is the standard deviation of each emotional state.

III Gait

Human gait refers to a periodic forward motion that requires precise cooperation of the neural and musculoskeletal systems to achieve dynamic balance. This requirement is crucial because the human kinematic system advances the body with support from only two feet. In particular, when only one foot provides support, the body is in a state of imbalance and needs a complicated mechanism to produce an accurate and safe foot trajectory. In the next two subsections, we discuss gait initiation and the gait cycle, since both are affected by emotion.

III-A Gait Initiation

As an individual starts to move from a static state, he or she performs gait initiation. In this initiation, anticipatory postural adjustments (APA) play an important role in breaking the erect posture and transferring the body's center of gravity toward the stepping foot [7]. Gait initiation involves swinging a leg forward, which leads to imbalance; this imbalance results from a swift lateral weight shift and serves to generate enough forward propulsion.

III-B Gait Cycle

One gait cycle is measured from one heel strike to the next heel strike of the same heel [74] and can be described by two kinds of cyclic features: phases and events. Both can be divided into percentages of the cycle by several key action points (see Fig. 2). Two main phases make up the gait cycle: in the stance phase, support by one foot is called single limb stance, and when both feet are on the ground it is called double support. Following the stance phase, the target foot swings through initial swing, midswing, and terminal swing. The gait cycle can also be divided into eight events or subphases. Fig. 2 shows an example of a gait cycle that starts with initial contact of the right foot in the stance phase. At initial contact, the hip is flexed at a large angle and the knee flexes. Next is the loading response, which transfers the body weight from the left leg to the right. In midstance, the right leg is on the ground while the left leg is in motion and the weight is being transferred between them. When the right heel starts to lift, and before the left foot lands on the ground, the individual is in terminal stance. Once the left toe touches the ground following terminal stance, the weight is again upheld by both limbs and the preswing phase begins. Passing through the right toe-off (initial swing) event, the body weight is transferred from the right side to the left (the midswing phase). As soon as the right heel strikes again, the whole gait cycle is completed and the walker begins the next cycle.
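Because one gait cycle runs from heel strike to heel strike of the same foot, cycle boundaries can be recovered from a vertical heel trajectory by locating its minima. A minimal sketch on a synthetic signal (the sampling rate, cadence, and signal shape are assumptions for illustration, not taken from any cited study):

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic vertical heel trajectory sampled at 100 Hz: the heel rises during
# swing and returns to the ground at each strike (illustrative signal only).
fs = 100                     # sampling rate in Hz
t = np.arange(0, 5, 1 / fs)  # 5 seconds of walking
cycle_hz = 1.0               # one gait cycle (same-heel strike to strike) per second
heel_y = np.abs(np.sin(np.pi * cycle_hz * t))

# Heel strikes appear as minima of the vertical position: find peaks of -heel_y,
# requiring a minimum spacing of half a cycle to reject jitter.
strikes, _ = find_peaks(-heel_y, distance=fs * 0.5 / cycle_hz)

# Gait cycle durations = time between successive strikes of the same heel.
cycle_durations = np.diff(t[strikes])
print(cycle_durations)  # ~1.0 s per cycle for this synthetic signal
```

On real data the same peak-picking idea applies to marker or depth-camera heel trajectories once they have been low-pass filtered.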

Fig. 3 summarizes the emotion models and the gait cycle discussed above. It shows the general relationship between emotion and gait and gives an overview of the content of this paper. It includes the aforementioned content of Sections II and III, namely the models for describing emotion and the components of gait, respectively. It also briefly represents Sections IV and V, with the upper part listing the gait parameters that can be influenced by emotion and the bottom part showing the process of gait-based emotion recognition.

Fig. 2: Diagram of the gait cycle. The right foot (red) shows the gait cycle, which can be divided into stance and swing phases. The stance phase can be further separated into initial contact, loading response, midstance, terminal stance, and preswing. The swing phase contains initial swing, midswing, and terminal swing.

Fig. 3: Diagram of the general relationship between gait and emotion. Three models are used to describe emotion. Gait initiation and the gait cycle can be influenced by emotion; conversely, emotion can be recognized through gait patterns. PAD: pleasure-arousal-dominance.

IV Emotional Impact on Gait

IV-A On Gait Initiation

Appetitive approach and defensive avoidance are two basic motivational mechanisms that fundamentally organize emotion in light of the biphasic theory of emotion [39]. Based on this theory, studies [24, 71, 23, 72, 54] have explored the effects of emotion on gait initiation. These studies were usually conducted by giving congruent (CO) and incongruent (IC) tasks to the walkers. In the CO tasks, participants were asked to approach when sensing pleasant stimuli or to show avoidance in response to unpleasant stimuli. The IC tasks were the opposite: participants were asked to make approach movements toward unpleasant stimuli or to perform avoidance the moment they perceived pleasant stimuli [12]. The IC tasks thus relate to human emotional conflict. To elicit participants' emotions, pleasant or unpleasant images from the International Affective Picture System (IAPS) were used as visual stimuli, and force plates recorded the ground reaction forces of the movements.

We summarize the experimental setups and results of research on the impact of emotion on gait initiation in Table I. In [24], the specific paradigm was "go" or "no-go". The "no-go" response (i.e., volunteers were not supposed to move) was tied to neutral images showing only an object, whereas the "go" response (i.e., volunteers should approach or avoid) was used for pleasant or unpleasant pictures corresponding to CO or IC tasks. Volunteers were asked to respond as soon as possible after image onset. In [71], participants were asked to make either an anterior step (approach) or a posterior step (withdrawal) as soon as the image was presented and to remain static until it disappeared. The experiment in [23] studied the influence of a change in the delay between picture onset and the appearance of the "go" cue on reaction time; the delay had two conditions, a short one (the word "go" shown 500 ms after image onset) and a long one (the word "go" shown 3000 ms after image onset). The study in [72] examined gait initiation either as soon as the participants saw the image (onset) or as soon as the image disappeared (offset). In a clinical study, the experiment in [54] focused on gait initiation in patients with Parkinson's disease: the patients were asked to make an initial step with their preferred leg after image offset and to keep walking at a self-selected pace.

The studies above reveal that an individual's emotion can affect gait initiation. When participants encountered emotional conflict, they appeared to adopt a defensive response in the IC tasks, leading to longer reaction times (RT) and shorter step lengths compared to the CO tasks, as shown in Table I. This is inspiring because these gait-initiation features show significant differences between positive and negative emotional states. The analysis of gait initiation may thus become a viable method for recognizing human emotion in the future.
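The reported CO/IC differences in reaction time are typically established with standard significance tests. A sketch on synthetic reaction times (the means, spreads, and sample sizes below are illustrative assumptions, not data from the cited experiments):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic reaction times (seconds): illustrative numbers only, not data
# from the cited experiments. IC trials are simulated as ~60 ms slower.
rt_co = rng.normal(loc=0.52, scale=0.05, size=30)
rt_ic = rng.normal(loc=0.58, scale=0.05, size=30)

# Independent two-sample t-test on the simulated groups.
t_stat, p_value = stats.ttest_ind(rt_ic, rt_co)
print(f"mean CO={rt_co.mean():.3f}s  mean IC={rt_ic.mean():.3f}s  p={p_value:.4f}")
```

The same kind of test, applied to measured RT or step-length data, underlies statements such as "longer RT in IC than CO trials" in Table I.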

| Ref | Participants | CO tasks | IC tasks | Results |
|---|---|---|---|---|
| [24] | 15 (age 20–32, 9 females) | Initiate gait after pleasant image onset | Initiate gait after unpleasant image onset | Longer RT in IC than CO trials; the amplitude of early postural modifications was reduced in IC trials. |
| [71] | 30 (mean age 22.3 years, 16 females) | Approach after pleasant image onset or withdraw after unpleasant image onset | Approach after unpleasant image onset or withdraw after pleasant image onset | Unpleasant images caused an initial "freezing" response in analyses of the preparation, initiation, and execution of steps. |
| [23] | 19 (age 18–26 years, 11 females) | Initiate gait once the word "go" appeared 500 ms or 3000 ms after pleasant image onset | Initiate gait once the word "go" appeared 500 ms or 3000 ms after unpleasant image onset | Motor responses were faster for pleasant pictures than unpleasant ones at the short 500 ms delay. |
| [72] | 27 (mean age 28.7 years, 16 females) | Initiate gait after pleasant image onset or unpleasant image offset | Initiate gait after unpleasant image onset or pleasant image offset | Gait was initiated faster with pleasant images at onset and faster with unpleasant images at offset, per analyses of COP and COG. |
| [54] | 26 PD patients (age 55–80 years, 3 females) and 25 controls (age 55–80 years, 3 females) | Initiate gait and walk after approach-oriented emotional picture onset | Initiate gait and walk after withdrawal-oriented emotional picture onset | For PD patients and healthy older adults, threatening pictures sped up GI; approach-oriented pictures, compared to withdrawal-oriented ones, facilitated the anticipatory postural adjustments of gait initiation, per analyses of RT and COP. |

  • CO: congruent. IC: incongruent. RT: reaction time. COP: center of pressure. COG: center of gravity. GI: gait initiation. PD: Parkinson's disease.

TABLE I: Research on the impact of emotion on gait initiation.

IV-B On Gait Cycle

In this subsection, we review studies of the performance and characteristics of emotive gaits (see Table III). In Montepare's investigation [51], ten female observers viewed the gaits of five walkers expressing four emotions (i.e., sadness, anger, happiness, and pride) in order to determine the walkers' emotions and report the specific gait features they observed. Note that the walkers' heads were not recorded, to prevent facial cues from confounding the perception of emotion. The investigation showed that gait patterns with different emotions could be identified better than chance, with mean accuracies of 56%, 90%, 74%, and 94% for pride, anger, happiness, and sadness respectively. As for the gait features that differentiated emotions, angry gaits were relatively more heavy-footed than the other gaits, while sad gaits had less arm swing. Proud and angry gaits had longer stride lengths than happy or sad gaits. Finally, happy gaits had a faster pace than the other gaits.

Similarly, in Gross's work [25], thirty observers used the Effort-Shape method [38] to rate the qualitative gait movements of sixteen walkers expressing five emotions (i.e., joy, contentment, anger, sadness, and neutral). The Effort-Shape analysis involved four factors evaluating the effort in the walker's movements (i.e., space, time, energy, and flow) and two factors describing the shape of the body (i.e., torso shape and limb shape), shown in Table II. Each factor was rated from 1 to 5. For flow, for instance, the left anchor was "free, relaxed, uncontrolled" and the right anchor was "bound, tense, controlled"; the three intermediate points of the scale served as a transition between the two anchor qualities. Results revealed that the sad gait featured a contracted torso shape, contracted limb shape, indirect space, light energy, and slow time. The angry gait had an expanded limb shape, direct space, forceful energy, fast time, and tense flow. The joyful gait shared limb shape, energy, and time with the angry gait, but had a more expanded torso shape and more relaxed flow. The content gait resembled the joyful gait but had a more contracted limb shape, less direct space, lighter energy, and slower time. The neutral gait was similar to the sad gait, but with more expanded torso and limb shapes, more direct space, more forceful energy, and faster time.

| Effort-Shape factor | Left-anchor qualities (score = 1) | Right-anchor qualities (score = 5) |
|---|---|---|
| Torso shape | Contracted, bowed, shrinking | Expanded, stretched, growing |
| Limb shape | Moves close to body, contracted | Moves away from body, expanded |
| Space | Indirect, wandering, diffuse | Direct, focused, channeled |
| Energy | Light, delicate, buoyant | Strong, forceful, powerful |
| Time | Sustained, leisurely, slow | Sudden, hurried, fast |
| Flow | Free, relaxed, uncontrolled | Bound, tense, controlled |

TABLE II: Qualities associated with Effort-Shape factors [25].

Another report from human observers perceiving walkers' emotions showed significant differences in gait patterns across emotional states: happiness with a bouncing gait, sadness with a slow slouching pace, anger with a fast stomping gait, and fear with a fast, short gait [28].

Beyond human observation, the features of gaits carrying various emotions can be studied through kinematic analysis of gait data. In [62], two types of features, 1) posture features and 2) movement features, were explored to determine which were most important for the expression of various emotions, by analyzing gait trajectory data. For both feature types, flexion angles of eleven major joints (head, spine, pelvis, and left and right shoulder, elbow, hip, and knee joints) were averaged over the gait cycle. In terms of posture features, the clearest results were a reduced head angle for sad walking and increased elbow angles for fearful and angry walking. As for movement features, happy and angry gaits were linked to increased joint amplitudes, whereas sadness and fear showed reduced joint-angle amplitudes. In addition, the study compared the emotional gaits with neutral gaits whose speeds were matched (overall velocity difference of 15%) and found that the dynamics of the emotion-specific features cannot be explained by changes in gait velocity. In Barliya's study [5], gait speed was shown to be a crucial parameter affected by the various emotions. The amplitude of thigh elevation angles differed from that of the neutral gait for all emotions except sadness, and anger showed a more frequently oriented intersegmental plane than the other emotions.

Through kinematic analysis of emotional (i.e., happy, angry, fearful, and sad) point-light walkers, Halovic and Kroos [28] found that both happy and angry gaits showed long strides with increased arm movement, but angry strides had a faster cadence. Fearful walkers took fast, short strides, while sad walkers took slow, short strides, representing the slowest walking pace. Fearful and sad gaits both had less arm movement, but fearful walkers mainly moved their lower arms whilst sad walkers moved the entire arm.

Studies [34, 35] applying an eight-camera optoelectronic motion capture system focused on the smoothness of linear and angular movements of the body and limbs to explore the emotive impact on gait. In the vertical direction, movement smoothness increased with angry and joyful emotions in the whole-body center of mass, head, thorax, and pelvis compared to sadness. In the anterior-posterior direction, neutral, angry, and joyful emotions showed increased movement smoothness only for the head compared to sadness. In angular movements, anger's movement smoothness in the hip and ankle increased compared to that of sadness, and smoothness in the shoulder increased for anger and joy compared to sadness.

Research in [48, 41] further analyzed the gaits of patients with depression compared with control groups, finding that depressed patients showed lower velocity, reduced limb swing, and less vertical head movement.

| Ref | Emotions | Walkers | Methods | Performance |
|---|---|---|---|---|
| [51] | Sadness, happiness, anger, and pride | 5 females | Observation by 10 female observers | Mean accuracy: 56%, 90%, 74%, 94% for pride, anger, happiness, and sadness respectively. Anger: heavy-footed. Sadness: less arm swing. Pride and anger: longer stride lengths. Happiness: faster pace. |
| [25] | Joy, contentment, anger, sadness, and neutrality | 11 females and 5 males | 30 (15 females) observers with the Effort-Shape method | Sadness: contracted torso shape, contracted limb shape, indirect space, light energy, and slow time. Anger: expanded limb shape, direct space, forceful energy, fast time, and tense flow. Joy: more expanded torso shape and more relaxed flow than anger. Contentment: more contracted limb shape, less direct space, lighter energy, and slower time than joy. Neutrality: more expanded torso and limb shapes, more direct space, more forceful energy, and faster time than sadness. |
| [28] | Happiness, sadness, anger, fear, and neutrality | 36 actors (17 females) | 34 (19 females) observers and kinematic analysis | Happiness and anger: long strides with increased arm movement, but angry strides had a faster cadence. Fear: fast, short strides. Sadness: slowest, short strides. Fear and sadness: less arm movement, but fearful walkers mainly moved their lower arms whilst sad walkers moved the entire arm. |
| [62] | Happiness, sadness, anger, fear, and neutrality | 11 males and 12 females | Average flexion angles, nonlinear mixture model, and sparse regression | Posture features: reduced head angle for sad walking, increased elbow angles for fear and anger. Movement features: increased joint amplitudes for happiness and anger, reduced joint-angle amplitudes for sadness and fear. |
| [5] | Happiness, sadness, anger, fear, and neutrality | 13 university students and 8 professional actors | The intersegmental law of coordination | Speed was affected by emotions. The amplitude of thigh elevation angles differed from the neutral gait for all emotions except sadness. Anger showed a more frequently oriented intersegmental plane than the others. |
| [34, 35] | Joy, sadness, anger, and neutrality | 7 males and 11 females | Measuring spatiotemporal gait parameters and smoothness of linear movements | In the VT direction, angry and joyful movement smoothness increased compared to sadness. In the AP direction, neutral, angry, and joyful gaits had increased movement smoothness for the head compared to sadness. In angular movements, anger's smoothness in the hip and ankle increased compared to sadness. Smoothness in the shoulder increased for anger and joy compared to sadness. |
| [41] | Depression and non-depression | 14 inpatients with major depression and 14 healthy people | Measuring AP, VT, and lateral movements of all body segments | Depression: reduced walking speed, arm swing, and vertical head movements. |
| [48] | Depression and non-depression | 16 inpatients with major depression and 16 healthy people | Measuring spatiotemporal gait parameters | Depression: lower gait velocity, reduced stride length, shorter double limb support, and cycle duration. |

  • VT: vertical. AP: anterior-posterior.

TABLE III: Research on the impact of emotion on the gait cycle.

Fig. 4: The 25 human joints captured by Kinect.

V Gait Analysis for Emotion Recognition

There is a long way to go towards the ultimate goal, namely automated emotion recognition through gait patterns. However, previous studies provide important observations, and automation may become possible by aggregating data from those observations with the support of big data analysis and machine learning. In this section, we discuss details of current research on gait-based emotion recognition.

V-A Gait Data Collection

Gait can be captured in digital form (e.g., by cameras or force plates) for data analysis. As technology advances, digital devices have increased sensing quality and become more intelligent. In gait-based emotion detection, force platforms have proved very useful for recording the anteroposterior velocity and displacement of the center of foot pressure [31]. Infrared light barrier systems have also been used to measure gait velocity [41, 31]. More prevalent are motion analysis systems (e.g., Vicon and Helen Hayes) that capture precise coordinates of markers taped on the body [62, 5, 57, 48, 25, 17, 76, 36]. Microsoft Kinect later appeared as another convenient, non-wearable device; first designed for interactive games, it can also be used for motion analysis thanks to its representation of the human skeleton [42, 43]. Fig. 4 shows an individual's skeleton consisting of 25 marked joints, as displayed by a Kinect V2 device; the joints are computed from three-dimensional coordinates derived from the depth image. In addition, Kinect provides other types of video, such as RGB and infrared. In recent years, intelligent wearable devices have received much attention, not only in the market: they can also be used in gait pattern analysis for identifying people's emotion, since their accelerometers collect movement data [85, 60]. In the work of Chiu et al. [13], a mobile phone captured people's gait and the video data was sent to a server for pose estimation and emotion recognition.
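As a minimal sketch of working with Kinect-style skeleton streams, the following estimates mean gait velocity from the frame-to-frame displacement of the pelvis joint. The array layout, the joint index, and the synthetic constant-speed walk are assumptions for illustration:

```python
import numpy as np

# A hypothetical recording: T frames of 25 Kinect joints with (x, y, z)
# coordinates in metres; here filled with a synthetic constant-speed walk.
T, fs = 150, 30                      # 5 s at Kinect's 30 fps (assumed)
joints = np.zeros((T, 25, 3))
SPINE_BASE = 0                       # pelvis-area joint index assumed here
joints[:, SPINE_BASE, 2] = np.arange(T) * (1.2 / fs)  # 1.2 m/s forward walk

# Mean gait velocity from frame-to-frame displacement of the pelvis joint.
step_disp = np.linalg.norm(np.diff(joints[:, SPINE_BASE, :], axis=0), axis=1)
velocity = step_disp.sum() / ((T - 1) / fs)
print(f"estimated gait velocity: {velocity:.2f} m/s")  # -> 1.20 m/s
```

The same (frames, joints, coordinates) layout accommodates marker-based capture or video pose estimation, so downstream feature extraction can stay device-agnostic.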

V-B Preprocessing

Before feeding data to classifiers or performing correlation analysis, the raw data should be preprocessed to obtain significant and concise features, for both computational efficiency and performance. Data preprocessing is a crucial step that may include many substeps, such as filtering (to remove noise artifacts and burrs), data transformation, feature selection, and so on. In the following paragraphs, we present some of these preprocessing techniques.

  1. Filtering

    To smooth the data and remove noise from the marker trajectories of an individual's walking, studies [34, 35, 25, 17] used a low-pass Butterworth filter with a cut-off frequency of 6 Hz. The Butterworth filter is considered a maximally flat magnitude filter [9], since its frequency response has no ripples in the passband and rolls off towards zero in the stopband [6].

    Sliding-window Gaussian filtering is another common way to eliminate noise and high-frequency components such as jitter. Mathematically, a discrete Gaussian filter transforms the input signal by convolution with a Gaussian function, whose standard deviation is the key parameter when designing a Gaussian kernel of fixed length. In studies [42, 43], a Gaussian kernel was used to filter the three-dimensional joint coordinates of walkers.
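The two filters above can be sketched with SciPy as follows; the sampling rate and the synthetic marker trajectory are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.ndimage import gaussian_filter1d

fs = 100                                  # marker sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)
# Synthetic marker trajectory: a ~1 Hz gait component plus high-frequency jitter.
signal = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)

# 4th-order low-pass Butterworth with a 6 Hz cut-off, applied forward and
# backward (filtfilt) so the smoothed trajectory has zero phase lag.
b, a = butter(N=4, Wn=6, fs=fs, btype="low")
smoothed_bw = filtfilt(b, a, signal)

# Sliding-window Gaussian alternative: sigma (in samples) sets the kernel width.
smoothed_gauss = gaussian_filter1d(signal, sigma=3)
```

Zero-phase filtering matters here: a one-pass filter would delay the trajectory and shift event timings such as heel strikes.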

  2. Data Transformation

    Processing the data in the time domain may not be the most effective approach. Often it is favorable to transform the data to another domain, such as the frequency or time-frequency domain, which can make the understanding of the data more thorough.

    One classical and popular method is the discrete Fourier transform (DFT) [59]. The DFT derives frequency information from the data, providing Fourier coefficients that can serve as features. This transform has been applied to three-dimensional gait data recorded by Microsoft Kinect devices [42, 43, 73].
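A minimal sketch of extracting a frequency feature from a joint trajectory with the DFT; the frame rate and the synthetic signal are assumptions for illustration:

```python
import numpy as np

fs = 30                                  # Kinect frame rate in fps (assumed)
t = np.arange(0, 4, 1 / fs)
# Synthetic vertical joint coordinate oscillating at the step frequency (~2 Hz).
joint_y = 0.03 * np.sin(2 * np.pi * 2.0 * t) + 0.9

# Real-input DFT; the magnitude spectrum can serve directly as a feature vector.
spectrum = np.abs(np.fft.rfft(joint_y - joint_y.mean()))
freqs = np.fft.rfftfreq(joint_y.size, d=1 / fs)
dominant = freqs[spectrum.argmax()]
print(f"dominant gait frequency: {dominant:.2f} Hz")
```

In practice, the low-order magnitude coefficients of each joint coordinate form the Fourier feature vector fed to the classifier.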

    Another method, which outperforms the discrete Fourier transform when localization matters, is the discrete wavelet transform (DWT): it represents both frequency and location (in time) information and has been applied in gait studies [4, 29, 55]. In the DWT, selecting the wavelet best suited to the analysis is crucial, and the choice is generally based on the signal characteristics and the application. For example, for data compression the wavelet should represent the largest amount of information with as few coefficients as possible. Many wavelets have been proposed, ranging from the Haar, Daubechies, Coiflet, Symmlet, and Mexican Hat to Morlet wavelets, and they possess various properties that meet the needs of the work. For instance, Daubechies 4 (Db4) deals with signals that have linear approximations over four samples, whereas Db6 targets quadratic approximations over six samples [46].
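To make the idea concrete, one level of the simplest wavelet, the Haar DWT, can be written directly in NumPy (the cited studies would typically use a wavelet library and richer wavelets such as Db4 or Db6; this hand-rolled version is for illustration only):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients: the approximation keeps the
    coarse shape of the gait signal, the detail captures localized changes.
    """
    x = np.asarray(x, dtype=float)
    if x.size % 2:                       # pad to even length
        x = np.append(x, x[-1])
    pairs = x.reshape(-1, 2)
    approx = pairs.sum(axis=1) / np.sqrt(2)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
    return approx, detail

# A constant signal has all its energy in the approximation band.
a, d = haar_dwt([1.0, 1.0, 1.0, 1.0])
print(a, d)  # approximation ~ [1.414, 1.414], detail = [0, 0]
```

Applying the transform recursively to the approximation band yields the multi-level decomposition used for gait feature extraction.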

  3. Feature Selection

    Investigations have provided qualitative and quantitative evaluations of multiple features of human gait. The choice of parameters of interest depends on the specific application domain. In sports, for example, electromyography (EMG) signals recorded from muscles have been used to build a model that determines muscle force-time histories while an individual is walking [80].

    • Spatiotemporal Features

      The gait cycle contains many useful spatiotemporal quantities, such as gait velocity, stride and step length, step width, single/double support and swing periods, phases, rhythm (number of steps per unit time), and foot placement, obtained through measurements of time and scale.

    • Kinematic Features

      Gait data of the joints or skeleton, collected via marker trajectories, Kinect cameras, or video-based pose estimation, support the measurement of kinematic features, including but not limited to joint angles, angular range of motion, displacement, and velocity along various axes [5]. Research in [62] explored the kinematic features crucial to emotion expression in gait: limb flexion is a key feature for expressing anger and fear, whereas head inclination corresponds predominantly to sadness. More specific uses of kinematic features can be found in the next section.
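The quantities above can be computed directly from joint trajectories. The following numpy sketch, using hypothetical coordinates and a made-up 30 Hz sampling rate, derives one kinematic feature (a joint angle) and one spatiotemporal feature (mean gait velocity).

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) given 3-D positions of three joints,
    e.g. hip-knee-ankle for knee flexion."""
    v1, v2 = a - b, c - b
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def gait_velocity(heel_positions, fs):
    """Mean velocity (m/s) from successive heel positions sampled at fs Hz."""
    dist = np.linalg.norm(np.diff(heel_positions, axis=0), axis=1).sum()
    return dist * fs / (len(heel_positions) - 1)

# Hypothetical single-frame joint positions (meters), for illustration only.
hip = np.array([0.0, 1.0, 0.0])
knee = np.array([0.0, 0.5, 0.05])
ankle = np.array([0.0, 0.0, 0.0])
angle = joint_angle(hip, knee, ankle)   # slight knee flexion
```

Stride length, cadence, and angular range of motion follow the same pattern: simple geometry over the joint time series.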

  4. Dimension Reduction

    • Raw gait data may contain redundant information that makes computation costly. Principal component analysis (PCA) reduces the dimension of the data while retaining the components most significant to the whole. It transforms the data into a set of values of orthogonal variables called principal components, and can be regarded as finding an “optimal” projection of the original data from an m-dimensional space to a new n-dimensional space, where n < m (see Fig. 5). Mathematically, the principal components are linear combinations of the original feature vectors [11], with coefficients normalized so that their squares sum to one. The PCA technique has been used in several gait analysis studies [81, 69, 56, 65, 16].

Fig. 5: Diagram of PCA.
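A minimal PCA sketch via the SVD, with synthetic data standing in for gait feature vectors, illustrates both the m-to-n projection and the unit-norm property of the component coefficients; the dimensions here are arbitrary examples.

```python
import numpy as np

def pca_project(X, n):
    """Project m-dimensional feature vectors onto the first n
    principal components (n < m) via SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n]                   # each row: unit-norm linear combination
    return Xc @ components.T, components

rng = np.random.default_rng(1)
# Hypothetical 100 gait samples with 10 correlated features
# (2 latent factors plus a little noise).
base = rng.normal(size=(100, 2))
X = base @ rng.normal(size=(2, 10)) + 0.01 * rng.normal(size=(100, 10))
Z, comps = pca_project(X, n=2)            # 10-D -> 2-D projection
```

Because the data here has only two strong latent factors, two components capture nearly all the variance, which is exactly the redundancy-removal effect PCA is used for in gait analysis.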

V-C Emotion Recognition

There have been several efforts to classify a walker’s emotion from gait data (see Table IV). In [31], Janssen et al. used a Kistler force platform to record the ground reaction forces of walkers who were sad, angry, or happy after recalling a specific occasion on which they had felt the corresponding emotion. After this short emotion-elicitation period, volunteers walked about 7 m across the force platform at a self-determined velocity. The ground reaction forces, normalized by amplitude and time, were split into a training set and a test set. The training set was fed into a supervised multilayer perceptron (MLP) with a 200-111-22 (input-hidden-output) architecture (one output neuron per participant); on the test set, the MLP achieved 95.3% accuracy in recognizing each individual. With unsupervised self-organizing maps (SOM), the average emotion recognition rate was 80.8%.

Omlor et al. [57] classified emotions expressed by walkers with markers attached to their joints. They presented an original non-linear source separation method that efficiently handles temporal delays of signals and outperforms PCA and independent component analysis (ICA). Combining this method with sparse multivariate regression revealed spatio-temporal primitives specific to different emotions in gait. Remarkably, the method approximated movement trajectories with up to 97% accuracy using only three learned spatio-temporal source signals.

In [76, 75], feature vectorization (i.e., computing the auto-correlation matrix) was applied to gait trajectories. Four professional actors, feeling neutral, joy, anger, sadness, and fear respectively, each repeated five walks in a straight line while wearing 41 joint-attached markers of a Vicon motion capture system. In the vector analysis, the angles, positions, velocities, and accelerations of joints were explored for detecting emotions. The study found that lower-torso motion and waist and head angles significantly characterize a walker’s emotional expression, whereas the legs and arms biased the detection. Moreover, by weighting the vectors, emotion recognition improved to a total average accuracy of 78% for a given volunteer.

Karg et al. [36] studied affect recognition based on gait patterns thoroughly. Features were extracted in two ways: statistical parameters of joint angle trajectories, and modeling of joint angle trajectories by eigenpostures. The former consists of three parts: first, computing velocity, stride length, and cadence (VSC); second, computing the minimum, mean, and maximum of significant joint angles (neck, shoulder, and thorax angles) together with VSC; and third, applying the same procedure to all joint angles. The latter two parts were each processed independently by PCA, kernel PCA (KPCA), linear discriminant analysis (LDA), and general discriminant analysis (GDA). The eigenposture-based approach builds a matrix containing the mean posture, four eigenpostures, four frequencies, and three phase shifts (later referred to as PCA-FT-PCA), as shown in Fig. 6. Both kinds of features were then fed into Naive Bayes, nearest neighbor (NN), and SVM classifiers respectively. The essence of this research was to further explore interindividual and person-dependent recognition, as well as recognition based on the PAD (pleasure, arousal, dominance) model [64]. The best accuracy of affect recognition, 93%, was achieved based on the estimated identity.

Fig. 6: Components of the eigenposture model: the mean posture, four eigenpostures, four frequencies, and three phase shifts. [36]

Research in [42, 43] both used the Microsoft Kinect v2, a low-cost and portable sensor, to recognize emotional states from people’s one-minute gaits, walking back and forth in a straight line after watching a three-minute video clip intended to elicit an emotion. Fifty-nine actors experienced happiness, anger, and a neutral state respectively. The Kinect camera captured gait data as 3D coordinates of 25 joints. After preprocessing (data segmentation, low-pass filtering, coordinate translation, and coordinate differencing), features of the selected joints were computed in the time and frequency domains and fed into LDA, Naive Bayes, Decision Tree, and SVM classifiers respectively [42] (see Fig. 7). Clear improvements were achieved by using PCA to select the most useful features: the highest accuracies were 88% (SVM) for happiness vs. neutral, 80% (Naive Bayes) for anger vs. neutral, and 83% (SVM) for happiness vs. anger. Li et al. [43] applied Naive Bayes, Random Forest, SVM, and SMO to the skeleton data; the highest accuracies were 80.51% (Naive Bayes, anger vs. neutral), 79.66% (Naive Bayes, happiness vs. neutral), and 55.08% (Random Forest, happiness vs. anger).

Fig. 7: Procedure of emotion recognition based on Kinect. Joint selection includes the coordinates of the two wrists, two knees, and two ankles. Time-domain features were the mean and variance of stride, period, velocity, and height, whereas frequency-domain features were the amplitudes and phases of the top 20 frequencies.
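A sketch of such a feature extractor for one joint-coordinate series is given below; the 30 Hz sampling rate, function name, and default parameters are illustrative assumptions, not taken from [42].

```python
import numpy as np

def kinect_features(series, fs=30.0, top_k=20):
    """Hypothetical feature vector for one joint coordinate series:
    time-domain statistics plus the amplitudes and phases of the
    top-k frequency components, mirroring the pipeline above."""
    time_feats = [series.mean(), series.var()]
    coeffs = np.fft.rfft(series - series.mean())
    amps = np.abs(coeffs)
    order = np.argsort(amps)[::-1][:top_k]   # strongest components first
    freq_feats = np.concatenate([amps[order], np.angle(coeffs[order])])
    return np.concatenate([time_feats, freq_feats])

t = np.arange(0, 4, 1 / 30.0)
ankle_y = 0.1 * np.sin(2 * np.pi * 1.8 * t)   # synthetic ankle trace
features = kinect_features(ankle_y)           # 2 time + 20 amps + 20 phases
```

Concatenating such vectors across the selected joints yields the input that would then go to LDA, Naive Bayes, Decision Tree, or SVM.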

In [85], gait patterns were tracked by a customized smart bracelet with a built-in accelerometer, which recorded three-dimensional acceleration data from body positions such as the right wrist and ankle. To remove unexpected walking vibrations and reduce data redundancy, the time series were preprocessed with a moving-average filter and sliding windows. Time-domain features were then obtained by computing the skewness, kurtosis, and standard deviation of each of the three axes, and power spectral density (PSD) and the fast Fourier transform (FFT) were used to extract frequency-domain features. The final feature set contained 114 features (38 per axis), which were fed into four algorithms (Decision Tree, SVM, Random Forest, and Random Tree) respectively. SVM achieved the highest classification accuracies: 88.5%, 91.3%, and 88.5% for happiness vs. neutral, anger vs. neutral, and happiness vs. anger respectively. The results indicate that human emotions (happiness, neutral, and anger) can be recognized efficiently from gait data recorded by a wearable device.
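The per-axis time-domain statistics described above can be sketched with scipy; the window length and data here are hypothetical.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def axis_stats(acc):
    """Skewness, kurtosis, and standard deviation for each axis of a
    (n_samples, 3) acceleration window, as in the bracelet study."""
    feats = []
    for axis in range(acc.shape[1]):
        x = acc[:, axis]
        feats += [skew(x), kurtosis(x), x.std()]
    return np.array(feats)

rng = np.random.default_rng(0)
window = rng.normal(size=(128, 3))    # hypothetical 3-axis accelerometer window
features = axis_stats(window)         # 3 stats x 3 axes = 9 features
```

Appending PSD- and FFT-derived values per axis in the same way is what brings the count up to the 38 features per axis reported in [85].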

In the investigation by Juan et al. [60], accelerometer data from walkers’ smartwatches were used to identify emotion. Fifty participants were divided into groups to perform three tasks: 1) watch movies, then walk; 2) listen to music, then walk; 3) listen to music while walking. The movies and music served to elicit happiness or sadness for the subsequent walks. Either after the stimulation or while engaged in it, volunteers walked a round trip along a 250 m S-shaped corridor wearing a smartwatch. For feature selection, the accelerometer data were divided by sliding windows, with the features of each window forming one vector. Random Forest and Logistic Regression then classified happiness versus sadness from these feature vectors, with most accuracies ranging from 60% to 80% against a 50% baseline.
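The sliding-window segmentation can be sketched as follows; the window width and step size are illustrative choices, not taken from [60].

```python
import numpy as np

def sliding_windows(signal, width, step):
    """Split a 1-D accelerometer stream into (possibly overlapping)
    windows; each window later yields one feature vector."""
    starts = range(0, len(signal) - width + 1, step)
    return np.stack([signal[s:s + width] for s in starts])

stream = np.arange(100.0)                           # stand-in for sensor data
wins = sliding_windows(stream, width=32, step=16)   # 50% overlap
```

Overlapping windows trade extra computation for more training vectors and finer temporal resolution of the predicted emotion.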

Chiu et al. targeted emotion recognition on mobile devices [13]. Eleven male undergraduate students were recruited to perform emotive walks expressing five emotions (joy, anger, sadness, relaxation, and neutrality), each walk lasting about ten seconds while a mobile phone camera recorded from the subject’s left side. The videos were fed into the OpenPose model to extract 18 joint positions in pixel xy-coordinates for each frame. Three kinds of features were then calculated: 1) Euclidean features, i.e., distances between joint positions (two hands; left hand and left hip; two knees; left knee and left hip; two elbows; two feet) normalized by the pixel height of the bounding box surrounding the subject; 2) angular features, i.e., angles of joint positions (two arms; left and right elbow flexion; two legs; left and right knee flexion; vertical angle of the front thigh; angle between the front thigh and torso; vertical angle of head inclination; angle between the head and torso); and 3) speed features (average total speed; average, standard deviation, maximum, and minimum of each step’s speed). The features were used to train six models (SVM, Multilayer Perceptron, Decision Tree, Naive Bayes, Random Forest, and Logistic Regression). All computation was done on a server, with results sent back to the mobile-phone client. In the evaluation, SVM achieved the highest classification accuracy at 62.1%, while human observers achieved 72%.
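Two of the feature types described above, normalized Euclidean distances and flexion angles in pixel coordinates, can be sketched as below; the coordinates are made-up examples, not values from [13].

```python
import numpy as np

def normalized_distance(p, q, box_height):
    """Euclidean distance between two pixel-space joints, normalized by
    the bounding-box height so the feature is scale-invariant."""
    return np.linalg.norm(np.asarray(p, float) - np.asarray(q, float)) / box_height

def flexion_angle(a, b, c):
    """2-D angle (degrees) at joint b, e.g. elbow flexion from
    shoulder-elbow-wrist pixel coordinates."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical pixel coordinates for illustration only.
d = normalized_distance((100, 200), (160, 280), box_height=400)
ang = flexion_angle((0, 0), (1, 0), (1, 1))   # a right-angle configuration
```

The bounding-box normalization is what keeps the distance feature comparable when the subject walks toward or away from the camera.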

In another study of gait-based emotion recognition [1], five emotive gaits (happiness, sadness, anger, fear, neutrality) of seven subjects, each walking sequence lasting 20 seconds, were recorded in skeleton format by a Kinect v2.0 camera. Geometric and kinematic features were calculated using Laban Movement Analysis (LMA) [38], acting as LMA components from four aspects: Body, Effort, Shape, and Space. A binary-chromosome genetic algorithm then selected the feature subset that maximized the accuracy of four classifiers (SVM, KNN, Decision Tree, LDA) for emotion recognition. The four classifiers were further combined with score-level and rank-level fusion algorithms to boost overall performance; the score-level fusion achieved 80% emotion recognition accuracy, outperforming each of the four individual models.

(Columns: Ref; Emotions; Features; Method/Classifier; Accuracy/Approximation)

  • [31] Emotions: sadness, anger, happiness. Features: ground reaction force (176 gait patterns of 22 actors; forces in the x, y, and z dimensions normalized by amplitude and time). Method: Multilayer Perceptron, Self-organizing Maps. Accuracy: 80.8% average for all emotions.

  • [57] Features: marker trajectories (195 gait trajectories from 13 lay actors; flexion angles of the hip, knee, elbow, shoulder, and clavicle). Method: original non-linear source separation, multivariate regression. Result: 97% approximation of the original data with only 3 source signals.

  • [76, 75] Emotions: neutral, joy, anger, sadness, fear. Features: marker trajectories (100 gait trajectories from 4 professional actors; angles, position, velocity, and acceleration of 34 joints and generalized coordinates of the lower torso). Method: similarity index. Accuracy: 78% for all emotions of an individual.

  • [36] Features: marker trajectories (520 gait trajectories of 13 male actors; velocity, stride length, cadence, statistical parameters of joint angle trajectories, and joint angle trajectories modeled by eigenpostures). Method: Naive Bayes, Nearest Neighbor, SVM. Accuracy: highest interindividual 69% (SVM, all joint angles + PCA); highest average person-dependent 95% (SVM, all joint angles + PCA); highest based on estimated identity 92% (Naive Bayes, all joint angles + PCA); highest for affective dimensions 97% for arousal (KNN, all joint angles).

  • [42] Emotions: happiness, anger, neutral. Features: Kinect skeletons (59 actors experiencing each emotion respectively; 6 joints on the arms and legs in the time and frequency domains). Method: LDA, Naive Bayes, Decision Tree, SVM with PCA features. Accuracy: highest 88% (SVM, happiness vs. neutral), 80% (Naive Bayes, anger vs. neutral), 83% (SVM, happiness vs. anger).

  • [43] Emotions: happiness, anger, neutral. Features: Kinect skeletons (59 actors experiencing each emotion respectively; 42 main frequencies and 42 corresponding phases of 14 joints). Method: Naive Bayes, Random Forest, SVM, SMO. Accuracy: highest 80.51% (Naive Bayes, anger vs. neutral), 79.66% (Naive Bayes, happiness vs. neutral), 55.08% (Random Forest, happiness vs. anger).

  • [85] Emotions: happiness, anger, neutral. Features: accelerometer data from a smart bracelet (123 actors experiencing each emotion respectively; 3D acceleration recorded at the right wrist and ankle in the time and frequency domains). Method: Decision Tree, SVM, Random Forest, Random Tree with PCA features. Accuracy: highest 88.5% (SVM, happiness vs. neutral), 91.3% (SVM, anger vs. neutral), 88.5% (SVM, happiness vs. anger), 81.2% (SVM, all three emotions).

  • [60] Emotions: happiness, sadness. Features: accelerometer data from a smartwatch (50 people divided into three task groups; feature vectors extracted from sliding windows). Method: Random Forest, Logistic Regression. Accuracy: mainly 60% to 80% across the three groups.

  • [13] Emotions: joy, anger, sadness, relaxation, neutrality. Features: joint positions estimated from video (11 male walkers experiencing each emotion respectively; Euclidean features, angular features of joints, and speed). Method: SVM, Multilayer Perceptron, Decision Tree, Naive Bayes, Random Forest, Logistic Regression. Accuracy: highest 62.1% (SVM).

  • [1] Emotions: happiness, sadness, anger, fear, neutrality. Features: Kinect skeletons (7 walkers experiencing each emotion respectively; geometric and kinematic features using the LMA framework). Method: SVM, KNN, Decision Tree, LDA with genetic-algorithm feature selection; score-level and rank-level fusion. Accuracy: highest 80% (score-level fusion of all four models).
  • KNN: K Nearest Neighbor. SVM: Support Vector Machine. LDA: Linear Discriminant Analysis. LMA: Laban Movement Analysis

TABLE IV: Studies on emotion recognition based on gait.

Vi Future Directions

In this paper, we have gained insight into how emotions may influence gait and surveyed different techniques for emotion recognition. This field has great potential, especially with the advancement of intelligent computation. Based on the current research, we highlight and discuss future research directions below.

Non-contact Capturing. Gait-based analyses depend on devices that capture and measure the relevant gait parameters. Capturing systems can be divided into three subsets: wearable sensors attached to the human body, floor sensors, and sensors for image or video processing [53]. There is no doubt about the validity of these sensor types for gait detection and measurement, as numerous investigations have demonstrated. Although wearable marker systems record motion trajectories very precisely, thanks to their relatively large number of markers and high sampling rates, they may cause discomfort and unnatural motion because they must be attached directly to the subject’s body. Force platforms, in turn, must be paved into the ground, which makes large-scale deployment prohibitively expensive. On the other hand, as surveillance cameras proliferate in public places such as streets, supermarkets, stations, airports, and offices, the advantage of non-contact gait measurement based on images or video becomes tremendous. It can supply the big data needed to train comprehensive models for identifying walkers’ emotions while avoiding overfitting [40]. It is also a more natural way to record people’s gait, since there is no disturbance to their emotional expression compared with attaching markers to the body. Another emerging technology, the depth sensor [86], which can be integrated with an RGB sensor, has become popular; it measures the distance of objects as coordinate data. Studies identifying emotion from gait using RGB-D information are still rare, and this will presumably become one of the breakthroughs of the future.

Intelligent Classification. From Table IV, we find that no study in gait-based emotion recognition yet takes advantage of deep learning techniques, which would probably become one of the future breakthroughs. Deep learning algorithms have achieved great success in artificial intelligence: convolutional neural networks (CNNs) have attained massive achievements in image-based tasks, while recurrent neural networks (RNNs) handle sequence-based problems [79]. For gait recognition, deep CNNs can aid classification and cope with cross-view variance [82], whereas RNNs can tackle the temporal patterns of gait. Furthermore, combining different deep neural networks may yield better outcomes, since a single network cannot ensure a comprehensive solution; for instance, C3D+ConvLSTM, a hybrid network fusing a CNN with an LSTM, performs well in motion recognition [87]. As gait-based emotion recognition is intrinsically a classification task, it should take full advantage of fusing various deep neural networks. Moreover, if recognition is to run at scale on RGB or RGB-D streams, as in public surveillance, several problems must be handled. Many subjects will be captured simultaneously in the view, leading to targeting confusion; the key solution is parallel target segmentation [83]. At the same time, the segmentation must be intelligent enough to verify that a target keeps the same identity over time, which again depends on intelligent computation. How to make full use of computational intelligence to boost classification accuracy is thus becoming momentous.

Large-scale Data Sets. Alongside the development of computational intelligence, big data is sorely needed for deep network training, to improve the generalization and robustness of the models. Several elements should be considered when establishing a large, comprehensive data set for gait-based emotion recognition. First, the number of participants should be large enough. Second, the participants should be varied: different genders, a wide range of ages, various heights and weights, with or without carried objects, and so on. Third, emotion elicitation matters: participants producing accurate emotional expression is crucial because it yields clear, reliable labels for model training, so finding an appropriate way to elicit volunteers’ emotions naturally is a significant aspect. Fourth, volunteers should walk in various spatial contexts, since different surfaces such as sand or lawn may influence the gait. Last but not least, it is advisable to collect data from different camera views and in different formats, enriching the information and enabling more generalized models. The HumanID Gait Challenge data set is a good template for an emotion-gait data set: it consists of 1,870 sequences from 122 subjects spanning five covariates, including change in viewing angle, change in shoe type, change in walking surface, carrying or not carrying a briefcase, and elapsed time between compared sequences [67]. These requirements are not easy to fulfill but are necessary for building a large-scale, comprehensive data set.

Online Emotion Prediction. If the above requirements are fulfilled, online emotion prediction from human gait becomes possible as an ultimate goal. However, online prediction requires the data analysis to run continuously. Unlike object detection, which can be achieved almost in real time [61], online emotion prediction from gait must consider historic steps, since a gait cycle has a duration. This places a high demand on computational capability: deep networks are usually time-consuming, so real-time recognition is hard to guarantee. Developing more efficient methods for online emotion recognition with fewer computing resources is therefore a challenging and important task.
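One way to frame the buffering requirement is a fixed-length frame buffer that only signals readiness once a full gait cycle of history is available; this is a hypothetical sketch of that bookkeeping, not a method from any cited work, and the cycle length is an assumed value.

```python
from collections import deque
import numpy as np

class OnlineGaitBuffer:
    """Minimal sketch of the buffering an online recognizer needs:
    it keeps enough historic frames to cover one gait cycle before
    a downstream classifier (not shown) can be applied."""

    def __init__(self, cycle_frames):
        self.frames = deque(maxlen=cycle_frames)  # old frames drop off automatically

    def push(self, frame):
        """Add one skeleton/sensor frame; return True once the buffer
        holds a full cycle and classification can run."""
        self.frames.append(frame)
        return len(self.frames) == self.frames.maxlen

buf = OnlineGaitBuffer(cycle_frames=30)   # roughly 1 s of history at 30 fps
ready = [buf.push(np.zeros(3)) for _ in range(35)]
```

Because the deque discards the oldest frame on each push, the recognizer sees a constant-size sliding history, which keeps per-frame computation bounded.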

Vii Conclusions

This survey paper has provided a comprehensive study of the kinematics of emotional gaits and reviewed emerging studies on gait-based emotion recognition. It has discussed the technical details of gait analysis for emotion recognition, including data collection, data preprocessing, feature selection, dimension reduction, and classification. We conclude that gait is a practical modality for emotion recognition, alongside its existing roles in motion-abnormality detection, neurological disorder diagnosis, and walker identification for security purposes. Gait analysis can fill the gaps in automated emotion recognition when neither speech nor facial expression is feasible, as in long-distance observation. Although many research challenges remain, the technique is of great practical value. Fortunately, more and more noninvasive, cost-effective, and unobtrusive sensors are available nowadays, such as surveillance cameras installed in public, making the collection of big data possible. Coupled with the development of intelligent computation, gait-based emotion recognition has great potential to become fully automated and to reach a higher level supporting a broader range of applications.


  • [1] F. Ahmed, B. Sieu, and M. L. Gavrilova (2018) Score and rank-level fusion for emotion recognition using genetic algorithm. In 2018 IEEE 17th International Conference on Cognitive Informatics & Cognitive Computing (ICCI* CC), pp. 46–53. Cited by: §V-C, TABLE IV.
  • [2] L. Avanzino, G. Lagravinese, G. Abbruzzese, and E. Pelosin (2018) Relationships between gait and emotion in parkinson’s disease: a narrative review. Gait & posture. Cited by: §I.
  • [3] R. Baker, A. Esquenazi, M. G. Benedetti, and K. Desloovere (2016-08) Gait analysis: clinical facts.. European journal of physical and rehabilitation medicine 52 (4), pp. 560–74. External Links: ISSN 1973-9095, Link Cited by: §I.
  • [4] E. Baratin, L. Sugavaneswaran, K. Umapathy, C. Ioana, and S. Krishnan (2015) Wavelet-based characterization of gait signal for neurological abnormalities. Gait & posture 41 (2), pp. 634–639. Cited by: item 2.
  • [5] A. Barliya, L. Omlor, M. A. Giese, A. Berthoz, and T. Flash (2013) Expression of emotion in the kinematics of locomotion. Experimental brain research 225 (2), pp. 159–176. Cited by: §IV-B, TABLE III, 2nd item, §V-A.
  • [6] G. Bianchi and R. Sorrentino (2007) Electronic filter simulation & design. McGraw-Hill New York. Cited by: item 1.
  • [7] S. Bouisset and M. Zattara (1987) Biomechanical study of the programming of anticipatory postural adjustments associated with voluntary movement. journal of Biomechanics 20 (8), pp. 735–742. Cited by: §III-A.
  • [8] N.V. Boulgouris, D. Hatzinakos, and K.N. Plataniotis (2005) Gait recognition: a challenging signal processing technology for biometric identification. IEEE Signal Processing Magazine 22 (6), pp. 78–90. External Links: Document, ISSN 1053-5888, Link Cited by: §I.
  • [9] S. Butterworth (1930) On the theory of filter amplifiers. Wireless Engineer 7 (6), pp. 536–541. Cited by: item 1.
  • [10] G. Castellano, S. D. Villalba, and A. Camurri (2007) Recognising Human Emotions from Body Movement and Gesture Dynamics. In Affective Computing and Intelligent Interaction, pp. 71–82. External Links: Document, Link Cited by: §I.
  • [11] T. Chau (2001) A review of analytical techniques for gait data. part 1: fuzzy, statistical and fractal methods. Gait & posture 13 (1), pp. 49–66. Cited by: 1st item.
  • [12] M. Chen and J. A. Bargh (1999) Consequences of automatic evaluation: immediate behavioral predispositions to approach or avoid the stimulus. Personality and social psychology bulletin 25 (2), pp. 215–224. Cited by: §IV-A.
  • [13] M. Chiu, J. Shu, and P. Hui (2018) Emotion recognition through gait on mobile devices. In 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), pp. 800–805. Cited by: §V-A, §V-C, TABLE IV.
  • [14] M. Coulson (2004) Attributing Emotion to Static Body Postures: Recognition Accuracy, Confusions, and Viewpoint Dependence. Journal of Nonverbal Behavior 28 (2), pp. 117–139. External Links: Document, ISSN 0191-5886, Link Cited by: §I.
  • [15] J. E. Cutting and L. T. Kozlowski (1977-05) Recognizing friends by their walk: Gait perception without familiarity cues. Bulletin of the Psychonomic Society 9 (5), pp. 353–356. External Links: Document, ISSN 0090-5054, Link Cited by: §I.
  • [16] K. J. Deluzio, U. P. Wyss, B. Zee, P. A. Costigan, and C. Serbie (1997) Principal component models of knee kinematics and kinetics: normal vs. pathological gait patterns. Human Movement Science 16 (2-3), pp. 201–217. Cited by: 1st item.
  • [17] M. Destephe, T. Maruyama, M. Zecca, K. Hashimoto, and A. Takanishi (2013) The influences of emotional intensity for happiness and sadness on walking. In Engineering in Medicine and Biology Society (EMBC), 2013 35th Annual International Conference of the IEEE, pp. 7452–7455. Cited by: item 1, §V-A.
  • [18] P. Ekman and W. V. Friesen (1967-06) Head and body cues in the judgment of emotion: a reformulation. Perceptual and Motor Skills 24 (3), pp. 711–724. External Links: Document, ISSN 0031-5125, Link Cited by: 2nd item.
  • [19] P. Ekman (1993) Facial expression and emotion.. American psychologist 48 (4), pp. 384. Cited by: §II.
  • [20] M. El Ayadi, M. S. Kamel, and F. Karray (2011-03) Survey on speech emotion recognition: Features, classification schemes, and databases. Pattern Recognition 44 (3), pp. 572–587. External Links: Document, ISSN 0031-3203, Link Cited by: §I.
  • [21] B. Fasel and J. Luettin (2003-01) Automatic facial expression analysis: a survey. Pattern Recognition 36 (1), pp. 259–275. External Links: Document, ISSN 0031-3203, Link Cited by: §I.
  • [22] J. R. Gage (1993-03) Gait analysis. An essential tool in the treatment of cerebral palsy.. Clinical orthopaedics and related research (288), pp. 126–34. External Links: ISSN 0009-921X, Link Cited by: §I.
  • [23] T. Gélat and C. F. Chapus (2015) Reaction time in gait initiation depends on the time available for affective processing. Neuroscience letters 609, pp. 69–73. Cited by: §IV-A, §IV-A, TABLE I.
  • [24] T. Gélat, L. Coudrat, and A. Le Pellec (2011) Gait initiation is affected during emotional conflict. Neuroscience letters 497 (1), pp. 64–67. Cited by: §IV-A, §IV-A, TABLE I.
  • [25] M. M. Gross, E. A. Crane, and B. L. Fredrickson (2012) Effort-shape and kinematic assessment of bodily expression of emotion during gait. Human movement science 31 (1), pp. 202–221. Cited by: §IV-B, TABLE II, TABLE III, item 1, §V-A.
  • [26] M. M. Gross, E. A. Crane, and B. L. Fredrickson (2012-02) Effort-Shape and kinematic assessment of bodily expression of emotion during gait. Human Movement Science 31 (1), pp. 202–221. External Links: Document, ISSN 0167-9457, Link Cited by: §I.
  • [27] H. Gunes and M. Piccardi (2009-02) Automatic Temporal Segment Detection and Affect Recognition From Face and Body Display. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 39 (1), pp. 64–84. External Links: Document, ISSN 1083-4419, Link Cited by: 1st item.
  • [28] S. Halovic and C. Kroos (2018) Not all is noticed: kinematic cues of emotion-specific gait. Human movement science 57, pp. 478–488. Cited by: §IV-B, §IV-B, TABLE III.
  • [29] A. R. Ismail and S. S. Asfour (1999) Discrete wavelet transform: a tool in smoothing kinematic data. Journal of biomechanics 32 (3), pp. 317–321. Cited by: item 2.
  • [30] J. Jankovic (2008-04) Parkinson’s disease: clinical features and diagnosis.. Journal of neurology, neurosurgery, and psychiatry 79 (4), pp. 368–76. External Links: Document, ISSN 1468-330X, Link Cited by: §I.
  • [31] D. Janssen, W. I. Schöllhorn, J. Lubienetzki, K. Fölling, H. Kokenge, and K. Davids (2008) Recognition of emotions in gait patterns by means of artificial neural nets. Journal of Nonverbal Behavior 32 (2), pp. 79–92. Cited by: §V-A, §V-C, TABLE IV.
  • [32] K. Jellinger, D. Armstrong, H. Y. Zoghbi, and A. K. Percy (1988) Neuropathology of Rett syndrome.. Acta neuropathologica 76 (2), pp. 142–58. External Links: ISSN 0001-6322, Link Cited by: §I.
  • [33] A. Kale, A. Sundaresan, A.N. Rajagopalan, N.P. Cuntoor, A.K. Roy-Chowdhury, V. Kruger, and R. Chellappa (2004-09) Identification of Humans Using Gait. IEEE Transactions on Image Processing 13 (9), pp. 1163–1173. External Links: Document, ISSN 1057-7149, Link Cited by: §I.
  • [34] G. E. Kang and M. M. Gross (2015) Emotional influences on sit-to-walk in healthy young adults. Human movement science 40, pp. 341–351. Cited by: §IV-B, item 1.
  • [35] G. E. Kang and M. M. Gross (2016) The effect of emotion on movement smoothness during gait in healthy young adults. Journal of biomechanics 49 (16), pp. 4022–4027. Cited by: §IV-B, TABLE III, item 1.
  • [36] M. Karg, K. Kuhnlenz, and M. Buss (2010) Recognition of affect based on gait patterns. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 40 (4), pp. 1050–1061. Cited by: Fig. 6, §V-A, §V-C, TABLE IV.
  • [37] K. H. Kim, S. W. Bang, and S. R. Kim (2004-05) Emotion recognition system using short-term monitoring of physiological signals. Medical & Biological Engineering & Computing 42 (3), pp. 419–427. External Links: Document, ISSN 0140-0118, Link Cited by: §I.
  • [38] R. Laban and L. Ullmann (1971) The mastery of movement.. Creative Activities, pp. 200. Cited by: §IV-B, §V-C.
  • [39] P. J. Lang, R. F. Simons, M. Balaban, and R. Simons (2013) Attention and orienting: sensory and motivational processes. Psychology Press. Cited by: §IV-A.
  • [40] Y. LeCun, Y. Bengio, and G. Hinton (2015) Deep learning. nature 521 (7553), pp. 436. Cited by: §VI.
  • [41] M. R. Lemke, T. Wendorff, B. Mieth, K. Buhl, and M. Linnemann (2000) Spatiotemporal gait patterns during over ground locomotion in major depression compared with healthy controls. Journal of psychiatric research 34 (4-5), pp. 277–283. Cited by: §IV-B, TABLE III, §V-A.
  • [42] B. Li, C. Zhu, S. Li, and T. Zhu (2016) Identifying emotions from non-contact gaits information based on Microsoft Kinects. IEEE Transactions on Affective Computing. Cited by: item 1, item 2, §V-A, §V-C, TABLE IV.
  • [43] S. Li, L. Cui, C. Zhu, B. Li, N. Zhao, and T. Zhu (2016) Emotion recognition using Kinect motion capture data of human gaits. PeerJ 4, pp. e2364. Cited by: item 1, item 2, §V-A, §V-C, TABLE IV.
  • [44] X. Li, H. Zhou, S. Song, T. Ran, and X. Fu (2005) The reliability and validity of the Chinese version of abbreviated PAD emotion scales. In International Conference on Affective Computing and Intelligent Interaction, pp. 513–518. Cited by: Fig. 1.
  • [45] F. Loula, S. Prasad, K. Harber, and M. Shiffrar (2005-02) Recognizing people from their movement. Journal of Experimental Psychology: Human Perception and Performance 31 (1), pp. 210–220. Cited by: §I.
  • [46] S. Mallat and W. L. Hwang (1992) Singularity detection and processing with wavelets. IEEE transactions on information theory 38 (2), pp. 617–643. Cited by: item 2.
  • [47] A. Mehrabian (2017) Nonverbal communication. Routledge. Cited by: §II.
  • [48] J. Michalak, N. F. Troje, J. Fischer, P. Vollmar, T. Heidenreich, and D. Schulte (2009) Embodiment of sadness and depression gait patterns associated with dysphoric mood. Psychosomatic medicine 71 (5), pp. 580–587. Cited by: §I, §IV-B, TABLE III, §V-A.
  • [49] J. A. Mikels, B. L. Fredrickson, G. R. Larkin, C. M. Lindberg, S. J. Maglio, and P. A. Reuter-Lorenz (2005) Emotional category data on images from the international affective picture system. Behavior research methods 37 (4), pp. 626–630. Cited by: §II.
  • [50] J. M. Montepare, S. B. Goldstein, and A. Clausen (1987) The identification of emotions from gait information. Journal of Nonverbal Behavior 11 (1), pp. 33–42. Cited by: §I.
  • [51] J. M. Montepare, S. B. Goldstein, and A. Clausen (1987) The identification of emotions from gait information. Journal of Nonverbal Behavior 11 (1), pp. 33–42. Cited by: §IV-B, TABLE III.
  • [52] J. D. Morris (1995) Observations: SAM: the Self-Assessment Manikin; an efficient cross-cultural measurement of emotional response. Journal of Advertising Research 35 (6), pp. 63–68. Cited by: §II.
  • [53] A. Muro-De-La-Herran, B. Garcia-Zapirain, and A. Mendez-Zorrilla (2014) Gait analysis methods: an overview of wearable and non-wearable systems, highlighting clinical applications. Sensors 14 (2), pp. 3362–3394. Cited by: §VI.
  • [54] K. M. Naugle, C. J. Hass, D. Bowers, and C. M. Janelle (2012) Emotional state affects gait initiation in individuals with parkinson’s disease. Cognitive, Affective, & Behavioral Neuroscience 12 (1), pp. 207–219. Cited by: §IV-A, §IV-A, TABLE I.
  • [55] M. Nyan, F. Tay, K. Seah, and Y. Sitoh (2006) Classification of gait patterns in the time–frequency domain. Journal of biomechanics 39 (14), pp. 2647–2656. Cited by: item 2.
  • [56] S. J. Olney, M. P. Griffin, and I. D. McBride (1998) Multivariate examination of data from gait analysis of persons with stroke. Physical Therapy 78 (8), pp. 814–828. Cited by: 1st item.
  • [57] L. Omlor and M. A. Giese (2007) Extraction of spatio-temporal primitives of emotional body expressions. Neurocomputing 70 (10-12), pp. 1938–1942. Cited by: §V-A, §V-C, TABLE IV.
  • [58] A. Ortony, G. L. Clore, and A. Collins (1990) The cognitive structure of emotions. Cambridge university press. Cited by: §II.
  • [59] J. G. Proakis (2001) Digital signal processing: principles, algorithms and applications. Pearson Education India. Cited by: item 2.
  • [60] J. C. Quiroz, E. Geangu, and M. H. Yong (2018) Emotion recognition using smart watch sensor data: mixed-design study. JMIR Mental Health 5 (3). Cited by: §V-A, §V-C, TABLE IV.
  • [61] S. Ren, K. He, R. Girshick, and J. Sun (2015) Faster R-CNN: towards real-time object detection with region proposal networks. In Advances in Neural Information Processing Systems, pp. 91–99. Cited by: §VI.
  • [62] C. L. Roether, L. Omlor, A. Christensen, and M. A. Giese (2009) Critical features for the perception of emotion from gait. Journal of vision 9 (6), pp. 15–15. Cited by: §IV-B, TABLE III, 2nd item, §V-A.
  • [63] E. T. Rolls (2005) Emotion explained. Oxford University Press, USA. Cited by: §II.
  • [64] J. A. Russell and A. Mehrabian (1977) Evidence for a three-factor theory of emotions. Journal of research in Personality 11 (3), pp. 273–294. Cited by: §V-C.
  • [65] H. Sadeghi, P. Allard, and M. Duhaime (1997) Functional gait asymmetry in able-bodied subjects. Human Movement Science 16 (2-3), pp. 243–258. Cited by: 1st item.
  • [66] E. Sariyanidi, H. Gunes, and A. Cavallaro (2015-06) Automatic analysis of facial affect: a survey of registration, representation, and recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence 37 (6), pp. 1113–1133. Cited by: §I.
  • [67] S. Sarkar, P. J. Phillips, Z. Liu, I. R. Vega, P. Grother, and K. W. Bowyer (2005) The HumanID gait challenge problem: data sets, performance, and analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence 27 (2), pp. 162–177. Cited by: §VI.
  • [68] K. R. Scherer (1999) Appraisal theory. Handbook of cognition and emotion, pp. 637–663. Cited by: §II.
  • [69] R. Shiavi and P. Griffin (1981) Representing and clustering electromyographic gait patterns with multivariate techniques. Medical and Biological Engineering and Computing 19 (5), pp. 605–611. Cited by: 1st item.
  • [70] B. Stephens-Fripp, F. Naghdy, D. Stirling, and G. Naghdy (2017) Automatic affect perception based on body gait and posture: a survey. International Journal of Social Robotics 9 (5), pp. 617–641. Cited by: §I.
  • [71] J. Stins and P. Beek (2011) Organization of voluntary stepping in response to emotion-inducing pictures. Gait & posture 34 (2), pp. 164–168. Cited by: §IV-A, §IV-A, TABLE I.
  • [72] J. F. Stins, L. M. van Gelder, L. M. Oudenhoven, and P. J. Beek (2015) Biomechanical organization of gait initiation depends on the timing of affective processing. Gait & posture 41 (1), pp. 159–163. Cited by: §IV-A, §IV-A, TABLE I.
  • [73] B. Sun, Z. Zhang, X. Liu, B. Hu, and T. Zhu (2017) Self-esteem recognition based on gait pattern using Kinect. Gait & Posture 58, pp. 428–432. Cited by: item 2.
  • [74] C. L. Vaughan, B. L. Davis, and J. C. O'Connor (1999) Dynamics of human gait. Cited by: §III-B.
  • [75] G. Venture (2010-08) Human characterization and emotion characterization from gait. In 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, pp. 1292–1295. Cited by: §V-C.
  • [76] G. Venture, H. Kadone, T. Zhang, J. Grèzes, A. Berthoz, and H. Hicheur (2014) Recognizing emotions conveyed by human gait. International Journal of Social Robotics 6 (4), pp. 621–632. Cited by: §V-A, §V-C, TABLE IV.
  • [77] R. D. Walk and K. L. Walters (1988) Perception of the smile and other emotions of the body and face at different distances. Bulletin of the Psychonomic Society 26 (6), pp. 510. Cited by: 1st item.
  • [78] H. G. Wallbott (1998-11) Bodily expression of emotion. European Journal of Social Psychology 28 (6), pp. 879–896. Cited by: §I.
  • [79] P. Wang, W. Li, P. Ogunbona, J. Wan, and S. Escalera (2018) RGB-D-based human motion recognition with deep learning: a survey. Computer Vision and Image Understanding. Cited by: §VI.
  • [80] S. C. White and D. A. Winter (1992) Predicting muscle forces in gait from emg signals and musculotendon kinematics. Journal of Electromyography and Kinesiology 2 (4), pp. 217–231. Cited by: item 3.
  • [81] M. Wootten, M. Kadaba, and G. Cochran (1990) Dynamic electromyography. i. numerical representation using principal component analysis. Journal of Orthopaedic Research 8 (2), pp. 247–258. Cited by: 1st item.
  • [82] Z. Wu, Y. Huang, L. Wang, X. Wang, and T. Tan (2017) A comprehensive study on cross-view gait based human identification with deep CNNs. IEEE Transactions on Pattern Analysis and Machine Intelligence (2), pp. 209–226. Cited by: §VI.
  • [83] H. Zacharatos, C. Gatzoulis, and Y. L. Chrysanthou (2014) Automatic emotion recognition based on body movement analysis: a survey. IEEE computer graphics and applications 34 (6), pp. 35–45. Cited by: §VI.
  • [84] S. Zhang, Z. Wu, H. M. Meng, and L. Cai (2010) Facial expression synthesis based on emotion dimensions for affective talking avatar. In Modeling machine emotions for realizing intelligence, pp. 109–132. Cited by: Fig. 1, §II.
  • [85] Z. Zhang, Y. Song, L. Cui, X. Liu, and T. Zhu (2016) Emotion recognition based on customized smart bracelet with built-in accelerometer. PeerJ 4, pp. e2258. Cited by: §V-A, §V-C, TABLE IV.
  • [86] Z. Zhang (2012) Microsoft Kinect sensor and its effect. IEEE MultiMedia 19 (2), pp. 4–10. Cited by: §VI.
  • [87] G. Zhu, L. Zhang, P. Shen, and J. Song (2017) Multimodal gesture recognition using 3-d convolution and convolutional lstm. IEEE Access 5, pp. 4517–4524. Cited by: §VI.