"AIded with emotions" - a new design approach towards affective computer systems

06/11/2018
by Barbara Giżycka, et al.
AGH

As technologies become more and more pervasive, there is a need to consider the affective dimension of interaction with computer systems in order to make them more human-like. Current demands in this matter include accurate emotion recognition, reliable emotion modeling, and the use of unobtrusive, easily accessible, and preferably wearable measurement devices. While AI methods provide many possibilities for better affective information processing, it is not a common scenario for emotion recognition and modeling to be integrated in the design phase. To address this concern, we propose a new approach based on affective design patterns in the context of video games, together with a summary of experiments conducted to test our preliminary hypotheses.


1 Introduction

People nowadays deal with computers and other computer-based devices on an everyday basis. It has been several decades since such technologies ceased to be used only by a narrow group of users (such as engineers or academics). However, casual users have different needs and demands regarding the appliances they interact with. Making interaction as natural and easy as possible therefore becomes a necessary consideration when developing new systems and applications.

New branches of study have emerged that investigate precisely these problems, seeking solutions for bringing people’s encounters with machines to a friendlier level. Human-Computer Interaction (HCI) and Affective Computing (AfC) are two such interdisciplinary domains. Their origins can be traced back to the 1980s and 1990s respectively, and both benefit from findings in computer science, psychology, anthropology, and ergonomics, among others. HCI embraces a vast range of interaction aspects, including interface design, controls, and usability engineering.

Essentially, usability is a key concept of HCI. Originally it was equated with how easy a system is to learn, use, and maintain, and with its reliability. Today, it has grown to encompass more characteristics, such as the fun factor, as well as efficiency and creativity enhancement. Each of these features is influenced by the affective dimension of interaction, meaning that the general emotional state of the user is of crucial importance. Consequently, human emotions have to be taken into consideration in human-computer interaction [Thompson and McGill2015].

In 1997, Rosalind Picard of the MIT Media Lab published “Affective Computing”, the first handbook of the discipline [Picard1997]. Sometimes referred to as emotion AI, this approach aspires to collect data on human physiological states, as well as information on behavioral metrics, and to use this data to develop affect models of the user. These models can then be used to infer the user’s affective responses to what is happening during interaction with the system, and to make the software behave as if it “understood” human emotions. The motivation for such an approach is to make computer systems more human-like by augmenting the interaction with them with an affective dimension. Moreover, it answers the need for personalized systems capable of adapting to the individual preferences and habits of specific users. Selected fields of application of AfC include context-aware recommendation systems, tutoring applications, therapy and telemedicine, and video games and serious games.

In affective computer systems, large amounts of sensor data need to be recorded and processed. Various AI techniques, mainly machine learning and probabilistic graphical models, provide effective means to analyze such immense volumes of information and derive meaningful interpretations of it [Koller and Friedman2009]. Additionally, certain methods, such as symbolic reasoning, can make the whole system more understandable for the user, thus improving its transparency and implied controllability.

In our research, we set out from the context of applying AfC solutions in the field of video games. We take a higher-level perspective, and aim to show how the integration of affect detection and affect modeling can be handled early in the design phase. Furthermore, we emphasize the importance of using AI methods for developing the necessary data processing and emotion modeling modes.

This paper’s original contribution lies in engaging the affective loop mechanism in a way that is introduced at the design level. To ensure dynamic human-computer interaction that takes the emotional dimension into account, we suggest a design proposition based on affective game design patterns. We argue that physiological reaction patterns, corresponding to the emotional responses of a person, can be identified and correlated with the affective design patterns [Giżycka and Nalepa2018]. Notably, we presume that affectively meaningful human-computer interaction can be conducted with the use of unobtrusive, wearable sensory devices [Nalepa et al.2018].

All things considered, we propose a new approach to designing affective video games, which enables the integration of emotion detection and emotion modeling. We believe it is important and novel that our design concept allows these aspects to meet already in the design phase. At the same time, we emphasize the benefits of using AI methods in both of the distinguished processes. The account we suggest is grounded in the notion of the affective loop, which in our proposal is realized by two components: affective game design patterns, and affective physiological reaction patterns of the player. We satisfy the current demand for ubiquity of AfC hardware by focusing on wearable sensors.

The rest of the paper is organized as follows. In Sect. 2 we explain how the affective loop is realized in video games, our research ground of choice. Sect. 3 provides a deeper insight into the cornerstone of our suggested approach – the affective game design patterns. Next, in Sect. 4, we describe the experiments conducted so far to verify our hypotheses. The paper is concluded with an overview of selected related studies in Sect. 5, followed by conclusions and future research directions in Sect. 6.

2 Affective Loop and Games

One could say that affective gaming is the Holy Grail of affective computing, because it realizes the affective loop [Höök2008] in the most clear and apparent way. The affective game engine reacts in real time to the player’s (affective) input by detecting relevant data, interpreting it, and generating a response by modifying various game parameters. This, in turn, creates new circumstances for the player, to which new emotional reactions may arise, and the loop continues. The changes considered may appear at the level of mechanics, but also of dynamics and even aesthetics [Hunicke et al.2004]. Bringing the affective dimension into games amplifies the designer’s intended effects – be it player satisfaction, educational value, or therapeutic impact.
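
As a minimal sketch, the sense-interpret-adapt cycle of the affective loop could look like the following; the thresholds, scaling constants, and simulated sensor readings are entirely illustrative assumptions, not the actual engine described in this paper:

```python
import random

def detect_affect():
    """Stand-in for real sensor input: returns (heart_rate_bpm, gsr_microsiemens)."""
    return random.uniform(60, 120), random.uniform(1.0, 10.0)

def interpret(hr, gsr, hr_rest=70.0, gsr_rest=2.0):
    """Map raw signals to a coarse arousal estimate in [0, 1]; constants are illustrative."""
    hr_component = max(0.0, min(1.0, (hr - hr_rest) / 50.0))
    gsr_component = max(0.0, min(1.0, (gsr - gsr_rest) / 8.0))
    return 0.5 * hr_component + 0.5 * gsr_component

def adapt_game(arousal, difficulty):
    """Close the loop: ease off when arousal is high, push harder when it is low."""
    if arousal > 0.7:
        return max(1, difficulty - 1)
    if arousal < 0.3:
        return difficulty + 1
    return difficulty

# Three iterations of the loop: sense, interpret, adapt.
difficulty = 5
for _ in range(3):
    hr, gsr = detect_affect()
    difficulty = adapt_game(interpret(hr, gsr), difficulty)
```

In a real engine, the adaptation step would modify mechanics, dynamics, or aesthetics rather than a single difficulty value.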

In comparison to other software, games are by their nature an extremely interactive medium. It is common for them to engage in a specific type of communication with the player. This kind of interplay is often evaluated in terms of immersiveness, which refers to the perceived engagement in the challenge that the player willingly takes up [Suits2005]. Usually it is directly connected to the notion of flow [Csikszentmihalyi1990], a state in which a person stays focused on the performed task while feeling enjoyment and motivation. This is precisely what is addressed by the affective loop.

The affective loop [Höök2008], although a novel and innovative idea in itself, leaves much room for improvement. As noted before, affects may arise at many different levels of interaction (from mechanics, through dynamics, to aesthetics). Nowadays, emotion detection and modeling can be supported by Artificial Intelligence (AI) methods, which can contribute to better affect recognition and interpretation. This, however, should be considered early in the design phase.

As such, we suggest a new, improved approach to video game and serious game design. Our aim is to create an environment where NPC affect models and player affect models converge. To meet current standards for pervasive and ubiquitous computing, we focus on easily accessible, off-the-shelf wearable sensors, such as wristbands, as the affective data acquisition hardware. Moreover, we recognize the importance of a carefully conducted design phase in the context of gaming. To achieve this, we propose a design concept based on affective game design patterns.

3 Affective Game Design Patterns Framework

Some solutions and mechanics tend to reappear in many different games, even across genres. These include, for example, the action of Traversing the game world or its levels, Collecting in-game objects (Pick-ups), providing the player with Perfect (or Imperfect) Information about game states, Cooperation or Competition as a general playing strategy, and many more. In the framework proposed by [Björk and Holopainen2005], such pervasive mechanisms, called game design patterns, can be distinguished and used by game developers and researchers alike. Besides providing a collection of patterns for recognizing different modes and motifs, the benefit also comes from the patterns forming a hierarchy and a complex network of relationships. Patterns may instantiate or modulate each other, or actually suppress other patterns from appearing (see Figure 1). In this way, a better, emergent design of the game can be achieved.
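
To illustrate, such a pattern record with instantiates/modulates/conflicts relations could be modeled as a simple data structure; the example patterns below follow the naming in the text, but the specific relations shown are our illustrative assumptions, not entries from the actual catalog:

```python
from dataclasses import dataclass, field

@dataclass
class GameDesignPattern:
    """Minimal pattern record: relations link patterns into a hierarchy and network."""
    name: str
    description: str
    instantiates: list = field(default_factory=list)   # patterns this one brings into play
    modulates: list = field(default_factory=list)      # patterns whose effect it changes
    conflicts_with: list = field(default_factory=list) # patterns it suppresses

time_limit = GameDesignPattern(
    name="Time Limit",
    description="A fixed amount of time to complete a goal.",
    instantiates=["Tension"],
    modulates=["Collecting"],
)
perfect_info = GameDesignPattern(
    name="Perfect Information",
    description="The player can see the complete game state.",
    conflicts_with=["Imperfect Information"],
)
```

A recommendation module could then traverse these relations to suggest compatible patterns or flag conflicting ones.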

Figure 1: An exemplary Game Design Pattern template, one of several that we consider in our research

Research on emotion [Fontaine et al.2007] suggests that several dimensions characteristic of these states can be distinguished. From the affective computing point of view, a more classic approach, taking into account mostly two aspects, is considered useful. These qualities are arousal and valence [Mauss and Robinson2009]. Arousal indicates the level of activity of a given state, while valence is associated with pleasure or the lack of it [Russell1980]. A biological connection to these dimensions can be found in reactions related to the Autonomic Nervous System (ANS). In particular, they can be measured using wearable devices recording Heart Rate (HR) and Skin Conductance/Galvanic Skin Response (GSR) levels [Cacioppo et al.2000].
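
As a hedged illustration of these two dimensions, the sketch below derives a crude arousal proxy from GSR deviation and places a (valence, arousal) pair into a quadrant of the circumplex; the scaling constant and quadrant labels are our own illustrative choices, not a validated mapping:

```python
def arousal_from_gsr(gsr_samples, baseline):
    """Crude arousal proxy: mean deviation of skin conductance from a resting
    baseline, clipped to [0, 1]; the scaling constant is illustrative."""
    mean_gsr = sum(gsr_samples) / len(gsr_samples)
    return max(0.0, min(1.0, (mean_gsr - baseline) / 5.0))

def circumplex_quadrant(valence, arousal):
    """Place a (valence, arousal) pair, each in [-1, 1], in a quadrant of
    Russell's circumplex; the labels are conventional shorthand."""
    if arousal >= 0:
        return "excited/alert" if valence >= 0 else "tense/distressed"
    return "calm/relaxed" if valence >= 0 else "sad/bored"
```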

Our motivation is the belief that some game design patterns by their nature evoke emotional reactions of the player’s ANS. Like other patterns, affective ones can occur on many different levels of gameplay, from mechanics and control modes to game world aesthetics and the interface. We focus on the more basic levels of interaction, and suggest that simple game events caused by including those patterns in the game design will elicit a player’s affective response (e.g., stress induced by a Time Limit, or by the appearance of Enemies). The player’s elicited emotions, in turn, can be observed at the level of physiological signals, including Heart Rate (HR) and Galvanic Skin Response (GSR). By correlating the affective game design patterns with these biological reaction patterns, we propose a new approach to game design which integrates both emotion detection and emotion modeling.
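
One simple building block for such correlation is extracting the physiological samples recorded around a pattern-tagged game event; in this sketch the window bounds are illustrative assumptions, not values used in our experiments:

```python
def response_window(signal, timestamps, event_time, pre=1.0, post=5.0):
    """Slice out the physiological samples recorded around a game event that
    was tagged with an affective design pattern (e.g. Enemies appearing).
    'pre' and 'post' are window bounds in seconds."""
    return [s for s, t in zip(signal, timestamps)
            if event_time - pre <= t <= event_time + post]
```

Windows gathered this way, grouped by pattern label, would be the raw material for correlating design patterns with physiological reaction patterns.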

In our proposed framework, the developer has a better understanding of the gaming experience already in the design phase, see Figure 2. This is enabled by access to information on the affective nature of various game design patterns, supported by the physiological emotional reaction patterns displayed by the player. Knowing which game elements evoke affective responses, and confronting this with how the player actually reacts to these elements, allows for a better realization of the affective loop mechanism. Additionally, as the affective loop relies greatly on the efficient processing of large amounts of data and on providing a convincingly responsive game environment, AI methods can provide just that.

Figure 2: A hierarchy of affective data gathering levels encompassing all affective game modeling perspectives

We are targeting the Unity game engine as our research ground, although there are other alternatives. We aim to support game designers with a catalog of game design patterns, with a focus on the affective ones. The catalog will be semantically annotated using the ontology of patterns that we are creating. During design, a recommendation module will help the designer select proper patterns, as well as relevant alternatives. Furthermore, we are developing an emotion detection layer, currently using the HR and GSR signals. Based on them, an emotion classification layer will be provided. We are currently evaluating several classification techniques, based on the work described in [Rincon et al.2016]. An emotion interpretation layer will be provided for the game developer to offer basic reasoning about the emotional state of the player. We aim to use ontological reasoning [Berthelon and Sander2013]. At this level, it will be possible to connect the symbolic model of the player’s emotions with some of the existing models of NPC emotions [Gonzalez-Sanchez et al.2011]. In order to verify our account, we present empirical studies in the next section.
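
The layered architecture described above might be sketched as three composable functions; the thresholds and labels below are placeholders for the trained classifier and the ontological reasoner, not the actual implementation:

```python
def detection_layer(raw_hr, raw_gsr):
    """Windowed features; real code would filter and segment the signals."""
    return {"hr_mean": sum(raw_hr) / len(raw_hr),
            "gsr_mean": sum(raw_gsr) / len(raw_gsr)}

def classification_layer(features, hr_rest=70.0, gsr_rest=2.0):
    """Toy classifier: fixed thresholds stand in for a trained model."""
    arousal = 1.0 if features["gsr_mean"] > gsr_rest * 2 else 0.0
    valence = 1.0 if features["hr_mean"] < hr_rest * 1.3 else -1.0
    return {"arousal": arousal, "valence": valence}

def interpretation_layer(affect):
    """Symbolic label for downstream (e.g. ontological) reasoning about the player."""
    if affect["arousal"] > 0.5 and affect["valence"] < 0:
        return "stressed"
    if affect["arousal"] > 0.5:
        return "engaged"
    return "calm"

# The layers compose: raw signal windows in, symbolic player state out.
state = interpretation_layer(
    classification_layer(detection_layer([88, 92, 95], [6.1, 6.4, 6.8])))
```

The symbolic output is the natural point at which to connect the player model with existing NPC emotion models.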

4 Experiments

The experiments described in this section only evaluate our initial assumptions regarding the emotion detection layer, as well as the gathering of data for the classification layer. Furthermore, we mention two practical game prototypes demonstrating the affective loop.

4.1 Outline of procedure

We prepared an examination consisting eventually of three phases. In the first stage, called Calibration Phase, the participant was presented with affective pictures [Marchewka et al.2014] for a fixed amount of time. The task was based on subjective evaluation of arousal evoked by every picture. Application was generated using PsychoPy Builder interface and then reprocessed in Python language. At the same time, the participant of the study wears devices that record HR and GSR. The goal of this phase was to create physiological patterns in response to the presented stimuli. Their preparation was to allow verification of hypothesis regarding the impact of affective game design patterns, and to allow development of applications with affective loop implemented in the program in the future works of other research [Nalepa et al.2017].
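
Aligning stimulus onsets with the independently recorded HR/GSR streams requires timestamped event logs; the sketch below is a minimal stand-in for this bookkeeping (the function name and timing scheme are our illustrative assumptions, not PsychoPy code):

```python
import time

def present_and_log(picture_ids, display_seconds):
    """Return (picture_id, onset_time) pairs so sensor streams recorded in
    parallel can be aligned with stimulus onsets afterwards."""
    events = []
    for pic in picture_ids:
        events.append((pic, time.time()))   # log onset before presentation
        time.sleep(display_seconds)         # stand-in for the actual display call
    return events
```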

The next phase was called the Gaming Phase, and the subject’s task was to play an affective computer game: a platform side-scrolling game designed using affective game design patterns adapted from another study [Nalepa et al.2017] (see Figure 3). The picture shows some of the patterns that we consider to have affective impact: Time Limit (a pie chart depicting the time left to complete the level), Indirect Information (time and score represented using a chart and a description rather than explicit numbers, compare with Figure 1), and Enemies (a Crow in the bottom-left part of Figure 3).

The whole program was created using the GameMaker environment. The main task of the player was to navigate the given space in order to score as many points as possible in a given time. As impediments, we used traps and opponents that disturbed the participant. Devices measuring physiological signals such as HR and GSR were also used in this stage of the experiment.

In the last phase, the task of the subject was to watch a neutral (in terms of valence) picture chosen from the NAPS database [Marchewka et al.2014]. After a fixed time, an audio stimulus was presented to the subject: a female cry with a strong affective impact. The reason for including this step was to acquire the participant’s readings of a strong reaction to an affective stimulus, to be used in further pattern development. The wearable devices were used once again.

Figure 3: A screenshot from the gameplay of “London Bridge”, a game designed specifically for the purposes of this research.

4.2 Used platforms

Physiological signals were measured using several devices. Two of them, BITalino and eHealth, are extendable platforms with boards and sensors available separately. The others, namely Empatica E4 and Microsoft Band 2, are intended for more general consumer use. Along with the sensors, a PC and a smartphone were used in our research.

Data from BITalino and eHealth was collected using standard Bluetooth and USB interfaces. The whole acquisition mechanism was written in Python. From the scientific point of view, the method of acquiring physiological signals from wristbands is more interesting. We decided to prepare our own solution, BandReader, an Android application (for more information on this application see [Kutt et al.]). It enables the user to collect data from several devices simultaneously.
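
Simultaneous acquisition from several devices can be sketched as one polling thread per device feeding a shared, thread-safe queue; the device names and read functions below are stand-ins, not the actual BandReader or BITalino APIs:

```python
import queue
import threading
import time

def read_device(device_name, out_queue, n_samples, read_fn):
    """Poll one device and push timestamped samples to a shared queue.
    'read_fn' stands in for the device-specific Bluetooth/USB read call."""
    for _ in range(n_samples):
        out_queue.put((device_name, time.time(), read_fn()))

samples = queue.Queue()
threads = [
    threading.Thread(target=read_device, args=("hr_band", samples, 5, lambda: 72)),
    threading.Thread(target=read_device, args=("gsr_band", samples, 5, lambda: 3.1)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Drain the queue into a single timestamp-ordered log for later analysis.
collected = [samples.get() for _ in range(samples.qsize())]
```

Timestamping each sample at the point of acquisition is what later allows streams from different devices to be aligned.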

4.3 Experiments summary

During our research we conducted a series of experiments consisting of different elements. The first examination was carried out on 6 people and included only the Calibration Phase. This experiment made it possible to test the created application. The Gaming Phase was added in another session of tests, in which 9 people were examined. The results of this study have been published in [Nalepa et al.2017].

In November and January, we conducted two studies to determine whether the devices we used are suitable for our purposes. We used the Neurobit device as a reference for HR and GSR, and a Polar Pulse chest strap for HR. 21 subjects participated. As in the previous tests, we applied the first two phases. The November and January experiments helped us determine which devices ought to be used in our future work.

Comparing data from the wearable devices against the neuromedical references helped us identify the devices most promising for research. We established that eHealth correctly measures HR, and that Empatica E4 can be useful for collecting GSR signals. Unfortunately, Microsoft Band 2 proved to be unreliable as a data acquisition device for our needs. Data from BITalino turned out to be highly accurate both in terms of HR and GSR signals. We therefore decided to focus on BITalino, which is also a well-supported and low-cost platform.
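
One plain way to quantify such device agreement is the Pearson correlation between a wearable's readings and the reference device's; the sample values below are made up for illustration, not measured data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sample series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical HR series from a wearable and a medical-grade reference.
wearable_hr = [70, 74, 80, 85, 90]
reference_hr = [71, 75, 79, 86, 91]
agreement = pearson_r(wearable_hr, reference_hr)  # close to 1.0 means high agreement
```

In practice, agreement checks would also look at absolute error and signal dropouts, not correlation alone.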

The last session of experiments was conducted in late March 2018. The goal was to collect data that could be used to create models for emotion detection and classification, enabling the analysis of emotional states. As we had opted for BITalino, in this session we used only this single measurement device. We employed all three phases of the experiment to acquire data containing as much information about emotional states as possible. We examined roughly 100 people in this way. We are currently still analyzing the data in order to build the emotion classification layer.

As mentioned, in the first experiments we used a prototype game developed with GameMaker. However, to address industry standards, we are now experimenting with our own game prototypes in Unity. In particular, together with Kamil Osuch from AGH UST, in May 2018 we developed a simple Asteroids game in Unity coupled with BITalino sensors to implement the affective loop. This prototype could be presented during the conference in the form of a playable demo.

5 Related Works

It is evident that the opportunity for AI to support affect models (including the gaming context) has already been recognized. This section provides an overview of selected other related research in this area.

In recent years, the general uses of AI in games have tended to cover four main aspects: player experience models, procedural content generation, massive-scale player data mining, and enhancing NPC capabilities [Yannakakis2012]. However, this does not seem to be clearly reflected in advances in affective realism in games [Hudlicka2009]. Nevertheless, various researchers have studied the use of, for example, machine learning for emotion recognition [Becker et al.2005, Sabourin et al.2011, Rani et al.2006, Shang2017], emotion modeling for providing predictions [Conati and Zhou2002, Zhou and Conati2003, Camilleri et al.2017, Shang2017], Dynamic Difficulty Adjustment (DDA) [Rani et al.2005, Liu et al.2009], and other aspects of games, such as camera control [Yannakakis et al.2010].

Meanwhile, other existing frameworks for affective game design do not seem to reach for AI methods at all. [Kołakowska et al.2013] and [Szwoch2016] propose specific approaches to the affective game design process. [Dormann et al.2013], [Caminha2017] and [Nalepa et al.2017] suggest using game design patterns and recurring game mechanics to develop the emotional layer of games. None of these accounts refers directly to facilitation using AI methods.

6 Conclusion and Future Plans

The key theme of this paper is to highlight an opportunity for AI to facilitate the affective dimension of human-computer interaction. Emotion detection and emotion modeling, when supported by AI methods, can benefit from enhanced accuracy, realism, and reliability. However, we believe that both of these aspects should be considered early in the design phase of the system. We chose video game design as our research ground, where the affective loop mechanism can be realized to the greatest extent. We propose a new approach to affective game design, along with a suggestion that it can be expanded to encompass all AI-dependent types of software.

Our concept identifies the game design phase as the best moment for introducing the affective loop, thus providing a meeting point for affective data collection and affect models. Both stages of the loop are enhanced by AI techniques. Furthermore, data collection is included in the loop by means of the affective physiological reaction patterns of the player. It is an essential assumption that those emotional responses are closely related to the affective design patterns that underlie the game design.

The future directions of the described research include further and deeper analysis of the data acquired during the conducted experiments. Specifically, we anticipate a search for correlations between affective design patterns and affective physiological reaction patterns. As we also consider a shift to another game environment (from GameMaker to Unity), we aim to introduce a proper affective loop into the experimental game design as well.

Another issue that will be addressed is the hardware setup. While the quality of the raw data acquired is now acceptable, there is still room for improvement regarding the unobtrusiveness requirement for the devices used. While the Empatica wristband is quite acceptable in this regard, the BITalino and eHealth electrodes have to be reconsidered, for example by packaging them into 3D-printed wearable devices.

We are also working on the integration of BITalino readings with the Unity game engine. To this end, we are using the Unity API developed for BITalino. At the moment, we are exploring how the game mechanics developed in Unity interact with real-life data from the player using our hardware setup.

Acknowledgments

The paper is supported by the AGH University research grant.

References

  • [Becker et al.2005] Christian Becker, Arturo Nakasone, Helmut Prendinger, Mitsuro Ishizuka, and Ipke Wachsmuth. Physiologically interactive gaming with the 3d agent max. 2005.
  • [Berthelon and Sander2013] F. Berthelon and P. Sander. Emotion ontology for context awareness. In 2013 IEEE 4th International Conference on Cognitive Infocommunications (CogInfoCom), pages 59–64, Dec 2013.
  • [Björk and Holopainen2005] Staffan Björk and Jussi Holopainen. Patterns in Game Design. Charles River Media, 2005.
  • [Cacioppo et al.2000] John T. Cacioppo, Gary G. Berntson, Jeff T. Larsen, Kirsten M. Poehlmann, and Tiffany A. Ito. The psychophysiology of emotion. In Handbook of emotions, pages 173–191. Guildford Press, 2000.
  • [Camilleri et al.2017] Elizabeth Camilleri, Georgios N Yannakakis, and Antonios Liapis. Towards general models of player affect. In Affective Computing and Intelligent Interaction (ACII), 2017 International Conference on, 2017.
  • [Caminha2017] David Capelo Chaves Caminha. Development of emotional game mechanics through the use of biometric sensors. 2017.
  • [Conati and Zhou2002] Cristina Conati and Xiaoming Zhou. Modeling students’ emotions from cognitive appraisal in educational games. In International Conference on Intelligent Tutoring Systems, pages 944–954. Springer, 2002.
  • [Csikszentmihalyi1990] Mihaly Csikszentmihalyi. Flow: The psychology of optimal performance. NY: Cambridge University Press, 40, 1990.
  • [Dormann et al.2013] Claire Dormann, Jennifer R Whitson, and Max Neuvians. Once more with feeling: Game design patterns for learning in the affective domain. Games and Culture, 8(4):215–237, 2013.
  • [Fontaine et al.2007] Johnny RJ Fontaine, Klaus R Scherer, Etienne B Roesch, and Phoebe C Ellsworth. The world of emotions is not two-dimensional. Psychological science, 18(12):1050–1057, 2007.
  • [Giżycka and Nalepa2018] Barbara Giżycka and Grzegorz J. Nalepa. Emotion in models meets emotion in design: building true affective games. submitted to IEEE GEM 2018, 2018.
  • [Gonzalez-Sanchez et al.2011] J. Gonzalez-Sanchez, M. E. Chavez-Echeagaray, R. Atkinson, and W. Burleson. Abe: An agent-based software architecture for a multimodal emotion recognition framework. In 2011 Ninth Working IEEE/IFIP Conference on Software Architecture, pages 187–193, June 2011.
  • [Höök2008] Kristina Höök. Affective loop experiences–what are they? In International Conference on Persuasive Technology, pages 1–12. Springer, 2008.
  • [Hudlicka2009] Eva Hudlicka. Affective game engines: motivation and requirements. In Proceedings of the 4th international conference on foundations of digital games, pages 299–306. ACM, 2009.
  • [Hunicke et al.2004] Robin Hunicke, Marc LeBlanc, and Robert Zubek. Mda: A formal approach to game design and game research. In Proceedings of the AAAI Workshop on Challenges in Game AI, volume 4, pages 1–5. AAAI Press San Jose, CA, 2004.
  • [Kołakowska et al.2013] Agata Kołakowska, Agnieszka Landowska, Mariusz Szwoch, Wioleta Szwoch, and Michał R Wróbel. Emotion recognition and its application in software engineering. In Human System Interaction (HSI), 2013 The 6th International Conference on, pages 532–539. IEEE, 2013.
  • [Koller and Friedman2009] Daphne Koller and Nir Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
  • [Kutt et al.] Krzysztof Kutt, Grzegorz J. Nalepa, Barbara Giżycka, Paweł Jemioło, and Marcin Adamczyk. Bandreader – a mobile application for data acquisition from wearable devices in affective computing experiments. submitted to ICAISC 2018.
  • [Liu et al.2009] Changchun Liu, Pramila Agrawal, Nilanjan Sarkar, and Shuo Chen. Dynamic difficulty adjustment in computer games through real-time anxiety-based affective feedback. International Journal of Human-Computer Interaction, 25(6):506–529, 2009.
  • [Marchewka et al.2014] Artur Marchewka, Łukasz Żurawski, Katarzyna Jednoróg, and Anna Grabowska. The Nencki Affective Picture System (NAPS): Introduction to a novel, standardized, wide-range, high-quality, realistic picture database. Behavior Research Methods, 46(2):596–610, 2014.
  • [Mauss and Robinson2009] Iris B. Mauss and Michael D. Robinson. Measures of emotion: A review. Cognition and Emotion, 23(2):209–237, 2009.
  • [Nalepa et al.2017] Grzegorz J. Nalepa, Barbara Gizycka, Krzysztof Kutt, and Jan K. Argasinski. Affective design patterns in computer games. scrollrunner case study. In Communication Papers of the 2017 Federated Conference on Computer Science and Information Systems, FedCSIS 2017, pages 345–352, 2017.
  • [Nalepa et al.2018] Grzegorz J. Nalepa, Krzysztof Kutt, and Szymon Bobek. Mobile platform for affective context-aware systems. Future Generation Computer Systems, 2018.
  • [Picard1997] Rosalind W. Picard. Affective Computing. MIT Press, 1997.
  • [Rani et al.2005] Pramila Rani, Nilanjan Sarkar, and Changchun Liu. Maintaining optimal challenge in computer games through real-time physiological feedback. In Proceedings of the 11th international conference on human computer interaction, volume 58, pages 22–27, 2005.
  • [Rani et al.2006] Pramila Rani, Changchun Liu, Nilanjan Sarkar, and Eric Vanman. An empirical study of machine learning techniques for affect recognition in human–robot interaction. Pattern Analysis and Applications, 9(1):58–69, 2006.
  • [Rincon et al.2016] Jaime Andres Rincon, Ângelo Costa, Paulo Novais, Vicente Julián, and Carlos Carrascosa. Using non-invasive wearables for detecting emotions with intelligent agents. In Manuel Graña, José Manuel López-Guede, Oier Etxaniz, Álvaro Herrero, Héctor Quintián, and Emilio Corchado, editors, International Joint Conference SOCO’16-CISIS’16-ICEUTE’16 - San Sebastián, Spain, October 19th-21st, 2016, Proceedings, volume 527 of Advances in Intelligent Systems and Computing, pages 73–84, 2016.
  • [Russell1980] J. A. Russell. A circumplex model of affect. Journal of Personality and Social Psychology, 39(6):1161–1178, 1980.
  • [Sabourin et al.2011] Jennifer Sabourin, Bradford Mott, and James C Lester. Modeling learner affect with theoretically grounded dynamic bayesian networks. In International Conference on Affective Computing and Intelligent Interaction, pages 286–295. Springer, 2011.
  • [Shang2017] Zhengkun Shang. Continuous affect recognition with different features and modeling approaches in evaluation-potency-activity space. Master’s thesis, University of Waterloo, 2017.
  • [Suits2005] B. Suits. The Grasshopper: Games, Life and Utopia. Broadview Press, 2005.
  • [Szwoch2016] Mariusz Szwoch. Evaluation of affective intervention process in development of affect-aware educational video games. In Computer Science and Information Systems (FedCSIS), 2016 Federated Conference on, pages 1675–1679. IEEE, 2016.
  • [Thompson and McGill2015] Nik Thompson and Tanya McGill. Affective human-computer interaction. In Encyclopedia of Information Science and Technology, Third Edition, pages 3712–3720. IGI Global, 2015.
  • [Yannakakis et al.2010] Georgios N Yannakakis, Héctor P Martínez, and Arnav Jhala. Towards affective camera control in games. User Modeling and User-Adapted Interaction, 20(4):313–340, 2010.
  • [Yannakakis2012] Georgios N Yannakakis. Game ai revisited. In Proceedings of the 9th conference on Computing Frontiers, pages 285–292. ACM, 2012.
  • [Zhou and Conati2003] Xiaoming Zhou and Cristina Conati. Inferring user goals from personality and behavior in a causal model of user affect. In Proceedings of the 8th international conference on Intelligent user interfaces, pages 211–218. ACM, 2003.