Computer says "No": The Case Against Empathetic Conversational AI

12/21/2022
by Alba Curry, et al.

Emotions are an integral part of human cognition: they guide not only our understanding of the world but also our actions within it. As such, whether we soothe or inflame an emotion is not inconsequential. Recent work in conversational AI has focused on responding empathetically to users, validating and soothing their emotions without any real basis for doing so. This AI-aided emotional regulation can have negative consequences for users and society, tending towards a one-note happiness defined merely as the absence of "negative" emotions. We argue that we must carefully consider whether and how to respond to users' emotions.


