An Evaluation of Generative Pre-Training Model-based Therapy Chatbot for Caregivers

07/28/2021
by Lu Wang, et al.

With the advent of off-the-shelf intelligent home products and broader internet adoption, researchers are increasingly exploring smart computing applications that offer easier access to health and wellness resources. AI-based systems such as chatbots have the potential to provide mental health support. However, existing therapy chatbots are often retrieval-based, requiring users to respond with a constrained set of answers, which may be inappropriate because pre-determined inquiries cannot reflect each patient's unique circumstances. Generative approaches, such as the OpenAI GPT models, could allow for more dynamic conversations in therapy chatbot contexts than previous approaches. To investigate the potential of generative models in therapy chatbot contexts, we built a chatbot using the GPT-2 model and fine-tuned it with 306 transcripts of Problem Solving Therapy sessions between therapists and family caregivers of individuals with dementia. We then evaluated both the pre-trained and the fine-tuned model in terms of basic response quality using three meta-information measurements: the proportion of non-word outputs, the length of responses, and sentiment components. Results showed that (1) the fine-tuned model produced more non-word outputs than the pre-trained model; (2) the fine-tuned model generated outputs whose length was closer to the therapists' than the pre-trained model's; and (3) both the pre-trained and the fine-tuned model were more likely than the therapists to generate negative outputs and less likely to generate positive ones. We discuss potential causes of these problems, their implications, and possible solutions for developing therapy chatbots, and we call for further investigation of AI-based system applications.
