On Two XAI Cultures: A Case Study of Non-technical Explanations in Deployed AI System

12/02/2021
by Helen Jiang, et al.

Explainable AI (XAI) research has been booming, but the question "To whom are we making AI explainable?" has yet to gain sufficient attention. Little of XAI is comprehensible to non-AI experts, who are nonetheless the primary audience and major stakeholders of deployed AI systems in practice. The gap is glaring: what counts as "explained" for AI experts versus non-experts differs greatly in practical scenarios. This gap has produced two distinct cultures of expectations, goals, and forms of XAI in real-life AI deployments. We argue that it is critical to develop XAI methods for non-technical audiences. We then present a real-life case study in which AI experts provided non-technical explanations of AI decisions to non-technical stakeholders and completed a successful deployment in a highly regulated industry. Finally, we synthesize lessons learned from the case and offer a list of suggestions for AI experts to consider when explaining AI decisions to non-technical stakeholders.


