
Interaction Design for Explainable AI: Workshop Proceedings

by Prashan Madumal, et al.

As artificial intelligence (AI) systems become increasingly complex and ubiquitous, they will be responsible for making decisions that directly affect individuals and society as a whole. Such decisions will need to be justified, both for ethical reasons and to establish trust, but this has become difficult due to the `black-box' nature of many AI models. Explainable AI (XAI) can potentially address this problem by having the system explain its actions, decisions, and behaviours to users. However, much research in XAI is done in a vacuum, relying only on researchers' intuitions about what constitutes a `good' explanation while ignoring interaction and the human aspect. This workshop invites researchers in the HCI community and related fields to engage in a discourse about human-centred approaches to XAI rooted in interaction, and to shed light on and spark discussion about interaction design challenges in XAI.


Reviewing the Need for Explainable Artificial Intelligence (xAI)

The diffusion of artificial intelligence (AI) applications in organizati...

Mediating Community-AI Interaction through Situated Explanation: The Case of AI-Led Moderation

Artificial intelligence (AI) has become prevalent in our everyday techno...

Explaining decisions made with AI: A workbook (Use case 1: AI-assisted recruitment tool)

Over the last two years, The Alan Turing Institute and the Information C...

Explainable AI for B5G/6G: Technical Aspects, Use Cases, and Research Challenges

When 5G began its commercialisation journey around 2020, the discussion ...

Sell Me the Blackbox! Regulating eXplainable Artificial Intelligence (XAI) May Harm Consumers

Recent AI algorithms are blackbox models whose decisions are difficult t...

Explainability in Mechanism Design: Recent Advances and the Road Ahead

Designing and implementing explainable systems is seen as the next step ...