The Role of Normware in Trustworthy and Explainable AI

12/06/2018
by Giovanni Sileno, et al.

Because it is potentially destructive, in practice incomprehensible, and for the most part unintelligible, contemporary technology poses serious challenges to our society. New methods of conception are urgently required. Reorganizing ideas and discussions presented in AI and related fields, this position paper highlights the importance of normware--that is, computational artifacts specifying norms--with respect to these issues, and argues that normware is irreducible to software by making explicit its neglected ecological dimension in the decision-making cycle.


