
The Role of Normware in Trustworthy and Explainable AI

by   Giovanni Sileno, et al.
University of Amsterdam

Because it is potentially destructive, in practice incomprehensible, and for the most part unintelligible, contemporary technology poses serious challenges to our society; new methods of conception are urgently required. Reorganizing ideas and discussions presented in AI and related fields, this position paper aims to highlight the importance of normware--that is, computational artifacts specifying norms--with respect to these issues, and argues for its irreducibility to software by making explicit its neglected ecological dimension in the decision-making cycle.



