Automatic Textual Explanations of Concept Lattices

04/17/2023
by Johannes Hirth, et al.

Lattices and their order diagrams are an essential tool for communicating knowledge and insights about data, in particular when applying Formal Concept Analysis. Such representations, however, are difficult for untrained users to comprehend, and in general when lattices are large. We tackle this problem by automatically generating textual explanations for lattices using standard scales. Our method builds on the general notion of ordinal motifs in lattices, specialized to the case of standard scales. We show the computational complexity of identifying a small number of standard scales that cover most of the lattice structure. For these scales we provide textual explanation templates, which can be applied to any occurrence of a scale in any data domain. The templates are derived using principles from human-computer interaction and allow for a comprehensive textual explanation of lattices. We demonstrate our approach on the spices planner data set, a medium-sized formal context comprising fifty-six meals (objects) and thirty-seven spices (attributes). The resulting 531 formal concepts can be covered by about 100 standard scales.
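The objects-and-attributes setting described above can be sketched concretely. The snippet below uses a made-up toy context (three meals, four spices) as a stand-in for the spices planner data set; the two derivation operators and the brute-force enumeration of formal concepts follow the standard FCA definitions, not the paper's own implementation.

```python
from itertools import chain, combinations

# Toy formal context: objects (meals) mapped to their attributes (spices).
# Illustrative only -- not the paper's spices planner data set.
context = {
    "curry":           {"cumin", "turmeric", "chili"},
    "goulash":         {"paprika", "chili"},
    "chili con carne": {"cumin", "chili", "paprika"},
}
objects = set(context)
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by every object in objs (derivation operator)."""
    return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects possessing every attribute in attrs (derivation operator)."""
    return {g for g in objects if attrs <= context[g]}

def concepts():
    """All formal concepts (extent, intent) by closing every attribute subset."""
    found = set()
    for attrs in chain.from_iterable(
            combinations(sorted(attributes), r) for r in range(len(attributes) + 1)):
        ext = extent(set(attrs))
        found.add((frozenset(ext), frozenset(intent(ext))))
    return found
```

For this toy context the enumeration yields six concepts, including the top concept (all meals share "chili") and the bottom concept (no meal carries every spice). The exponential subset loop is fine for an illustration, but real data sets of the size mentioned in the abstract call for dedicated algorithms such as NextClosure.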

