Enabling Verification of Deep Neural Networks in Perception Tasks Using Fuzzy Logic and Concept Embeddings

01/03/2022
by Gesina Schwalbe et al.

One major drawback of deep convolutional neural networks (CNNs) in safety-critical applications is their black-box nature. This makes it hard to verify or monitor complex, symbolic requirements on already trained computer vision CNNs. In this work, we present a simple yet effective approach to verify that a CNN complies with symbolic predicate-logic rules relating visual concepts. It is the first approach that (1) does not modify the CNN, (2) may use visual concepts that are neither an input nor an output feature of the CNN, and (3) can leverage continuous CNN confidence outputs. To achieve this, we newly combine methods from explainable artificial intelligence and logic: first, using supervised concept embedding analysis, the output of a CNN is post-hoc enriched by concept outputs; second, rules from prior knowledge are modelled as truth functions that accept the CNN outputs and can be evaluated with little computational overhead. Here, we investigate the use of fuzzy logic, i.e., continuous truth values, and of proper output calibration, both of which show slight benefits in theory and in practice. Applicability is demonstrated on state-of-the-art object detectors for three verification use cases, where monitoring of rule breaches can reveal detection errors.
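To make the rule-evaluation step concrete, the following minimal sketch (not the authors' implementation; the Lukasiewicz operators, concept names, confidence values, and threshold are illustrative assumptions) shows how a fuzzy-logic rule over post-hoc concept confidences could be evaluated as a lightweight monitor on top of a detector's enriched output:

```python
# Minimal sketch of fuzzy-logic rule monitoring over concept confidences.
# Assumes each detection already carries calibrated concept scores in [0, 1]
# (e.g., produced post hoc via supervised concept embedding analysis).

def luk_and(a: float, b: float) -> float:
    """Lukasiewicz t-norm (fuzzy AND)."""
    return max(0.0, a + b - 1.0)

def luk_implies(a: float, b: float) -> float:
    """Lukasiewicz implication (residuum of the t-norm)."""
    return min(1.0, 1.0 - a + b)

def rule_truth(det: dict) -> float:
    """Fuzzy truth value of the example rule
    'IF pedestrian AND head visible THEN a face should be detected'
    for one detection `det`, a dict of concept confidences."""
    premise = luk_and(det["is_pedestrian"], det["has_head"])
    return luk_implies(premise, det["has_face"])

# Example: a detection whose concept outputs breach the rule.
detection = {"is_pedestrian": 0.95, "has_head": 0.90, "has_face": 0.10}
truth = rule_truth(detection)
if truth < 0.5:  # hypothetical threshold for flagging a potential error
    print(f"Rule breached (truth = {truth:.2f}); flag detection for review")
```

Because the truth function only combines scalar confidences with simple arithmetic, evaluating such rules adds negligible overhead per detection; swapping in a different t-norm (e.g., product or Goedel) only changes the two operator functions.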


Related research

06/12/2021  Entropy-based Logic Explanations of Neural Networks
Explainable artificial intelligence has rapidly emerged since lawmakers ...

08/11/2023  DCNFIS: Deep Convolutional Neuro-Fuzzy Inference System
A key challenge in eXplainable Artificial Intelligence is the well-known...

04/28/2023  Evaluating the Stability of Semantic Concept Representations in CNNs for Robust Explainability
Analysis of how semantic concepts are represented within Convolutional N...

06/06/2023  Scalable Concept Extraction in Industry 4.0
The industry 4.0 is leveraging digital technologies and machine learning...

09/11/2018  Visualizing Convolutional Neural Networks to Improve Decision Support for Skin Lesion Classification
Because of their state-of-the-art performance in computer vision, CNNs a...

08/18/2022  A Scalable, Interpretable, Verifiable Differentiable Logic Gate Convolutional Neural Network Architecture From Truth Tables
We propose 𝒯ruth 𝒯able net (𝒯𝒯net), a novel Convolutional Neural Network...

05/14/2021  Verification of Size Invariance in DNN Activations using Concept Embeddings
The benefits of deep neural networks (DNNs) have become of interest for ...
