Generating Probabilistic Safety Guarantees for Neural Network Controllers

03/01/2021
by Sydney M. Katz, et al.

Neural networks serve as effective controllers in a variety of complex settings due to their ability to represent expressive policies. The complex nature of neural networks, however, makes their output difficult to verify and predict, which limits their use in safety-critical applications. While simulations provide insight into the performance of neural network controllers, they are not enough to guarantee that the controller will perform safely in all scenarios. To address this problem, recent work has focused on formal methods to verify properties of neural network outputs. For neural network controllers, we can use a dynamics model to determine the output properties that must hold for the controller to operate safely. In this work, we develop a method to use the results from neural network verification tools to provide probabilistic safety guarantees on a neural network controller. We develop an adaptive verification approach to efficiently generate an overapproximation of the neural network policy. Next, we modify the traditional formulation of Markov decision process (MDP) model checking to provide guarantees on the overapproximated policy given a stochastic dynamics model. Finally, we incorporate techniques in state abstraction to reduce overapproximation error during the model checking process. We show that our method is able to generate meaningful probabilistic safety guarantees for aircraft collision avoidance neural networks that are loosely inspired by Airborne Collision Avoidance System X (ACAS X), a family of collision avoidance systems that formulates the problem as a partially observable Markov decision process (POMDP).
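The modified model checking step described above can be sketched in a few lines. The following is a minimal illustration only: the five-state chain, its stochastic dynamics, and the per-state action sets are invented for this example and are not the paper's actual MDP, verification output, or abstraction. The key idea it demonstrates is that neural network verification yields, for each state region, a *set* of actions the controller might take, so the probability of ever reaching an unsafe state is bounded by taking the worst case over that set at every state.

```python
# Sketch of model checking an overapproximated policy. All states,
# dynamics, and action sets here are hypothetical, for illustration.
# The bound computed by value iteration is
#   P_unsafe(s) = max_{a in A_over(s)} sum_{s'} T(s' | s, a) * P_unsafe(s')

def check_overapprox_policy(states, unsafe, actions_over, transition,
                            iters=100):
    """Upper bound on P(reach unsafe) under any policy consistent with
    the overapproximated action sets."""
    p = {s: 1.0 if s in unsafe else 0.0 for s in states}
    for _ in range(iters):
        # Rebuild p from the previous iterate; unsafe states stay at 1.
        p = {s: 1.0 if s in unsafe else
             max(sum(prob * p[s2] for s2, prob in transition(s, a))
                 for a in actions_over[s])
             for s in states}
    return p

# Toy 1-D chain: state 0 is unsafe, state 4 is an absorbing safe state,
# and the intended move succeeds with probability 0.9.
def transition(s, a):
    if s == 4:
        return [(4, 1.0)]
    step = -1 if a == "left" else 1
    return [(max(0, min(4, s + step)), 0.9),
            (max(0, min(4, s - step)), 0.1)]

# Suppose verification resolved states 3-4 to "right" but could not
# rule out either action in states 1-2.
actions_over = {s: {"right"} if s >= 3 else {"left", "right"}
                for s in range(5)}
p = check_overapprox_policy(range(5), {0}, actions_over, transition)
```

In the states whose action set the verifier could not resolve, the bound assumes the worst action (here, "left", toward the unsafe state), so tightening the overapproximation, e.g. via the state abstraction refinement mentioned above, directly tightens the safety bound.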


Related research

05/14/2021 · Verification of Image-based Neural Network Controllers Using Generative Models
Neural networks are often used to process information from image-based s...

03/02/2019 · Verifying Aircraft Collision Avoidance Neural Networks Through Linear Approximations of Safe Regions
The next generation of aircraft collision avoidance systems frame the pr...

03/05/2020 · Validation of Image-Based Neural Network Controllers through Adaptive Stress Testing
Neural networks have become state-of-the-art for computer vision problem...

06/21/2019 · Verification and Control of Turn-Based Probabilistic Real-Time Games
Quantitative verification techniques have been developed for the formal ...

12/16/2020 · Generate and Verify: Semantically Meaningful Formal Analysis of Neural Network Perception Systems
Testing remains the primary method to evaluate the accuracy of neural ne...

01/17/2022 · Neural Network Compression of ACAS Xu Early Prototype is Unsafe: Closed-Loop Verification through Quantized State Backreachability
ACAS Xu is an air-to-air collision avoidance system designed for unmanne...

07/17/2019 · ART: Abstraction Refinement-Guided Training for Provably Correct Neural Networks
Artificial Neural Networks (ANNs) have demonstrated remarkable utility i...
