
Provably Correct Training of Neural Network Controllers Using Reachability Analysis

by Xiaowu Sun, et al.

In this paper, we consider the problem of training neural network (NN) controllers for cyber-physical systems (CPS) that are guaranteed to satisfy safety and liveness properties. Our approach combines model-based design methodologies for dynamical systems with data-driven approaches to achieve this target. Given a mathematical model of the dynamical system, we compute a finite-state abstract model that captures the closed-loop behavior under all possible neural network controllers. Using this finite-state abstract model, our framework identifies the subset of NN weights that are guaranteed to satisfy the safety requirements. During training, we augment the learning algorithm with a NN weight projection operator that enforces the resulting NN to be provably safe. To account for the liveness properties, the proposed framework uses the finite-state abstract model to identify candidate NN weights that may satisfy the liveness properties. Using such candidate NN weights, the proposed framework biases the NN training toward the liveness specification. The guarantees above cannot be achieved without correctness guarantees on the NN architecture, which controls the NN's expressiveness. Therefore, a cornerstone of the proposed framework is the ability to select provably correct NN architectures automatically.
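The weight projection idea described above can be illustrated as projected gradient training: after each gradient step, the weights are mapped back into a safe set. In the paper's framework that safe set is derived from the finite-state abstraction via reachability analysis; the sketch below substitutes a simple box constraint `[w_min, w_max]` purely for illustration, so all names and bounds here are assumptions, not the authors' actual construction.

```python
import numpy as np

def project_weights(w, w_min, w_max):
    """Project NN weights onto an assumed safe set.

    Illustrative only: the safe set is modeled as a box [w_min, w_max].
    In the paper's framework, the safe weight set is computed from the
    finite-state abstract model via reachability analysis.
    """
    return np.clip(w, w_min, w_max)

def train_step(w, grad, lr, w_min, w_max):
    """One projected-gradient step: descend, then project back into
    the safe weight set so the controller remains provably safe."""
    w = w - lr * grad
    return project_weights(w, w_min, w_max)

# Toy usage: a large gradient step pushes the weights outside the box,
# and the projection clips them back to the boundary of the safe set.
w = np.array([0.5, -0.2])
grad = np.array([20.0, -20.0])
w_new = train_step(w, grad, lr=0.1, w_min=-1.0, w_max=1.0)
print(w_new)  # -> [-1.  1.]
```

The same pattern generalizes to richer safe sets (e.g., polytopes per abstract state), where the projection becomes a small constrained optimization instead of a clip.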

