Automated Formal Synthesis of Neural Barrier Certificates for Dynamical Models
We introduce an automated, formal, counterexample-based approach to synthesise Barrier Certificates (BC) for the safety verification of continuous and hybrid dynamical models. The approach is underpinned by an inductive framework, structured as a sequential loop between a learner, which manipulates a candidate BC represented as a neural network, and a sound verifier, which either certifies the candidate's validity through algorithmic proofs or generates counterexamples that further guide the learner. We compare the approach against state-of-the-art techniques on polynomial and non-polynomial dynamical models: the outcomes show that we can synthesise sound BCs up to two orders of magnitude faster, with a particularly stark speedup on the verification engine (up to five orders of magnitude), whilst needing a far smaller data set (up to three orders of magnitude smaller) for the learning part. Beyond the state of the art, we further challenge the (verification side of the) approach on a hybrid dynamical model.
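The following is a minimal, illustrative sketch of the learner-verifier loop described above, in PyTorch. The toy two-dimensional dynamics, the initial/unsafe set definitions, the network architecture, and the sampling-based falsifier standing in for the verifier are all assumptions made here for illustration; the actual approach employs a sound, algorithmic verification engine rather than numerical sampling, and the barrier conditions below are a simplified variant (the decrease condition is imposed over the whole domain rather than only on the zero level set of the candidate).

```python
# Hypothetical sketch of a counterexample-guided loop for training a neural
# barrier certificate. All dynamics, sets, and the numerical "falsifier" are
# placeholder assumptions; the paper's verifier is a sound proof procedure.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Assumed toy dynamics dx/dt = f(x): a stable linear system.
def f(x):
    A = torch.tensor([[-1.0, 1.0], [-1.0, -1.0]])
    return x @ A.T

# Assumed sets: initial set X0, unsafe set Xu, domain X = [-3, 3]^2.
def in_init(x):   return (x - torch.tensor([1.0, 0.0])).norm(dim=1) <= 0.5
def in_unsafe(x): return (x - torch.tensor([-2.0, 0.0])).norm(dim=1) <= 0.5

# Candidate barrier certificate B: a small feed-forward network.
B = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))

def lie_derivative(x):
    # dB/dt = grad B(x) . f(x), obtained via automatic differentiation.
    x = x.clone().requires_grad_(True)
    (grad,) = torch.autograd.grad(B(x).sum(), x, create_graph=True)
    return (grad * f(x)).sum(dim=1)

def loss(x):
    b = B(x).squeeze(-1)
    lie = lie_derivative(x)
    # Barrier conditions: B <= 0 on X0, B > 0 on Xu, dB/dt <= 0 (relaxed here
    # to the whole domain for simplicity of the sketch).
    l_init   = torch.relu(b[in_init(x)]).mean() if in_init(x).any() else 0.0
    l_unsafe = torch.relu(0.1 - b[in_unsafe(x)]).mean() if in_unsafe(x).any() else 0.0
    l_lie    = torch.relu(lie + 0.01).mean()
    return l_init + l_unsafe + l_lie

def falsify(n=20000):
    # Placeholder "verifier": random sampling that searches for a violation.
    # The real approach uses a sound verifier and returns formal counterexamples.
    x = (torch.rand(n, 2) - 0.5) * 6.0
    b = B(x).squeeze(-1)
    lie = lie_derivative(x)
    viol = (in_init(x) & (b > 0)) | (in_unsafe(x) & (b <= 0)) | (lie > 0)
    return x[viol][:100]

# Counterexample-guided loop: train on the current data set, query the
# verifier, and fold any counterexamples back into the data set.
data = (torch.rand(500, 2) - 0.5) * 6.0
opt = torch.optim.Adam(B.parameters(), lr=1e-2)
for it in range(50):
    for _ in range(200):
        opt.zero_grad()
        loss(data).backward()
        opt.step()
    cex = falsify()
    if len(cex) == 0:
        print(f"candidate passes the (numerical) check after {it + 1} iterations")
        break
    data = torch.cat([data, cex])
```

Note the design choice reflected above: the learner only needs a comparatively small data set because the verifier supplies targeted counterexamples, which is consistent with the reported reduction in training data.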