Learning Nonlinear Loop Invariants with Gated Continuous Logic Networks

03/17/2020
by Jianan Yao, et al.

Verifying real-world programs often requires inferring loop invariants with nonlinear constraints. This is especially true for programs that perform many numerical operations, such as control systems for avionics or industrial plants. Recently, data-driven methods for loop invariant inference have gained popularity, especially for linear loop invariants. However, applying data-driven inference to nonlinear invariants is challenging due to the large number and magnitude of high-order terms, the potential for overfitting on samples, and the large space of possible nonlinear inequality bounds. In this paper, we introduce a new neural architecture for general SMT learning, the Gated Continuous Logic Network (G-CLN), and apply it to nonlinear loop invariant learning. G-CLNs extend the Continuous Logic Network architecture with gating units and dropout, which allow the model to robustly learn general invariants over large numbers of terms. To address the overfitting that arises from finite program sampling, we introduce fractional sampling, a sound relaxation of loop semantics to continuous functions that facilitates unbounded sampling on the real domain. We also design a new CLN activation function, the Piecewise Biased Quadratic Unit (PBQU), for naturally learning tight inequality bounds. We incorporate these methods into a nonlinear loop invariant inference system that can learn general nonlinear loop invariants. We evaluate our system on a benchmark of nonlinear loop invariants and show that it solves 26 out of 27 problems, 3 more than prior work, with an average runtime of 53.3 seconds. We further demonstrate the generic learning ability of G-CLNs by solving all 124 problems in the linear Code2Inv benchmark. Finally, we perform a quantitative stability evaluation and show that G-CLNs have a convergence rate of 97.5% on quadratic problems, a 39.2% improvement over CLN models.
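To make the two mechanisms named in the abstract concrete, the sketch below illustrates (a) a gated product t-norm, in which a learnable gate in [0, 1] can smoothly exclude a clause from a conjunction, and (b) a PBQU-shaped activation that scores a candidate inequality bound. This is a minimal PyTorch illustration only: the gating-toward-identity formulation and the coefficients k_violate and k_loose are assumptions chosen for exposition, not the paper's exact definitions.

```python
import torch

def gated_tnorm(x1, x2, g1, g2):
    """Gated product t-norm (illustrative assumption).

    Gates g1, g2 in [0, 1] interpolate each truth value toward 1,
    the identity of the product t-norm, so a closed gate (g = 0)
    drops that clause from the conjunction while keeping the whole
    expression differentiable for gradient-based training.
    """
    return (g1 * x1 + (1 - g1)) * (g2 * x2 + (1 - g2))

def pbqu(s, k_violate=10.0, k_loose=0.1):
    """PBQU-shaped activation for a candidate bound t(x) >= b (illustrative).

    s = t(x) - b is the slack of the bound on a data point. The output
    peaks at s == 0, drops steeply when the bound is violated (s < 0)
    and gently when it is merely loose (s > 0), so gradient descent is
    pulled toward the tightest bound that all samples satisfy.
    """
    steep = 1.0 - k_violate * s ** 2   # heavy penalty: bound violated
    gentle = 1.0 - k_loose * s ** 2    # mild penalty: bound loose
    return torch.clamp(torch.where(s < 0, steep, gentle), min=0.0)

slack = torch.linspace(-1.0, 1.0, 5)
print(pbqu(slack))  # tensor([0.0000, 0.0000, 1.0000, 0.9750, 0.9000])

x, y = torch.tensor([0.9]), torch.tensor([0.2])
print(gated_tnorm(x, y, g1=1.0, g2=0.0))  # gate on y closed -> tensor([0.9000])
```

Gating toward the t-norm's identity element keeps the learned formula differentiable while letting training prune clauses, which is also what makes dropout-style regularization over large sets of candidate terms fit naturally into the same model.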

Related research:

- Learning Nonlinear Loop Invariants with Gated Continuous Logic Networks (Extended Version) (03/17/2020): Verifying real-world programs often requires inferring loop invariants w...
- CLN2INV: Learning Loop Invariants with Continuous Logic Networks (09/25/2019): Program verification offers a framework for ensuring program correctness...
- Data-Driven Invariant Learning for Probabilistic Programs (06/09/2021): Morgan and McIver's weakest pre-expectation framework is one of the most...
- OASIS: ILP-Guided Synthesis of Loop Invariants (11/26/2019): Finding appropriate inductive loop invariants for a program is a key cha...
- Cuvée: Blending SMT-LIB with Programs and Weakest Preconditions (10/10/2020): Cuvée is a program verification tool that reads SMT-LIB-like input files...
- A Complete Approach to Loop Verification with Invariants and Summaries (10/12/2020): Loop invariants characterize the partial result computed by a loop so fa...
- SPEEDY: An Eclipse-based IDE for invariant inference (04/26/2014): SPEEDY is an Eclipse-based IDE for exploring techniques that assist user...
