LyZNet: A Lightweight Python Tool for Learning and Verifying Neural Lyapunov Functions and Regions of Attraction
Abstract.
In this paper, we describe a lightweight Python framework that provides integrated learning and verification of neural Lyapunov functions for stability analysis. The proposed tool, named LyZNet, learns neural Lyapunov functions using physics-informed neural networks (PINNs) to solve Zubov’s equation and verifies them using satisfiability modulo theories (SMT) solvers. What distinguishes this tool from others in the literature is its ability to provide verified regions of attraction close to the domain of attraction. This is achieved by encoding Zubov’s partial differential equation (PDE) into the PINN approach. By embracing the non-convex nature of the underlying optimization problems, we demonstrate that in cases where convex optimization, such as semidefinite programming, fails to capture the domain of attraction, our neural network framework proves more successful. The tool also offers automatic decomposition of coupled nonlinear systems into a network of low-dimensional subsystems for compositional verification. We illustrate the tool’s usage and effectiveness with several numerical examples, including both non-trivial low-dimensional nonlinear systems and high-dimensional systems.
1. Introduction
Stability analysis of nonlinear dynamical systems has been a focal point of research in control and dynamical systems. In many applications, characterizing the domain of attraction for an asymptotically stable equilibrium point is crucial. For example, in power systems, understanding the domain of attraction is essential for assessing whether the system can recover to a stable equilibrium after experiencing a fault.
Since Lyapunov’s landmark paper more than a hundred years ago (lyapunov1992general), Lyapunov functions have become a cornerstone of nonlinear stability analysis, providing an instrumental tool for performing such analyses and estimating the domain of attraction. Consequently, extensive research has been conducted on Lyapunov functions. One of the key technical challenges is the construction of Lyapunov functions. To address this challenge, both analytical (haddad2008nonlinear; sepulchre2012constructive) and computational methods (giesl2007construction; giesl2015review) have been investigated.
Among computational methods for Lyapunov functions, sums-of-squares (SOS) techniques have garnered widespread attention (papachristodoulou2002construction; papachristodoulou2005tutorial; packard2010help; tan2008stability; topcu2008local; jones2021converse). These methods facilitate not only local stability analysis but also provide estimates of regions of attraction (topcu2008local; tan2008stability; packard2010help). Leveraging semidefinite programming (SDP), one can extend the region of attraction by employing a specific “shape function” within the estimated region. However, selecting such shape functions in a principled manner, beyond standard norm (packard2010help) or quadratic functions (khodadadi2014estimation), remains elusive.
On the other hand, Zubov’s theorem (zubov1964methods) precisely characterizes the domain of attraction through a partial differential equation (PDE). This contrasts with commonly seen Lyapunov conditions, which manifest as partial differential inequalities. The use of an equation, rather than inequalities, enables precise characterization of the domain of attraction. The concept of maximal Lyapunov function (vannelli1985maximal) is closely related to Zubov’s method. The authors of (vannelli1985maximal) have also provided a computational procedure for constructing maximal Lyapunov functions using rational functions.
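For reference, a classical form of Zubov's equation reads as follows (the precise choice of the auxiliary function $\varphi$ varies across formulations, and additional boundedness assumptions are typically required; this is a standard textbook statement, not necessarily the exact variant solved by the tool):

```latex
% Zubov's PDE: given a positive definite \varphi, find v with v(0) = 0 and
% 0 < v(x) < 1 on \mathcal{D} \setminus \{0\} satisfying
\nabla v(x) \cdot f(x) = -\varphi(x)\,\bigl(1 - v(x)\bigr),
% in which case the domain of attraction is recovered exactly as
\mathcal{D} = \{\, x : v(x) < 1 \,\}.
```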
1.1. Related work
The proposed tool solves Zubov’s equation using physics-informed neural networks (PINNs) (lagaris1998artificial; raissi2019physics) and verifies the neural solutions using satisfiability modulo theories (SMT) solvers. The tool currently relies on dReal (gao2013dreal) for verification through interval analysis.
Computation of Lyapunov functions using Zubov’s equation has been previously explored, for example, with radial basis functions (giesl2007construction) and SOS techniques (jones2021converse) (though the authors did not explicitly mention Zubov’s equation). More recently, the authors of (kang2021data) employed a data-driven approach to solve Zubov’s equation using neural networks; however, the computed neural solutions were not verified, and Zubov’s equation was not directly encoded in the loss function. The authors of (liu2023towards) solved Zubov’s equation using neural networks and verified them with dReal, but did not contrast their results with SOS techniques.
Many authors have recently investigated the use of neural networks for computing Lyapunov functions (see, e.g., (chang2019neural; grune2021overcoming; gaby2022lyapunov; abate2020formal; kang2021data), and (dawson2023safe) for a recent survey). In fact, such efforts date back to as early as the 1990s (long1993feedback; prokhorov1994lyapunov). Unlike SDP-based synthesis of SOS Lyapunov functions, neural network Lyapunov functions obtained by training are not guaranteed to be Lyapunov functions. Subsequent verification is required, e.g., using satisfiability modulo theories (SMT) solvers (chang2019neural; ahmed2020automated). The use of SMT solvers for searching and refining Lyapunov functions has been explored previously (kapinski2014simulation). Counterexample-guided search of Lyapunov functions using SMT solvers is investigated in (ahmed2020automated) and the associated tool (abate2021fossil), which supports both Z3 (de2008z3) and dReal (gao2013dreal) as verifiers. Neural Lyapunov functions with SMT verification are explored in (zhou2022neural) for systems with unknown dynamics. SMT verification is often time-consuming, especially when seeking a maximal Lyapunov function (liu2023towards) or dealing with high-dimensional systems. Recent work has also focused on learning neural Lyapunov functions and verifying them through optimization-based techniques, e.g., (chen2021learningHybrid; chen2021learningROA; dai2021lyapunov; dai2020counter). Such techniques usually employ (leaky) ReLU networks and use mixed integer linear/quadratic programming (MILP/MIQP) for verification.
In contrast with existing approaches, the proposed tool offers the ability to compute regions of attraction (ROA) close to the domain of attraction, and it supports compositional verification to handle high-dimensional systems. We demonstrate its usage and effectiveness on both low-dimensional systems with non-trivial dynamics and high-dimensional systems that showcase compositional verification. The tool achieves state-of-the-art results on verified ROA approximations for nonlinear systems and on verification of high-dimensional systems.
2. Problem formulation
Consider a nonlinear system described by
(1) $\dot{x} = f(x)$,
where $f \colon \mathbb{R}^n \to \mathbb{R}^n$ is continuously differentiable. Suppose that the origin is an asymptotically stable equilibrium for the system. We denote the unique solution to (1) from the initial condition $x \in \mathbb{R}^n$ by $\phi(t, x)$ for $t \in [0, T_{\max}(x))$, where $[0, T_{\max}(x))$ is the maximal interval of existence for $\phi(\cdot, x)$.
The domain of attraction of the origin for (1) is defined as
(2) $\mathcal{D} := \left\{ x \in \mathbb{R}^n : \lim_{t \to \infty} \phi(t, x) = 0 \right\}.$
We know that $\mathcal{D}$ is an open and connected set.
We call any forward invariant subset of a region of attraction (ROA). We are interested in computing regions of attraction, as they not only provide a set of initial conditions with guaranteed convergence to the equilibrium point, but also ensure constraints and safety through forward invariance.
Lyapunov functions provide an indispensable tool for stability analysis and ROA estimates. A Lyapunov function for asymptotic stability of the origin is a continuously differentiable function $V \colon \mathcal{U} \to \mathbb{R}$, defined on an open neighbourhood $\mathcal{U}$ of the origin, satisfying $V(0) = 0$, $V(x) > 0$ for all $x \in \mathcal{U} \setminus \{0\}$, and the derivative of $V$ along solutions of (1) (sometimes called its Lie derivative) satisfies
$\dot{V}(x) := \nabla V(x) \cdot f(x) < 0 \quad \text{for all } x \in \mathcal{U} \setminus \{0\}.$
Given any such function $V$ and any positive constant $c$, the sublevel set $\{ x : V(x) \leq c \}$ is a region of attraction, provided that this set is compact and contained in $\mathcal{U}$.
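As a concrete instance of this sublevel-set construction, for a linearization $\dot{x} = Ax$ a quadratic Lyapunov function $V(x) = x^\top P x$ can be obtained by solving the Lyapunov equation $A^\top P + P A = -Q$. A minimal sketch using SciPy (the matrix $A$ below is an illustrative example, not taken from the paper):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative Hurwitz matrix (linearization of a stable system at the origin).
A = np.array([[0.0, -1.0],
              [1.0, -1.0]])
Q = np.eye(2)

# SciPy solves A X + X A^H = Q, so pass A.T and -Q to obtain P satisfying
# the standard Lyapunov equation A^T P + P A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

def V(x):
    # Quadratic Lyapunov candidate V(x) = x^T P x; along x' = A x its
    # Lie derivative is -x^T Q x < 0 for x != 0.
    x = np.asarray(x, dtype=float)
    return float(x @ P @ x)
```

Any bounded sublevel set $\{x : V(x) \leq c\}$ of this $V$ is then a region of attraction for the linear system.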
The proposed tool is designed to accomplish the following task: given an asymptotically stable equilibrium point, it computes neural Lyapunov functions that can certify regions of attraction with provable guarantees.
What sets our tool and approach apart from existing ones in the literature is its ability to verify regions of attraction close to the domain of attraction by incorporating Zubov’s equation into the training of neural Lyapunov functions.
3. Tool overview and usage
3.1. Tool overview
The tool consists of a set of Python modules built around the machine learning framework PyTorch (paszke2019pytorch) and the numerical satisfiability modulo theories (SMT) solver dReal (gao2013dreal). It also relies on the NumPy library (harris2020array) for numerical computation, the SciPy library (virtanen2020scipy) for scientific computing tasks, and the SymPy library (meurer2017sympy) for symbolic calculation.
We describe the main modules as follows:
• dynamical_systems.py defines the DynamicalSystems class and associated methods, such as linearization, computation of a quadratic Lyapunov function by solving the Lyapunov equation, and interfaces for numerical and symbolic evaluations of quantities related to the vector field.
• local_verifier.py provides functions for verifying local stability using quadratic Lyapunov functions, with interval analysis from the SMT solver dReal. The key component uses the mean value theorem to bound the remainder of the linear approximation. This enables exact verification of local asymptotic stability, despite dReal's numerical conservativeness, and sets the tool apart from existing methods that often ignore verification in a neighbourhood of the origin.
• quadratic_verifier.py performs reachability analysis using a quadratic Lyapunov function. The target is the local ROA obtained above. The set guaranteed to reach this target is formulated as a sublevel set of the quadratic Lyapunov function.
• generate_data.py generates data for training neural Lyapunov functions. It involves solving system (1), augmented with an additional state to compute values of the solution to Zubov's equation, which we accomplish using SciPy together with embarrassingly parallel computation via the joblib library (joblib).
• neural_learner.py learns a neural Lyapunov function by solving Zubov's PDE using physics-informed neural networks (PINNs) (lagaris1998artificial; raissi2019physics). We allow customization of the neural network structure as well as selection of different loss function modes.
• neural_verifier.py verifies that the learned neural Lyapunov function indeed satisfies the Lyapunov conditions required to reach a target invariant set within the domain of attraction. The target can be set as the region of attraction already verified by the local analysis or quadratic Lyapunov analysis above. Our current implementation uses dReal as the verifier.
• network_verifier.py includes functions for compositional verification of local stability using quadratic and neural Lyapunov functions. The underlying theory involves vector Lyapunov functions, differential inequalities, and comparison techniques for nonlinear systems.
• sos_verifier.py, plot.py, and utils.py provide support for comparisons with other Lyapunov functions, such as sums-of-squares (SOS) Lyapunov functions, visualization of verified level sets, and other routines used in learning and verification.
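The trajectory-data generation performed by generate_data.py can be sketched roughly as follows. The running cost $\|x\|^2$, the tanh transform, and the constant alpha below are assumed choices for illustration, not necessarily LyZNet's exact construction; the reversed Van der Pol system serves as a sample vector field:

```python
import numpy as np
from scipy.integrate import solve_ivp

def augmented_vdp(t, s, mu=1.0):
    # Reversed Van der Pol dynamics with one extra state that integrates
    # the running cost ||x||^2 along the trajectory (the augmentation of
    # system (1) mentioned in the text).
    x1, x2, _ = s
    return [-x2, x1 - mu * (1.0 - x1**2) * x2, x1**2 + x2**2]

def zubov_value(x0, T=20.0, alpha=0.1):
    # Approximate training target at x0: integrate the running cost over
    # [0, T] and squash it into [0, 1) with tanh (an assumed transform).
    sol = solve_ivp(augmented_vdp, (0.0, T), [x0[0], x0[1], 0.0],
                    rtol=1e-8, atol=1e-10)
    return float(np.tanh(alpha * sol.y[2, -1]))
```

Since each initial condition is independent, mapping zubov_value over a grid of initial conditions is embarrassingly parallel, e.g., via joblib.Parallel.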
3.2. Tool usage
Using the tool is straightforward. As shown in the code snippet below, we can define a dynamical system with SymPy variables and symbolic expressions for the vector field. We can also specify the domain for learning and training, as well as name the system for file management and result storage.
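A sketch of such a system definition is given below. The constructor call is hypothetical (LyZNet's actual class and argument names may differ), and the parameter value and domain are illustrative choices, not the paper's exact settings:

```python
import sympy

# Symbolic state variables and vector field for the reversed Van der Pol
# equation (mu = 1.0 is an illustrative parameter value).
x1, x2 = sympy.symbols("x1 x2")
mu = 1.0
f_vdp = [-x2, x1 - mu * (1.0 - x1**2) * x2]

# Illustrative rectangular domain for learning and verification.
domain = [[-3.0, 3.0], [-3.0, 3.0]]

# Hypothetical constructor call; the actual class/argument names may differ:
# system = DynamicalSystem(f_vdp, [x1, x2], domain,
#                          name="van_der_pol_reversed")
```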
The code above defines the reversed Van der Pol equation, with the parameter value and the rectangular domain specified as part of the system definition.
The following lines of code call the relevant functions for verifying local stability using a quadratic Lyapunov function, as well as for learning and verifying neural Lyapunov functions.
Most functions and parameters are self-explanatory. We remark that the loss_mode argument of neural_learner can currently be set to three different modes: "Zubov", "Data", and "Lyapunov". The latter two, respectively, take a purely data-driven approach to solving Zubov's equation or encode a Lyapunov inequality instead of Zubov's equation.
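As a rough illustration of the "Zubov" mode, a physics-informed residual for a Zubov-type PDE can be written in PyTorch as follows. The network architecture, the assumed PDE form $\nabla W \cdot f = -\mu \|x\|^2 (1 - W)$, and the constant mu are assumptions for this sketch, not necessarily LyZNet's exact formulation:

```python
import torch
import torch.nn as nn

class LyapunovNet(nn.Module):
    # Small tanh network W(x) used as a neural Lyapunov candidate.
    def __init__(self, dim=2, width=30):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x):
        return self.net(x)

def zubov_residual(model, x, f_x, mu=0.1):
    # Physics-informed residual of the assumed Zubov-type PDE
    #   grad W(x) . f(x) + mu * ||x||^2 * (1 - W(x)) = 0.
    # x: (N, dim) collocation points; f_x: (N, dim) vector field values at x.
    x = x.clone().requires_grad_(True)
    W = model(x)
    gradW = torch.autograd.grad(W.sum(), x, create_graph=True)[0]
    lie = (gradW * f_x).sum(dim=1, keepdim=True)
    return lie + mu * (x**2).sum(dim=1, keepdim=True) * (1.0 - W)
```

Training would then minimize the mean squared residual over collocation points, typically combined with data and boundary-condition terms in the overall loss.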
The code above computes and verifies four sublevel sets
(3) $\mathcal{V}_1 = \{ x : V(x) \leq c_1 \}, \quad \mathcal{V}_2 = \{ x : V(x) \leq c_2 \}, \quad \mathcal{W}_1 = \{ x : W(x) \leq c_3 \}, \quad \mathcal{W}_2 = \{ x : W(x) \leq c_4 \},$
where $V$ is the quadratic Lyapunov function and $W$ the neural Lyapunov function, satisfying