Basic Concepts of Stability Theory
Suppose that a phenomenon is described by the system of \(n\) differential equations
\[\frac{d{x_i}}{dt} = {f_i}\left( t, {x_1},{x_2}, \ldots ,{x_n} \right),\;\; i = 1,2, \ldots ,n,\]
with initial conditions
\[{x_i}\left( {t_0} \right) = {x_{i0}},\;\; i = 1,2, \ldots ,n.\]
We assume that the functions \({f_i}\left( t, {x_1},{x_2}, \ldots ,{x_n} \right)\) are defined and continuous, together with their partial derivatives, on the set \(\left\{ t \in \left[ {t_0}, +\infty \right),\; \left( {x_1}, \ldots ,{x_n} \right) \in \mathbb{R}^n \right\}.\) Then, without loss of generality, we may assume that the initial time is zero: \({t_0} = 0.\)
It is convenient to write the system of differential equations in vector form:
\[\frac{d\mathbf{X}}{dt} = \mathbf{f}\left( t, \mathbf{X} \right),\;\; \mathbf{X}\left( 0 \right) = {\mathbf{X}_0},\]
where \(\mathbf{X} = \left( {x_1},{x_2}, \ldots ,{x_n} \right)^T\) and \(\mathbf{f} = \left( {f_1},{f_2}, \ldots ,{f_n} \right)^T.\)
In real systems, the initial conditions are known only with some finite precision. This raises an obvious question: how do small changes in the initial conditions affect the behavior of the solutions for large times, in the limit as \(t \to \infty\)?
If the trajectory of the system varies little under small perturbations of the initial position, we say that the motion of the system is stable.
A mathematically rigorous definition of stability using \(\varepsilon\)-\(\delta\) notation was proposed in \(1892\) by the Russian mathematician A. M. Lyapunov (\(1857-1918\)). Let us consider in more detail the concept of stability introduced by Lyapunov.
Lyapunov Stability
The solution \(\boldsymbol{\varphi} \left( t \right)\) of the system of differential equations
\[\frac{d\mathbf{X}}{dt} = \mathbf{f}\left( t, \mathbf{X} \right)\]
with initial condition
\[\mathbf{X}\left( 0 \right) = {\mathbf{X}_0}\]
is stable (in the sense of Lyapunov) if for any \(\varepsilon \gt 0\) there exists \(\delta = \delta \left( \varepsilon \right) \gt 0\) such that the inequality
\[\left\| \mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right) \right\| \lt \delta\]
implies
\[\left\| \mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right) \right\| \lt \varepsilon\]
for all values \(t \ge 0.\) Otherwise, the solution \(\boldsymbol{\varphi} \left( t \right)\) is said to be unstable.
As the norm for measuring the distance between two points one can use, for example, the Euclidean norm \(\left\| \mathbf{x} \right\|_e\) or the Manhattan norm \(\left\| \mathbf{x} \right\|_m:\)
\[\left\| \mathbf{x} \right\|_e = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2},\;\; \left\| \mathbf{x} \right\|_m = \left| {x_1} \right| + \left| {x_2} \right| + \cdots + \left| {x_n} \right|.\]
In the case \(n = 2,\) Lyapunov stability means that any trajectory \(\mathbf{X}\left( t \right)\) that starts in the \(\delta \left( \varepsilon \right)\)-neighborhood of the point \(\boldsymbol{\varphi} \left( 0 \right)\) remains inside a tube of radius \(\varepsilon\) around \(\boldsymbol{\varphi} \left( t \right)\) for all \(t \ge 0\) (Figure \(1\)).
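As a simple illustration (a scalar example chosen here for concreteness), consider the equation
\[\frac{dx}{dt} = -x,\]
whose solutions are \(x\left( t \right) = x\left( 0 \right){e^{-t}}.\) For any two solutions,
\[\left| x\left( t \right) - \varphi \left( t \right) \right| = \left| x\left( 0 \right) - \varphi \left( 0 \right) \right|{e^{-t}} \le \left| x\left( 0 \right) - \varphi \left( 0 \right) \right|,\]
so in the definition one can take \(\delta = \varepsilon,\) and every solution is stable in the sense of Lyapunov.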
Asymptotic and Exponential Stability
If the solution \(\boldsymbol{\varphi} \left( t \right)\) of the system of differential equations is not only stable in the sense of Lyapunov, but also satisfies the relationship
\[\lim\limits_{t \to \infty } \left\| \mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right) \right\| = 0,\]
provided that
\[\left\| \mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right) \right\| \lt \delta,\]
then we say that the solution \(\boldsymbol{\varphi} \left( t \right)\) is asymptotically stable.
In this case, all solutions that are sufficiently close to \(\boldsymbol{\varphi} \left( 0 \right)\) at the initial time gradually converge to \(\boldsymbol{\varphi} \left( t \right)\) as \(t\) increases. Schematically, this is shown in Figure \(2.\)
If the solution \(\boldsymbol{\varphi} \left( t \right)\) is asymptotically stable and, in addition, from the condition
\[\left\| \mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right) \right\| \lt \delta\]
it follows that
\[\left\| \mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right) \right\| \le \alpha \left\| \mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right) \right\| {e^{ - \beta t}}\]
for all \(t \ge 0\) (where \(\alpha\) and \(\beta\) are positive constants), we say that the solution \(\boldsymbol{\varphi} \left( t \right)\) is exponentially stable. In this case, all solutions that are close to \(\boldsymbol{\varphi} \left( 0 \right)\) at the initial time converge to \(\boldsymbol{\varphi} \left( t \right)\) at a rate no slower than that of an exponential function with parameters \(\alpha\) and \(\beta\) (Figure \(3\)).
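Returning to the illustrative scalar equation \(\frac{dx}{dt} = -x\) considered above, the same estimate gives
\[\left| x\left( t \right) - \varphi \left( t \right) \right| = \left| x\left( 0 \right) - \varphi \left( 0 \right) \right|{e^{-t}} \to 0 \;\;\text{as}\;\; t \to \infty,\]
so every solution is asymptotically stable and, in fact, exponentially stable with \(\alpha = 1,\) \(\beta = 1.\) By contrast, for the equation \(\frac{dx}{dt} = 0\) all solutions are constant: they are stable in the sense of Lyapunov (take \(\delta = \varepsilon\)), but not asymptotically stable, since the distance between two different solutions does not decrease.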
The general theory of stability contains, in addition to stability in the sense of Lyapunov, many other concepts and definitions of stable motion. In particular, the concepts of orbital and structural stability are important.
Orbital Stability
Orbital stability describes the behavior of a closed trajectory (orbit) under the action of small external perturbations.
Consider the autonomous system
\[\frac{d{x_i}}{dt} = {f_i}\left( {x_1},{x_2}, \ldots ,{x_n} \right),\;\; i = 1,2, \ldots ,n,\]
that is, a system of equations whose right-hand sides do not contain the independent variable \(t.\) In vector form, the autonomous system is written as
\[\frac{d\mathbf{X}}{dt} = \mathbf{f}\left( \mathbf{X} \right).\]
Let \(\boldsymbol{\varphi} \left( t \right)\) be a periodic solution of the given autonomous system, that is, a solution whose trajectory is a closed curve (orbit).
If for any \(\varepsilon \gt 0\) there is a constant \(\delta = \delta \left( \varepsilon \right) \gt 0\) such that the trajectory of any solution \(\mathbf{X}\left( t \right)\) starting in the \(\delta\)-neighborhood of the trajectory \(\boldsymbol{\varphi} \left( t \right)\) remains in the \(\varepsilon\)-neighborhood of the trajectory \(\boldsymbol{\varphi} \left( t \right)\) for all \(t \ge 0,\) then the trajectory \(\boldsymbol{\varphi} \left( t \right)\) is called orbitally stable (Figure \(4\)).
By analogy with the asymptotic stability in the sense of Lyapunov, one can also introduce the concept of asymptotic orbital stability. This type of motion occurs, for example, in systems with a limit cycle.
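A standard illustration (written here in polar coordinates for concreteness) is the system
\[\frac{dr}{dt} = r\left( 1 - {r^2} \right),\;\; \frac{d\theta}{dt} = 1,\]
which has the limit cycle \(r = 1.\) Every trajectory with \(r\left( 0 \right) \gt 0\) approaches the circle \(r = 1\) as \(t \to \infty,\) so the periodic solution is asymptotically orbitally stable; note that this only requires the distance from \(\mathbf{X}\left( t \right)\) to the orbit (as a curve) to tend to zero, not the difference \(\mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right)\) itself.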
Structural Stability
Roughly speaking, an autonomous system is called structurally stable if any sufficiently small perturbation of it produces a system with similar properties, in the sense that the two phase portraits have the same singular points and geometrically similar trajectories.
In the strict definition, it is required that the two systems are orbitally topologically equivalent, i.e. there must exist a homeomorphism (this terrible word simply means a one-to-one continuous mapping with a continuous inverse) that maps the family of trajectories of the first system onto the family of trajectories of the second system while preserving the direction of motion. In these terms, structural stability is defined as follows.
Consider an autonomous system which, in the unperturbed and perturbed states, is described, respectively, by the two equations
\[\frac{d\mathbf{X}}{dt} = \mathbf{f}\left( \mathbf{X} \right) \;\;\text{and}\;\; \frac{d\mathbf{X}}{dt} = \mathbf{f}\left( \mathbf{X} \right) + \varepsilon\,\mathbf{g}\left( \mathbf{X} \right).\]
If for any bounded and continuously differentiable vector function \(\mathbf{g}\left( \mathbf{X} \right)\) there exists a number \({\varepsilon_0} \gt 0\) such that for all \(\left| \varepsilon \right| \lt {\varepsilon_0}\) the trajectories of the unperturbed and perturbed systems are orbitally topologically equivalent, then the system is called structurally stable.
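As an illustration (a standard example, written out here with specific coefficients), consider the linear center
\[\frac{dx}{dt} = -y,\;\; \frac{dy}{dt} = x,\]
whose trajectories are concentric circles, and the perturbed system
\[\frac{dx}{dt} = -y + \varepsilon x,\;\; \frac{dy}{dt} = x + \varepsilon y.\]
For any \(\varepsilon \ne 0\) the eigenvalues are \(\varepsilon \pm i,\) so the closed orbits turn into spirals; the two phase portraits are not topologically equivalent, and hence a center is not structurally stable. By contrast, a linear system all of whose eigenvalues have nonzero real parts keeps its qualitative phase portrait under sufficiently small perturbations of the coefficients.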
Reduction to the Problem of Stability of the Zero Solution
Let an arbitrary non-autonomous system
\[\frac{d\mathbf{X}}{dt} = \mathbf{f}\left( t, \mathbf{X} \right)\]
be given with the initial condition \(\mathbf{X}\left( 0 \right) = {\mathbf{X}_0}\) (an initial value problem, or Cauchy problem). Here the vector-valued function \(\mathbf{f}\) is defined on the set \(\left\{ t \in \left[ 0, +\infty \right),\; \mathbf{X} \in \mathbb{R}^n \right\}.\)
Suppose that the system has a solution \(\boldsymbol{\varphi} \left( t \right),\) the stability of which is to be examined. The stability analysis is simplified if we consider the perturbation
\[\mathbf{Z}\left( t \right) = \mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right),\]
for which we obtain the differential equation
\[\frac{d\mathbf{Z}}{dt} = \frac{d\mathbf{X}}{dt} - \frac{d\boldsymbol{\varphi}}{dt} = \mathbf{f}\left( t, \mathbf{Z} + \boldsymbol{\varphi}\left( t \right) \right) - \mathbf{f}\left( t, \boldsymbol{\varphi}\left( t \right) \right).\]
Obviously, the last equation is satisfied by the trivial solution
\[\mathbf{Z}\left( t \right) \equiv \mathbf{0},\]
which corresponds to the identity
\[\mathbf{X}\left( t \right) \equiv \boldsymbol{\varphi} \left( t \right).\]
Thus, the study of the stability of the solution \(\boldsymbol{\varphi} \left( t \right)\) is reduced to the study of the stability of the trivial (zero) solution \(\mathbf{Z} = \mathbf{0}\) of the transformed equation.
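For example (an equation chosen here only to illustrate the substitution), consider the scalar equation
\[\frac{dx}{dt} = {x^2} - 1\]
and its equilibrium solution \(\varphi \left( t \right) \equiv -1.\) Setting \(z = x - \varphi = x + 1,\) we obtain
\[\frac{dz}{dt} = {\left( z - 1 \right)^2} - 1 = {z^2} - 2z,\]
so the question of the stability of the solution \(x \equiv -1\) becomes the question of the stability of the zero solution \(z = 0\) of the new equation.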
Stability of Linear Systems
The linear system
\[\frac{d\mathbf{X}}{dt} = A\left( t \right)\mathbf{X} + \mathbf{f}\left( t \right)\]
is said to be stable if all its solutions are stable in the sense of Lyapunov.
It turns out that the non-homogeneous linear system is stable for any free term \(\mathbf{f}\left( t \right)\) if the zero solution of the associated homogeneous system
\[\frac{d\mathbf{X}}{dt} = A\left( t \right)\mathbf{X}\]
is stable. Indeed, the difference of any two solutions of the non-homogeneous system is a solution of the homogeneous system, so the distance between solutions is governed entirely by the homogeneous part. Therefore, when investigating stability in the class of linear systems, it is sufficient to analyze homogeneous systems. In the simplest case, when the coefficient matrix \(A\) is constant, the stability conditions are formulated in terms of the eigenvalues of the matrix \(A.\)
Consider the homogeneous linear system with constant coefficients
\[\frac{d\mathbf{X}}{dt} = A\mathbf{X},\]
where \(A\) is a constant matrix of size \(n \times n.\) Such a system (which is also autonomous) has the zero solution \(\mathbf{X}\left( t \right) \equiv \mathbf{0}.\) The stability of this solution is determined by the following theorems.
Let \({\lambda _i}\) be the eigenvalues of \(A.\)
Theorem \(1\).
A linear homogeneous system with constant coefficients is stable in the sense of Lyapunov if and only if all eigenvalues \({\lambda _i}\) of \(A\) satisfy the condition
\[\text{Re}\,{\lambda _i} \le 0,\;\; i = 1,2, \ldots ,n.\]
If the real part of an eigenvalue is equal to zero, the algebraic and geometric multiplicities of that eigenvalue must coincide (i.e. all the corresponding Jordan blocks must be of size \(1 \times 1\)).
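To see why the multiplicity condition matters, compare two illustrative matrices with the single eigenvalue \(\lambda = 0:\)
\[{A_1} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},\;\; {A_2} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.\]
For \({A_1}\) every solution is constant, so the zero solution is stable. For \({A_2}\) the system \(\frac{d{x_1}}{dt} = {x_2},\) \(\frac{d{x_2}}{dt} = 0\) has solutions \({x_1}\left( t \right) = {x_1}\left( 0 \right) + {x_2}\left( 0 \right)t,\) \({x_2}\left( t \right) = {x_2}\left( 0 \right),\) which grow linearly for arbitrarily small \({x_2}\left( 0 \right) \ne 0.\) Here the eigenvalue \(0\) has algebraic multiplicity \(2\) but geometric multiplicity \(1,\) and the zero solution is unstable.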
Theorem \(2\).
A linear homogeneous system with constant coefficients is asymptotically stable if and only if all eigenvalues \({\lambda _i}\) of \(A\) have negative real parts:
\[\text{Re}\,{\lambda _i} \lt 0,\;\; i = 1,2, \ldots ,n.\]
Theorem \(3\).
A linear homogeneous system with constant coefficients is unstable if at least one of the following conditions is satisfied:
- The matrix \(A\) has an eigenvalue \({\lambda _i}\) with a positive real part;
- The matrix \(A\) has an eigenvalue \({\lambda _i}\) with zero real part, and the geometric multiplicity of the eigenvalue \({\lambda _i}\) is less than its algebraic multiplicity.
The above theorems allow us to study the stability of linear systems with constant coefficients knowing the eigenvalues and eigenvectors.
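For instance (a matrix chosen here purely for illustration), consider the system \(\frac{d\mathbf{X}}{dt} = A\mathbf{X}\) with
\[A = \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix}.\]
Its characteristic equation is
\[\det\left( A - \lambda I \right) = {\lambda^2} + 3\lambda + 2 = 0,\]
with roots \({\lambda_1} = -1\) and \({\lambda_2} = -2.\) Both real parts are negative, so by Theorem \(2\) the zero solution is asymptotically stable.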
However, in many cases, the character of stability can be determined by using a stability criterion without solving the system of equations. One such criterion is the Routh-Hurwitz stability criterion, which allows one to judge the stability of a system knowing only the coefficients of the characteristic equation of the matrix \(A.\)
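As an illustration of how such a criterion works, for a third-order characteristic equation
\[{\lambda^3} + {a_1}{\lambda^2} + {a_2}\lambda + {a_3} = 0\]
the Routh-Hurwitz conditions for all roots to have negative real parts reduce to
\[{a_1} \gt 0,\;\; {a_3} \gt 0,\;\; {a_1}{a_2} \gt {a_3},\]
so asymptotic stability can be checked directly from the coefficients, without computing the eigenvalues themselves.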
Stability in the First Approximation
Consider a nonlinear autonomous system
\[\frac{d\mathbf{X}}{dt} = \mathbf{f}\left( \mathbf{X} \right).\]
Suppose that the system has the trivial solution \(\mathbf{X} = \mathbf{0},\) which we will investigate for stability.
Assuming that the functions \({f_i}\left( \mathbf{X} \right)\) are twice continuously differentiable in a neighborhood of the origin, we can expand the right-hand sides in a Maclaurin series:
\[{f_i}\left( \mathbf{X} \right) = \sum\limits_{j = 1}^n \frac{\partial {f_i}}{\partial {x_j}}\left( \mathbf{0} \right){x_j} + {R_i}\left( \mathbf{X} \right),\;\; i = 1,2, \ldots ,n,\]
where the terms \({R_i}\) contain terms of second and higher order of smallness with respect to the coordinates \({x_1},{x_2}, \ldots ,{x_n}.\)
Returning to vector-matrix form, we obtain
\[\frac{d\mathbf{X}}{dt} = J\mathbf{X} + \mathbf{R}\left( \mathbf{X} \right),\]
where the Jacobian matrix \(J\) is given by
\[J = \begin{pmatrix} \frac{\partial {f_1}}{\partial {x_1}} & \frac{\partial {f_1}}{\partial {x_2}} & \cdots & \frac{\partial {f_1}}{\partial {x_n}} \\ \frac{\partial {f_2}}{\partial {x_1}} & \frac{\partial {f_2}}{\partial {x_2}} & \cdots & \frac{\partial {f_2}}{\partial {x_n}} \\ \cdots & \cdots & \cdots & \cdots \\ \frac{\partial {f_n}}{\partial {x_1}} & \frac{\partial {f_n}}{\partial {x_2}} & \cdots & \frac{\partial {f_n}}{\partial {x_n}} \end{pmatrix}.\]
The partial derivatives in this matrix are evaluated at the point of the series expansion, i.e. in this case at the origin.
In many cases, instead of the original nonlinear autonomous system, we can consider and investigate for stability the corresponding linearized system, also called the system of equations of the first approximation:
\[\frac{d\mathbf{X}}{dt} = J\mathbf{X}.\]
The stability of the zero solution is then determined by the following rules:
- If all eigenvalues of the Jacobian \(J\) have negative real parts, then the zero solution \(\mathbf{X} = \mathbf{0}\) of the original and linearized systems is asymptotically stable.
- If at least one eigenvalue of the Jacobian \(J\) has a positive real part, then the zero solution \(\mathbf{X} = \mathbf{0}\) of the original and linearized systems is unstable.
In critical cases, when some eigenvalues have zero real part and none has a positive real part, one should use other methods of stability analysis. Problems on stability in the first approximation are given here.
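As a short worked example (the right-hand sides are chosen here purely for illustration), consider the system
\[\frac{dx}{dt} = -2x + y + xy,\;\; \frac{dy}{dt} = x - 2y + {y^2}.\]
At the origin the Jacobian is
\[J = \begin{pmatrix} -2 & 1 \\ 1 & -2 \end{pmatrix},\]
with eigenvalues \({\lambda_1} = -1\) and \({\lambda_2} = -3.\) Both are negative, so by the first rule the zero solution of the nonlinear system is asymptotically stable.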
Lyapunov Functions
One of the most powerful tools for the stability analysis of systems of differential equations, including nonlinear systems, is the method of Lyapunov functions. This technique is discussed in detail on the separate web page Method of Lyapunov Functions.