# Differential Equations

## Basic Concepts of Stability Theory

Suppose that a phenomenon is described by the system of n differential equations

$\frac{{d{x_i}}}{{dt}} = {f_i}\left( {t,{x_1},{x_2}, \ldots ,{x_n}} \right),\;\; i = 1,2, \ldots ,n$

with initial conditions

${x_i}\left( {{t_0}} \right) = {x_{i0}},\;\;i = 1,2, \ldots ,n.$

We assume that the functions $${f_i}\left( {t,{x_1},{x_2}, \ldots ,{x_n}} \right)$$ are defined and continuous, together with their partial derivatives, on the set $$\left\{ {t \in \left[ {{t_0}, + \infty } \right),\;\mathbf{X} \in {\mathbb{R}^n}} \right\}.$$ Then, without loss of generality, we may assume that the initial time is zero: $$t_0 = 0.$$

It is convenient to write the system of differential equations in vector form:

$\mathbf{X'} = \mathbf{f}\left( {t,\mathbf{X}} \right),\;\;\text{where}\;\; \mathbf{X} = \left( {{x_1},{x_2}, \ldots ,{x_n}} \right),\;\; \mathbf{f} = \left( {{f_1},{f_2}, \ldots ,{f_n}} \right).$

In real systems, the initial conditions are specified only with some finite precision. This raises an obvious question: how do small changes in the initial conditions affect the behavior of the solutions over long times, in the extreme case as $$t \to \infty?$$

If the trajectory of the system varies little under small perturbations of the initial position, we say that the motion of the system is stable.

A mathematically rigorous definition of stability using $$\varepsilon$$-$$\delta$$ notation was proposed in $$1892$$ by the Russian mathematician A. M. Lyapunov ($$1857-1918$$). Let us consider in more detail the concept of stability introduced by Lyapunov.

## Lyapunov Stability

The solution $$\boldsymbol{\varphi} \left( t \right)$$ of the system of differential equations

$\mathbf{X'} = \mathbf{f}\left( {t,\mathbf{X}} \right)$

with initial conditions

$\mathbf{X}\left( 0 \right) = {\mathbf{X}_0}$

is stable (in the sense of Lyapunov) if for any $$\varepsilon \gt 0$$ there exists $$\delta = \delta \left( \varepsilon \right) \gt 0,$$ such that if

$\left| {\mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right)} \right| \lt \delta ,\;\; \text{then}\;\;\left| {\mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right)} \right| \lt \varepsilon$

for all values $$t \ge 0.$$ Otherwise, the solution $$\boldsymbol{\varphi} \left( t \right)$$ is said to be unstable.

As the norm measuring the distance between two points one can use, for example, the Euclidean norm $$\left\| {{\mathbf{x}_e}} \right\|$$ or the Manhattan (taxicab) norm $$\left\| {{\mathbf{x}_m}} \right\|:$$

$\left\| {{\mathbf{x}_e}} \right\| = \sqrt {\sum\limits_{i = 1}^n {{{\left| {{x_i}} \right|}^2}} } ,\;\;\left\| {{\mathbf{x}_m}} \right\| = \sum\limits_{i = 1}^n {\left| {{x_i}} \right|} .$
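As a minimal sketch, the two norms can be computed directly from their defining sums; the sample vector $$(3, -4)$$ is an assumed illustration, not taken from the text:

```python
import math

# Euclidean and Manhattan norms of a vector, as defined above.
def euclidean_norm(x):
    return math.sqrt(sum(abs(xi) ** 2 for xi in x))

def manhattan_norm(x):
    return sum(abs(xi) for xi in x)

x = (3.0, -4.0)
print(euclidean_norm(x))  # 5.0
print(manhattan_norm(x))  # 7.0
```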

In the case $$n = 2,$$ Lyapunov stability means that any trajectory $${\mathbf{X}\left( t \right)}$$ that starts in the $$\delta \left( \varepsilon \right)$$-neighborhood of the point $${\boldsymbol{\varphi} \left( 0 \right)}$$ remains inside a tube of radius $$\varepsilon$$ around $$\boldsymbol{\varphi} \left( t \right)$$ for all $$t \ge 0$$ (Figure $$1$$).
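The $$\varepsilon$$-$$\delta$$ definition can be checked numerically on an assumed toy system, the harmonic oscillator $$x' = y,$$ $$y' = -x.$$ Its flow is a rotation, so the distance of a trajectory from the zero solution is constant in time and $$\delta = \varepsilon$$ works:

```python
import math

# Sketch: Lyapunov stability of the zero solution of x' = y, y' = -x.
# The exact flow is a rotation of the plane, so any trajectory starting
# inside the delta-neighborhood of the origin stays inside the eps-tube.
def flow(x0, y0, t):
    # exact solution of the harmonic oscillator system
    return (x0 * math.cos(t) + y0 * math.sin(t),
            -x0 * math.sin(t) + y0 * math.cos(t))

eps = 0.1
delta = eps                      # for a rotation, delta = eps suffices
x0, y0 = delta / 2, delta / 2    # start inside the delta-neighborhood
max_dev = max(math.hypot(*flow(x0, y0, t / 10)) for t in range(1000))
print(max_dev < eps)  # True: the trajectory never leaves the eps-tube
```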

## Asymptotic and Exponential Stability

If the solution $$\boldsymbol{\varphi} \left( t \right)$$ of the system of differential equations is not only stable in the sense of Lyapunov, but also satisfies the relationship

$\lim\limits_{t \to \infty } \left| {\mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right)} \right| = 0$

provided that

$\left| {\mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right)} \right| \lt \delta,$

then we say that the solution $$\boldsymbol{\varphi} \left( t \right)$$ is asymptotically stable.

In this case, all solutions that are sufficiently close to $$\boldsymbol{\varphi} \left( 0 \right)$$ at the initial time gradually converge to $$\boldsymbol{\varphi} \left( t \right)$$ as $$t$$ increases. Schematically, this is shown in Figure $$2.$$

If the solution $$\boldsymbol{\varphi} \left( t \right)$$ is asymptotically stable and, in addition, from the condition

$\left| {\mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right)} \right| \lt \delta$

it follows that

$\left| {\mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right)} \right| \le \alpha \left| {\mathbf{X}\left( 0 \right) - \boldsymbol{\varphi} \left( 0 \right)} \right|{e^{ - \beta t}}$

for all $$t \ge 0,$$ we say that the solution $$\boldsymbol{\varphi} \left( t \right)$$ is exponentially stable. In this case, all solutions that are close to $$\boldsymbol{\varphi} \left( 0 \right)$$ at the initial time converge to $$\boldsymbol{\varphi} \left( t \right)$$ at a rate at least as fast as the exponential function with parameters $$\alpha,$$ $$\beta$$ (Figure $$3$$).
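The exponential bound can be verified on an assumed scalar example $$x' = -\beta x$$ (here $$\boldsymbol{\varphi} \equiv 0,$$ $$\alpha = 1$$), integrating with the explicit Euler method and checking the solution against the bound at every step:

```python
import math

# Sketch (assumed toy system): Euler integration of x' = -beta*x, checking
# the exponential-stability bound |x(t)| <= alpha*|x(0)|*exp(-beta*t).
beta, alpha, h = 0.5, 1.0, 0.01
x, t = 0.3, 0.0
x0 = x
ok = True
for _ in range(2000):            # integrate up to t = 20
    x += h * (-beta * x)         # explicit Euler step
    t += h
    if abs(x) > alpha * abs(x0) * math.exp(-beta * t):
        ok = False
print(ok)  # True
```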

The general theory of stability contains, in addition to stability in the sense of Lyapunov, many other concepts and definitions of stable motion. In particular, the concepts of orbital and structural stability are important.

## Orbital Stability

Orbital stability describes the behavior of a closed trajectory (orbit) under the action of small external perturbations.

Consider the autonomous system

$\frac{{d{x_i}}}{{dt}} = {f_i}\left( {{x_1},{x_2}, \ldots ,{x_n}} \right),\;\; {x_i}\left( {{t_0}} \right) = {x_{i0}},\;\;i = 1,2, \ldots ,n,$

that is, a system of equations whose right-hand side does not explicitly contain the independent variable $$t.$$ In vector form, the autonomous system is written as

$\mathbf{X'}\left( t \right) = \mathbf{f}\left( \mathbf{X} \right),\;\;\text{where}\;\; \mathbf{X} = \left( {{x_1},{x_2}, \ldots ,{x_n}} \right),\;\; \mathbf{f} = \left( {{f_1},{f_2}, \ldots ,{f_n}} \right).$

Let $$\boldsymbol{\varphi} \left( t \right)$$ be a periodic solution of the given autonomous system, that is, a solution whose trajectory is a closed curve (orbit).

If for any $$\varepsilon \gt 0$$ there is a constant $$\delta = \delta \left( \varepsilon \right) \gt 0$$ such that the trajectory of any solution $$\mathbf{X}\left( t \right)$$ starting at the $$\delta$$-neighborhood of the trajectory $$\boldsymbol{\varphi} \left( t \right)$$ remains in the $$\varepsilon$$-neighborhood of the trajectory $$\boldsymbol{\varphi} \left( t \right)$$ for all $$t \ge 0,$$ then the trajectory $$\boldsymbol{\varphi} \left( t \right)$$ is called orbitally stable (Figure $$4$$).

By analogy with the asymptotic stability in the sense of Lyapunov, one can also introduce the concept of asymptotic orbital stability. This type of motion occurs, for example, in systems with a limit cycle.
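A standard example of asymptotic orbital stability (an assumed illustration, not from the text) is the system $$r' = r\left( 1 - r^2 \right),$$ $$\theta' = 1$$ in polar coordinates: the unit circle $$r = 1$$ is a limit cycle, and nearby trajectories spiral toward it. A quick Euler integration of the radial equation shows the convergence:

```python
# Sketch: the limit-cycle system r' = r*(1 - r^2), theta' = 1 (assumed
# textbook example). The unit circle r = 1 is an orbitally asymptotically
# stable periodic orbit: radii starting above or below 1 converge to it.
def final_radius(r0, h=0.001, steps=20000):
    r = r0
    for _ in range(steps):       # explicit Euler for the radial equation
        r += h * r * (1.0 - r * r)
    return r

print(round(final_radius(0.5), 4))  # 1.0
print(round(final_radius(1.5), 4))  # 1.0
```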

## Structural Stability

Suppose that we have two autonomous systems with similar properties − in the sense that their phase portraits have the same singular points and geometrically similar trajectories. Such systems can be called structurally stable.

In the strict definition, it is required that the two systems be orbitally topologically equivalent, i.e. there must be a homeomorphism (this intimidating word simply means a one-to-one continuous mapping with a continuous inverse) that maps the family of trajectories of the first system onto the family of trajectories of the second system while preserving the direction of motion. In these terms, structural stability is defined as follows.

Consider an autonomous system, which in the unperturbed and perturbed state is described, respectively, by two equations:

$\mathbf{X'} = \mathbf{f}\left( \mathbf{X} \right),$
$\mathbf{X'} = \mathbf{f}\left( \mathbf{X} \right) + \varepsilon\mathbf{g}\left( \mathbf{X} \right).$

If for any bounded and continuously differentiable vector function $$\mathbf{g}\left( \mathbf{X} \right)$$ there exists a number $${\varepsilon_0} \gt 0$$ such that for all $$\left| \varepsilon \right| \lt {\varepsilon_0}$$ the trajectories of the unperturbed and perturbed systems are orbitally topologically equivalent, then the system is called structurally stable.

## Reduction to the Problem of Stability of the Zero Solution

Let an arbitrary non-autonomous system

$\mathbf{X'} = \mathbf{f}\left( {t,\mathbf{X}} \right)$

be given with the initial condition $$\mathbf{X}\left( 0 \right) = {\mathbf{X}_0}$$ (an initial value problem, or Cauchy problem). Here the vector-valued function $$\mathbf{f}$$ is defined on the set $$\left\{ {t \in \left[ {0, + \infty } \right),\;\mathbf{X} \in {\mathbb{R}^n}} \right\}.$$

Suppose that the system has a solution $$\boldsymbol{\varphi} \left( t \right),$$ whose stability is to be examined. The stability analysis is simplified if we consider the perturbation

$\mathbf{Z}\left( t \right) = \mathbf{X}\left( t \right) - \boldsymbol{\varphi} \left( t \right),$

for which, using $$\mathbf{Z'} = \mathbf{X'} - \boldsymbol{\varphi}' = \mathbf{f}\left( {t,\mathbf{X}} \right) - \mathbf{f}\left( {t,\boldsymbol{\varphi}} \right),$$ we obtain the differential equation

$\mathbf{Z'}\left( t \right) = \mathbf{f}\left( {t,\mathbf{Z} + \boldsymbol{\varphi}\left( t \right)} \right) - \mathbf{f}\left( {t,\boldsymbol{\varphi}\left( t \right)} \right) \equiv \mathbf{g}\left( {t,\mathbf{Z}} \right).$

Obviously, the last equation is satisfied by the trivial solution

$\mathbf{Z}\left( t \right) \equiv \mathbf{0},$

which corresponds to the identity

$\mathbf{X}\left( t \right) \equiv \boldsymbol{\varphi} \left( t \right).$

Thus, the study of stability of the solution $$\boldsymbol{\varphi} \left( t \right)$$ reduces to the study of stability of the zero solution $$\mathbf{Z} = \mathbf{0}$$ of the perturbation equation.
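The reduction can be illustrated on an assumed scalar example $$x' = -x + 1$$ with the particular solution $$\varphi \left( t \right) \equiv 1:$$ the perturbation $$Z = X - \varphi$$ then satisfies $$Z' = -Z,$$ and integrating both equations side by side confirms that $$X - \varphi$$ and $$Z$$ coincide:

```python
# Sketch: reduction to the zero solution for the assumed toy system
# x' = -x + 1 with phi(t) = 1. The perturbation Z(t) = X(t) - phi(t)
# satisfies Z' = -Z, whose zero solution we study instead.
h, steps = 0.01, 1000
X, Z = 1.2, 0.2                   # X(0) and Z(0) = X(0) - phi(0)
for _ in range(steps):
    X += h * (-X + 1.0)           # Euler step for the original system
    Z += h * (-Z)                 # Euler step for the perturbation equation
print(abs((X - 1.0) - Z) < 1e-9)  # True: X - phi and Z coincide
```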

## Stability of Linear Systems

The linear system

$\mathbf{X'} = A\left( t \right)\mathbf{X} + \mathbf{f}\left( t \right)$

is said to be stable if all its solutions are stable in the sense of Lyapunov.

It turns out that the non-homogeneous linear system is stable for any free term $$\mathbf{f}\left( t \right)$$ if and only if the zero solution of the associated homogeneous system

$\mathbf{X'} = A\left( t \right)\mathbf{X}$

is stable. Therefore, when investigating stability in the class of linear systems, it is sufficient to analyze the homogeneous differential systems. In the simplest case, when the coefficient matrix $$A$$ is constant, the stability conditions are formulated in terms of the eigenvalues of the matrix $$A.$$

Consider the homogeneous linear system

$\mathbf{X'} = A\mathbf{X},$

where $$A$$ is a constant matrix of size $$n \times n.$$ Such a system (which is also autonomous) has the zero solution $$\mathbf{X}\left( t \right) = \mathbf{0}.$$ The stability of this solution is determined by the following theorems.

Let $${\lambda _i}$$ be the eigenvalues of $$A.$$

### Theorem $$1$$.

A linear homogeneous system with constant coefficients is stable in the sense of Lyapunov if and only if all eigenvalues $${\lambda _i}$$ of $$A$$ satisfy the condition

$\text{Re}\left[ {{\lambda _i}} \right] \le 0\;\;\left( {i = 1,2, \ldots ,n} \right),$

and every eigenvalue with zero real part has equal algebraic and geometric multiplicities (i.e. all Jordan blocks corresponding to it are of size $$1 \times 1$$).

### Theorem $$2$$.

A linear homogeneous system with constant coefficients is asymptotically stable if and only if all eigenvalues $${\lambda _i}$$ have negative real parts:

$\text{Re}\left[ {{\lambda _i}} \right] \lt 0\;\;\left( {i = 1,2, \ldots ,n} \right).$

### Theorem $$3$$.

A linear homogeneous system with constant coefficients is unstable if at least one of the conditions is satisfied:

• The matrix $$A$$ has an eigenvalue $${\lambda _i}$$ with a positive real part;
• The matrix $$A$$ has an eigenvalue $${\lambda _i}$$ with zero real part, and the geometric multiplicity of the eigenvalue $${\lambda _i}$$ is less than its algebraic multiplicity.

The above theorems allow us to study the stability of linear systems with constant coefficients knowing the eigenvalues and eigenvectors.
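Theorems $$1$$-$$3$$ translate directly into a numerical check. The sketch below (an assumed helper, using `numpy`) computes the eigenvalues of $$A$$ and, in the critical case $$\text{Re}\left[ \lambda \right] = 0,$$ compares algebraic and geometric multiplicities via the rank of $$A - \lambda I$$:

```python
import numpy as np

# Sketch: classify the zero solution of X' = A X from the eigenvalues of A
# (Theorems 1-3). Geometric multiplicity of lambda is n - rank(A - lambda*I).
def classify(A, tol=1e-9):
    A = np.asarray(A, dtype=float)
    eig = np.linalg.eigvals(A)
    if np.all(eig.real < -tol):
        return "asymptotically stable"
    if np.any(eig.real > tol):
        return "unstable"
    n = A.shape[0]
    for lam in eig:                # critical eigenvalues: Re(lambda) = 0
        if abs(lam.real) <= tol:
            alg = np.sum(np.abs(eig - lam) <= 1e-6)
            geo = n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=1e-6)
            if geo < alg:          # defective eigenvalue on the imaginary axis
                return "unstable"
    return "stable (not asymptotically)"

print(classify([[0.0, 1.0], [-1.0, 0.0]]))   # rotation: stable (not asymptotically)
print(classify([[-1.0, 0.0], [0.0, -2.0]]))  # asymptotically stable
print(classify([[0.0, 1.0], [0.0, 0.0]]))    # Jordan block at 0: unstable
```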

However, in many cases the character of stability can be determined by using a stability criterion, without solving the system of equations. One such tool is the Routh-Hurwitz stability criterion, which allows one to judge the stability of a system knowing only the coefficients of the characteristic polynomial of the matrix $$A.$$
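A minimal sketch of the Hurwitz form of the criterion: for a polynomial $$a_0 s^n + a_1 s^{n-1} + \cdots + a_n$$ with $$a_0 \gt 0,$$ all roots lie in the open left half-plane if and only if all leading principal minors of the Hurwitz matrix are positive. The example polynomials are assumed illustrations:

```python
import numpy as np

# Sketch: Routh-Hurwitz criterion via leading principal minors of the
# Hurwitz matrix H, where H[i][j] = a_(2j - i) (1-indexed, a_k = 0 outside
# the range 0..n).
def is_hurwitz_stable(coeffs):
    a = list(coeffs)
    n = len(a) - 1                          # polynomial degree
    get = lambda k: a[k] if 0 <= k <= n else 0.0
    H = np.array([[get(2 * j - i) for j in range(1, n + 1)]
                  for i in range(1, n + 1)], dtype=float)
    return all(np.linalg.det(H[:k, :k]) > 0 for k in range(1, n + 1))

# s^2 + 3s + 2 = (s + 1)(s + 2): all roots negative
print(is_hurwitz_stable([1.0, 3.0, 2.0]))    # True
# s^3 + s^2 + 2s + 8 = (s + 2)(s^2 - s + 4): roots with positive real part
print(is_hurwitz_stable([1.0, 1.0, 2.0, 8.0]))  # False
```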

## Stability in the First Approximation

Consider a nonlinear autonomous system

$\mathbf{X'} = \mathbf{f}\left( \mathbf{X} \right).$

Suppose that the system has the trivial solution $$\mathbf{X} = \mathbf{0},$$ which we will investigate for stability.

Assuming that the functions $${f_i}\left( \mathbf{X} \right)$$ are twice continuously differentiable in a neighborhood of the origin, we can expand the right-hand sides in a Maclaurin series:

$\frac{{d{x_1}}}{{dt}} = \frac{{\partial {f_1}}}{{\partial {x_1}}}\left( 0 \right){x_1} + \frac{{\partial {f_1}}}{{\partial {x_2}}}\left( 0 \right){x_2} + \cdots + \frac{{\partial {f_1}}}{{\partial {x_n}}}\left( 0 \right){x_n} + {R_1}\left( {{x_1},{x_2}, \ldots ,{x_n}} \right),$
$\frac{{d{x_2}}}{{dt}} = \frac{{\partial {f_2}}}{{\partial {x_1}}}\left( 0 \right){x_1} + \frac{{\partial {f_2}}}{{\partial {x_2}}}\left( 0 \right){x_2} + \cdots + \frac{{\partial {f_2}}}{{\partial {x_n}}}\left( 0 \right){x_n} + {R_2}\left( {{x_1},{x_2}, \ldots ,{x_n}} \right),$
$\cdots\cdots\cdots\cdots\cdots\cdots\cdots\cdots\cdots\cdots\cdots\cdots$
$\frac{{d{x_n}}}{{dt}} = \frac{{\partial {f_n}}}{{\partial {x_1}}}\left( 0 \right){x_1} + \frac{{\partial {f_n}}}{{\partial {x_2}}}\left( 0 \right){x_2} + \cdots + \frac{{\partial {f_n}}}{{\partial {x_n}}}\left( 0 \right){x_n} + {R_n}\left( {{x_1},{x_2}, \ldots ,{x_n}} \right).$

Here the terms $${R_i}$$ denote terms of second and higher order of smallness with respect to the coordinate functions $${{x_1},{x_2}, \ldots ,{x_n}}.$$

Returning to vector-matrix form, we obtain:

$\mathbf{X'} = J\mathbf{X} + \mathbf{R}\left( \mathbf{X} \right),$

where the Jacobian $$J$$ is given by

$J = \left[ {\begin{array}{*{20}{c}} {\frac{{\partial {f_1}}}{{\partial {x_1}}}}&{\frac{{\partial {f_1}}}{{\partial {x_2}}}}& \cdots &{\frac{{\partial {f_1}}}{{\partial {x_n}}}}\\ {\frac{{\partial {f_2}}}{{\partial {x_1}}}}&{\frac{{\partial {f_2}}}{{\partial {x_2}}}}& \cdots &{\frac{{\partial {f_2}}}{{\partial {x_n}}}}\\ \vdots & \vdots & \ddots & \vdots \\ {\frac{{\partial {f_n}}}{{\partial {x_1}}}}&{\frac{{\partial {f_n}}}{{\partial {x_2}}}}& \cdots &{\frac{{\partial {f_n}}}{{\partial {x_n}}}} \end{array}} \right].$

The partial derivatives in this matrix are evaluated at the expansion point, i.e. in this case at zero.

In many cases, instead of the original nonlinear autonomous system, we can investigate the stability of the corresponding linearized system, the system of equations of the first approximation. The stability of such a system is determined by the following rules:

• If all eigenvalues of the Jacobian $$J$$ have negative real parts, then the zero solution $$\mathbf{X} = \mathbf{0}$$ of the original and linearized systems is asymptotically stable.
• If at least one eigenvalue of the Jacobian $$J$$ has a positive real part, then the zero solution $$\mathbf{X} = \mathbf{0}$$ of the original and linearized systems is unstable.

In critical cases, when some eigenvalues have zero real part, other methods of stability analysis must be used.
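Both rules can be applied numerically. The sketch below takes an assumed example, the damped pendulum $$x' = y,$$ $$y' = -\sin x - 0.5\,y,$$ estimates the Jacobian at the origin by central finite differences, and checks the eigenvalues of the linearization:

```python
import numpy as np

# Sketch: stability in the first approximation for the assumed example
# x' = y, y' = -sin(x) - 0.5*y. The Jacobian at the origin is estimated
# by central finite differences; its eigenvalues decide stability.
def f(X):
    x, y = X
    return np.array([y, -np.sin(x) - 0.5 * y])

def jacobian(f, X0, h=1e-6):
    n = len(X0)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (f(X0 + e) - f(X0 - e)) / (2 * h)  # central difference
    return J

J = jacobian(f, np.array([0.0, 0.0]))   # approx [[0, 1], [-1, -0.5]]
eig = np.linalg.eigvals(J)
print(np.all(eig.real < 0))  # True: zero solution asymptotically stable
```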

## Lyapunov Functions

One of the powerful tools for the stability analysis of systems of differential equations, including nonlinear systems, is the method of Lyapunov functions. This technique is discussed in detail on the separate page Method of Lyapunov Functions.
