Method of Eigenvalues and Eigenvectors

The Concept of Eigenvalues and Eigenvectors

Consider a linear homogeneous system of n differential equations with constant coefficients, which can be written in matrix form as

\[\mathbf{X'}\left( t \right) = A\mathbf{X}\left( t \right),\]

where the following notation is used:

\[\mathbf{X}\left( t \right) = \left[ {\begin{array}{*{20}{c}} {{x_1}\left( t \right)}\\ {{x_2}\left( t \right)}\\ \vdots \\ {{x_n}\left( t \right)} \end{array}} \right],\;\; \mathbf{X'}\left( t \right) = \left[ {\begin{array}{*{20}{c}} {{x'_1}\left( t \right)}\\ {{x'_2}\left( t \right)}\\ \vdots \\ {{x'_n}\left( t \right)} \end{array}} \right],\;\; A = \left[ {\begin{array}{*{20}{c}} {{a_{11}}}&{{a_{12}}}& \cdots &{{a_{1n}}}\\ {{a_{21}}}&{{a_{22}}}& \cdots &{{a_{2n}}}\\ \cdots & \cdots & \cdots & \cdots \\ {{a_{n1}}}&{{a_{n2}}}& \cdots &{{a_{nn}}} \end{array}} \right].\]

We look for non-trivial solutions of the homogeneous system in the form of

\[\mathbf{X}\left( t \right) = {e^{\lambda t}}\mathbf{V},\]

where \(\mathbf{V} \ne 0\) is a constant \(n\)-dimensional vector, which will be defined later.

Substituting the above expression for \(\mathbf{X}\left( t \right)\) into the system of equations, we obtain:

\[\lambda {e^{\lambda t}}\mathbf{V} = A{e^{\lambda t}}\mathbf{V},\;\; \Rightarrow A\mathbf{V} = \lambda \mathbf{V}.\]

This equation means that under the action of the linear operator \(A\) the vector \(\mathbf{V}\) is mapped to the collinear vector \(\lambda \mathbf{V}.\) Any nonzero vector with this property is called an eigenvector of the linear transformation \(A,\) and the number \(\lambda\) is called an eigenvalue.

Thus, we conclude that in order for the vector function \(\mathbf{X}\left( t \right) = {e^{\lambda t}}\mathbf{V}\) to be a solution of the homogeneous linear system, it is necessary and sufficient that the number \(\lambda\) be an eigenvalue of the matrix \(A\) and the vector \(\mathbf{V}\) be a corresponding eigenvector of this matrix.

As can be seen, the solution of a linear system of equations can be constructed by an algebraic method. Therefore, we first provide some necessary background from linear algebra.
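Before turning to the algebra, here is a minimal numerical sketch of this criterion in Python (the matrix \(A\) below is a hypothetical example, and NumPy is assumed to be available): for each eigenpair \(\left( \lambda, \mathbf{V} \right)\) of \(A,\) the function \(e^{\lambda t}\mathbf{V}\) satisfies \(\mathbf{X}' = A\mathbf{X}.\)

```python
import numpy as np

# A hypothetical 2x2 coefficient matrix chosen only for illustration.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Eigenvalues and eigenvectors of A (the columns of V are eigenvectors).
lam, V = np.linalg.eig(A)

# Check that X(t) = e^{lambda t} V solves X' = AX at a sample time t:
# the derivative of e^{lambda t} V is lambda e^{lambda t} V, which must
# equal A e^{lambda t} V.
t = 0.7
for i in range(len(lam)):
    x = np.exp(lam[i] * t) * V[:, i]             # candidate solution X(t)
    dx = lam[i] * np.exp(lam[i] * t) * V[:, i]   # its derivative X'(t)
    assert np.allclose(dx, A @ x)                # X'(t) = A X(t)
print("X(t) = e^{lambda t} V solves X' = AX for each eigenpair")
```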

Finding Eigenvalues and Eigenvectors of a Linear Transformation

Let's go back to the matrix-vector equation obtained above:

\[A\mathbf{V} = \lambda \mathbf{V}.\]

It can be rewritten as

\[A\mathbf{V} - \lambda \mathbf{V} = \mathbf{0},\]

where \(\mathbf{0}\) is the zero vector.

Recall that the product of the identity matrix \(I\) of order \(n\) and \(n\)-dimensional vector \(\mathbf{V}\) is equal to the vector itself:

\[I\mathbf{V} = \mathbf{V}.\]

Therefore, our equation becomes

\[A\mathbf{V} - \lambda I\mathbf{V} = \mathbf{0}\;\;\; \text{or}\;\;\;\left( {A - \lambda I} \right)\mathbf{V} = \mathbf{0}.\]

Since the vector \(\mathbf{V}\) must be nonzero, it follows from this relationship that the determinant of \({A - \lambda I}\) is zero:

\[\det \left( {A - \lambda I} \right) = 0.\]

Indeed, if we assume that \(\det \left( {A - \lambda I} \right) \ne 0,\) then the matrix \(A - \lambda I\) has an inverse \({\left( {A - \lambda I} \right)^{ - 1}}.\) Multiplying both sides of the equation on the left by this inverse matrix, we get:

\[{\left( {A - \lambda I} \right)^{ - 1}}\left( {A - \lambda I} \right)\mathbf{V} = {\left( {A - \lambda I} \right)^{ - 1}} \cdot \mathbf{0},\;\; \Rightarrow I\mathbf{V} = \mathbf{0},\;\; \Rightarrow \mathbf{V} = \mathbf{0}.\]

This, however, contradicts the definition of an eigenvector, which must be nonzero. Consequently, the eigenvalues \(\lambda\) must satisfy the equation

\[\det \left( {A - \lambda I} \right) = 0,\]

which is called the auxiliary or characteristic equation of the linear transformation \(A.\) The polynomial on the left side of the equation is called the characteristic polynomial of the linear transformation (or linear operator) \(A.\) The set of all eigenvalues \({\lambda _1},{\lambda _2}, \ldots ,{\lambda _n}\) forms the spectrum of the operator \(A.\)

So the first step in finding the solution of a system of linear differential equations is solving the auxiliary equation and finding all eigenvalues \({\lambda _1},{\lambda _2}, \ldots ,{\lambda _n}.\)
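As a quick numerical illustration of this first step (a sketch only; the \(3 \times 3\) matrix below is a hypothetical example, and NumPy is assumed), the roots of the characteristic polynomial \(\det \left( {A - \lambda I} \right) = 0\) coincide with the eigenvalues computed directly:

```python
import numpy as np

# Hypothetical 3x3 matrix used only to illustrate the steps.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Coefficients of the characteristic polynomial, highest degree first.
coeffs = np.poly(A)

# The eigenvalues are the roots of the characteristic polynomial ...
roots = np.sort_complex(np.roots(coeffs))

# ... and agree with the eigenvalues computed directly from A.
direct = np.sort_complex(np.linalg.eigvals(A).astype(complex))
assert np.allclose(roots, direct)
print("characteristic roots:", roots)
```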

Next, substituting each eigenvalue \({\lambda _i}\) in the system of equations

\[\left( {A - \lambda I} \right)\mathbf{V} = \mathbf{0}\]

and solving it, we find the eigenvectors corresponding to the given eigenvalue \({\lambda _i}.\) Note that after the substitution of an eigenvalue the system becomes singular, i.e. some of its equations are linearly dependent. This follows from the fact that the determinant of the system is zero. As a result, the system of equations has infinitely many solutions, i.e. the eigenvectors are determined only up to a constant factor.
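In numerical terms, the eigenvectors for \({\lambda _i}\) span the null space of \(A - {\lambda _i} I.\) A minimal sketch, assuming SciPy is available and using a hypothetical \(2 \times 2\) matrix:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical matrix for illustration (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
I = np.eye(2)

for lam in np.linalg.eigvals(A):
    # The eigenvectors for lam span the null space of (A - lam I);
    # null_space returns an orthonormal basis, which fixes the
    # arbitrary constant factor mentioned above.
    basis = null_space(A - lam * I)
    v = basis[:, 0]
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:.1f}, eigenvector = {v}")
```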

Fundamental System of Solutions of a Linear Homogeneous System

Expanding the determinant of the characteristic equation of the \(n\)th order, we have, in general, the following equation:

\[{\left( { - 1} \right)^n}{\left( {\lambda - {\lambda _1}} \right)^{{k_1}}}{\left( {\lambda - {\lambda _2}} \right)^{{k_2}}} \cdots {\left( {\lambda - {\lambda _m}} \right)^{{k_m}}} = 0,\]

where

\[{k_1} + {k_2} + \cdots + {k_m} = n.\]

Here the number \({k_i}\) is called the algebraic multiplicity of the eigenvalue \({\lambda_i}.\) For each such eigenvalue, there exist \({s_i}\) linearly independent eigenvectors. The number \({s_i}\) is called the geometric multiplicity of the eigenvalue \({\lambda_i}.\)

It is proved in linear algebra that the geometric multiplicity \({s_i}\) does not exceed the algebraic multiplicity \({k_i},\) i.e. the following relation holds:

\[0 \lt {s_i} \le {k_i}.\]
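The distinction is easy to see numerically. In the sketch below (hypothetical matrices; NumPy and SciPy assumed), \(\lambda = 1\) has algebraic multiplicity \(k = 2\) in both matrices, but the geometric multiplicity, i.e. the dimension of the null space of \(A - \lambda I,\) differs:

```python
import numpy as np
from scipy.linalg import null_space

I = np.eye(2)

# lambda = 1 has algebraic multiplicity k = 2 in both matrices below.
diagonalizable = np.array([[1.0, 0.0],
                           [0.0, 1.0]])   # s = k = 2
defective      = np.array([[1.0, 1.0],
                           [0.0, 1.0]])   # s = 1 < k = 2

for A in (diagonalizable, defective):
    s = null_space(A - 1.0 * I).shape[1]  # geometric multiplicity
    print("geometric multiplicity:", s)
```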

It turns out that the general solution of the homogeneous system essentially depends on the multiplicity of the eigenvalues. Consider the possible cases that arise here.

\(1.\) Case \({s_i} = {k_i} = 1.\) All Roots of the Auxiliary Equation are Real and Distinct.

In this simplest case, each eigenvalue \({\lambda _i}\) has one associated eigenvector \({\mathbf{V}_i}.\) These vectors form a set of linearly independent solutions

\[{\mathbf{X}_1} = {e^{{\lambda _1}t}}{\mathbf{V}_1},\;\; {\mathbf{X}_2} = {e^{{\lambda _2}t}}{\mathbf{V}_2}, \ldots ,\; {\mathbf{X}_n} = {e^{{\lambda _n}t}}{\mathbf{V}_n},\]

that is, a fundamental system of solutions of the homogeneous system.

By the linear independence of the eigenvectors the corresponding Wronskian is different from zero:

\[{W_{\left[ {{\mathbf{X}_1},{\mathbf{X}_2}, \ldots ,{\mathbf{X}_n}} \right]}}\left( t \right) = \left| {\begin{array}{*{20}{c}} {{x_{11}}\left( t \right)}&{{x_{12}}\left( t \right)}& \cdots &{{x_{1n}}\left( t \right)}\\ {{x_{21}}\left( t \right)}&{{x_{22}}\left( t \right)}& \cdots &{{x_{2n}}\left( t \right)}\\ \cdots & \cdots & \cdots & \cdots \\ {{x_{n1}}\left( t \right)}&{{x_{n2}}\left( t \right)}& \cdots &{{x_{nn}}\left( t \right)} \end{array}} \right| = \left| {\begin{array}{*{20}{c}} {{e^{{\lambda _1}t}}{V_{11}}}&{{e^{{\lambda _2}t}}{V_{12}}}& \cdots &{{e^{{\lambda _n}t}}{V_{1n}}}\\ {{e^{{\lambda _1}t}}{V_{21}}}&{{e^{{\lambda _2}t}}{V_{22}}}& \cdots &{{e^{{\lambda _n}t}}{V_{2n}}}\\ \cdots & \cdots & \cdots & \cdots \\ {{e^{{\lambda _1}t}}{V_{n1}}}&{{e^{{\lambda _2}t}}{V_{n2}}}& \cdots &{{e^{{\lambda _n}t}}{V_{nn}}} \end{array}} \right| = {e^{\left( {{\lambda _1} + {\lambda _2} + \cdots + {\lambda _n}} \right)t}} \left| {\begin{array}{*{20}{c}} {{V_{11}}}&{{V_{12}}}& \cdots &{{V_{1n}}}\\ {{V_{21}}}&{{V_{22}}}& \cdots &{{V_{2n}}}\\ \cdots & \cdots & \cdots & \cdots \\ {{V_{n1}}}&{{V_{n2}}}& \cdots &{{V_{nn}}} \end{array}} \right| \ne 0.\]
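The factorization in the last step is easy to check numerically (a sketch with a hypothetical matrix; NumPy assumed): scaling the \(j\)th column of the eigenvector matrix by \(e^{{\lambda _j}t}\) multiplies the determinant by \(e^{\left( {\lambda _1} + \cdots + {\lambda _n} \right)t}\):

```python
import numpy as np

# Hypothetical matrix with distinct real eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eig(A)

# Wronskian of the solutions e^{lambda_i t} V_i at time t ...
t = 0.6
W = np.linalg.det(V * np.exp(lam * t))   # scales column j by e^{lam_j t}

# ... equals e^{(lambda_1 + ... + lambda_n) t} det(V), hence is nonzero.
assert np.isclose(W, np.exp(lam.sum() * t) * np.linalg.det(V))
print("W(t) =", W)
```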

The general solution is given by

\[\mathbf{X}\left( t \right) = {C_1}{e^{{\lambda _1}t}}{\mathbf{V}_1} + {C_2}{e^{{\lambda _2}t}}{\mathbf{V}_2} + \cdots + {C_n}{e^{{\lambda _n}t}}{\mathbf{V}_n},\]

where \({C_1},\) \({C_2}, \ldots ,\) \({C_n}\) are arbitrary constants.
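As a sketch of Case \(1\) (hypothetical \(2 \times 2\) system; NumPy and SciPy assumed), the constants \({C_i}\) can be fitted to an initial condition \(\mathbf{X}\left( 0 \right) = {\mathbf{X}_0},\) and the resulting combination agrees with the matrix-exponential solution \(e^{At}{\mathbf{X}_0}\):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical system with real, distinct eigenvalues 3 and -1.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, V = np.linalg.eig(A)

# Fit the constants C to an initial condition X(0) = X0:
# X(0) = C1 V1 + C2 V2, i.e. V C = X0.
X0 = np.array([1.0, 0.0])
C = np.linalg.solve(V, X0)

# The general solution evaluated at time t ...
t = 0.5
X = V @ (C * np.exp(lam * t))   # sum of C_i e^{lam_i t} V_i

# ... agrees with the matrix-exponential solution e^{At} X0.
assert np.allclose(X, expm(A * t) @ X0)
print("X(0.5) =", X)
```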

The auxiliary equation may have complex roots. If all the entries of the matrix \(A\) are real, then the complex roots always appear in complex conjugate pairs. Suppose we have a pair of complex conjugate eigenvalues \(\lambda = \alpha \pm \beta i,\) and let \(\mathbf{V}\) be an eigenvector for \(\lambda = \alpha + \beta i.\) This pair is associated with a pair of linearly independent real solutions of the form

\[{\mathbf{X}_1} = \text{Re} \left[ {{e^{\left( {\alpha + \beta i} \right)t}}\mathbf{V}} \right],\;\; {\mathbf{X}_2} = \text{Im} \left[ {{e^{\left( {\alpha + \beta i} \right)t}}\mathbf{V}} \right].\]

Thus, the real and imaginary parts of the complex solution form a pair of real solutions.
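A minimal sketch of this fact (hypothetical matrix with eigenvalues \(1 \pm 2i\); NumPy assumed): the real and imaginary parts of the complex solution each satisfy \(\mathbf{X}' = A\mathbf{X}\):

```python
import numpy as np

# Hypothetical matrix with complex conjugate eigenvalues 1 +/- 2i.
A = np.array([[1.0, -2.0],
              [2.0,  1.0]])
lam, V = np.linalg.eig(A)

# Take one eigenvalue of the conjugate pair and its eigenvector.
l, v = lam[0], V[:, 0]

t = 0.3
z = np.exp(l * t) * v   # complex solution e^{(alpha + beta i) t} V

# Differentiating z gives l z; since A is real, the real and imaginary
# parts of z are each real solutions of X' = AX.
assert np.allclose(A @ z.real, (l * z).real)
assert np.allclose(A @ z.imag, (l * z).imag)
print("X1(t) =", z.real, " X2(t) =", z.imag)
```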

\(2.\) Case \({s_i} = {k_i} \gt 1.\) The Auxiliary Equation Has Multiple Roots, Whose Geometric and Algebraic Multiplicities are Equal.

This case is similar to the previous one. Although some eigenvalues have multiplicity greater than \(1,\) we can still find \(n\) linearly independent eigenvectors. In particular, any symmetric matrix with real entries has \(n\) linearly independent eigenvectors, and the same is true of unitary matrices. In general, an \(n \times n\) matrix has \(n\) linearly independent eigenvectors if and only if it is diagonalizable.

The general solution of the system of \(n\) differential equations can be represented as

\[\mathbf{X}\left( t \right) = \underbrace {{{C_{11}}{e^{{\lambda _1}t}}\mathbf{V}_1^{\left( 1 \right)} + {C_{12}}{e^{{\lambda _1}t}}\mathbf{V}_1^{\left( 2 \right)} + \cdots + {C_{1{k_1}}}{e^{{\lambda _1}t}}\mathbf{V}_1^{\left( {{k_1}} \right)}}}_{{k_1}\;\text{terms}} + \underbrace {{{C_{21}}{e^{{\lambda _2}t}}\mathbf{V}_2^{\left( 1 \right)} + {C_{22}}{e^{{\lambda _2}t}}\mathbf{V}_2^{\left( 2 \right)} + \cdots + {C_{2{k_2}}}{e^{{\lambda _2}t}}\mathbf{V}_2^{\left( {{k_2}} \right)}}}_{{k_2}\;\text{terms}} + \cdots\]

Here the total number of terms is \(n,\) and the \({C_{ij}}\) are arbitrary constants.
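As a sketch of Case \(2\) (hypothetical symmetric matrix; NumPy assumed), the eigenvalue \(1\) below has algebraic multiplicity \(2,\) yet there is a full set of \(3\) linearly independent eigenvectors, so the construction of Case \(1\) carries over:

```python
import numpy as np

# Hypothetical symmetric matrix: eigenvalue 1 with multiplicity 2
# (and eigenvalue 4), yet a full set of 3 independent eigenvectors.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
lam, V = np.linalg.eig(A)
print("eigenvalues:", np.round(lam, 6))

# The eigenvector matrix is nonsingular, so the n solutions
# e^{lambda_i t} V_i are linearly independent, as in Case 1.
assert abs(np.linalg.det(V)) > 1e-9
```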

\(3.\) Case \({s_i} \lt {k_i}.\) The Auxiliary Equation Has Multiple Roots, Whose Geometric Multiplicity is Less Than the Algebraic Multiplicity.

In some matrices \(A\) (such matrices are called defective), an eigenvalue \({\lambda_i}\) of multiplicity \({k_i}\) may have fewer than \({k_i}\) linearly independent eigenvectors. In this case, instead of the missing eigenvectors we can find so-called generalized eigenvectors, so as to obtain a set of \(n\) linearly independent vectors and construct the corresponding fundamental system of solutions. Two methods are usually used for this purpose; the underlying idea is sketched below.
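A minimal sketch of the generalized-eigenvector idea (hypothetical defective \(2 \times 2\) matrix; NumPy assumed): a generalized eigenvector \(\mathbf{W}\) solves \(\left( {A - \lambda I} \right)\mathbf{W} = \mathbf{V},\) and the missing solution has the form \(e^{\lambda t}\left( {t\mathbf{V} + \mathbf{W}} \right)\):

```python
import numpy as np

# Hypothetical defective matrix: lambda = 1 has k = 2 but only
# s = 1 independent eigenvectors.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0
I = np.eye(2)

v = np.array([1.0, 0.0])   # the single eigenvector
# A generalized eigenvector w satisfies (A - lam I) w = v;
# lstsq handles the singular coefficient matrix.
w, *_ = np.linalg.lstsq(A - lam * I, v, rcond=None)

# The missing second solution has the form X2(t) = e^{lam t}(t v + w).
t = 0.4
x2  = np.exp(lam * t) * (t * v + w)
dx2 = np.exp(lam * t) * (lam * (t * v + w) + v)   # derivative of X2
assert np.allclose(dx2, A @ x2)                   # X2' = A X2
print("generalized eigenvector w =", w)
```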

A detailed description of these methods is presented separately. Below we consider examples of systems of differential equations corresponding to Cases \(1\) and \(2.\)

See solved problems on Page 2.
