# Mathematics for Applied Sciences (Osnabrück 2023-2024)/Part I/Lecture 27

*Eigenvalues and eigenvectors*

For a reflection at an axis in the plane, certain vectors behave particularly simply. The vectors on the axis are sent to themselves, and the vectors which are orthogonal to the axis are sent to their negatives. For all these vectors, the image under this linear mapping lies on the line spanned by these vectors. In the theory of eigenvalues and eigenvectors, we want to know whether, for a given linear mapping, there exist lines (one-dimensional linear subspaces), which are mapped to themselves. The goal is to find, for the linear mapping, a basis such that the describing matrix is quite simple. Here, an important application is to find solutions for a system of linear differential equations.

Let $K$ be a field, $V$ a $K$-vector space and

$$\varphi \colon V \longrightarrow V$$

a linear mapping. Then an element $v \in V$, $v \neq 0$, is called an *eigenvector* of $\varphi$ (for the eigenvalue $\lambda$), if

$$\varphi(v) = \lambda v$$

for some $\lambda \in K$ holds.

Let $K$ be a field, $V$ a $K$-vector space and

$$\varphi \colon V \longrightarrow V$$

a linear mapping. Then an element $\lambda \in K$ is called an *eigenvalue* for $\varphi$, if there exists a vector $v \in V$, $v \neq 0$, such that

$$\varphi(v) = \lambda v.$$
Let $K$ be a field, $V$ a $K$-vector space and

$$\varphi \colon V \longrightarrow V$$

a linear mapping. For $\lambda \in K$, we denote by

$$\operatorname{Eig}_{\lambda}(\varphi) = \{ v \in V \mid \varphi(v) = \lambda v \}$$

the *eigenspace* of $\varphi$ for the value $\lambda$.

Thus we allow arbitrary values $\lambda \in K$ (not only eigenvalues) in the definition of an eigenspace. The zero vector belongs to every eigenspace, though it is never an eigenvector. The linear subspace generated by an eigenvector is called an *eigenline*. We consider some easy examples over $\mathbb{R}$.
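The defining condition $\varphi(v) = \lambda v$ can be checked directly by computation. The following is a minimal pure-Python sketch (helper names are our own, not from the lecture), using the reflection at the $x$-axis from the introduction as the linear mapping:

```python
# Check the eigenvector condition M v = lam * v for a 2x2 matrix,
# written with plain lists/tuples so no external libraries are needed.

def apply(M, v):
    """Apply a 2x2 matrix (list of rows) to a vector (x, y)."""
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

def is_eigenvector(M, v, lam, eps=1e-9):
    """v != 0 is an eigenvector of M for the eigenvalue lam if M v = lam v."""
    if v == (0, 0):
        return False          # the zero vector is never an eigenvector
    w = apply(M, v)
    return abs(w[0] - lam * v[0]) < eps and abs(w[1] - lam * v[1]) < eps

# Reflection at the x-axis: (x, y) -> (x, -y)
M = [[1, 0], [0, -1]]
print(is_eigenvector(M, (3, 0), 1))    # vector on the axis: True
print(is_eigenvector(M, (0, 2), -1))   # vector orthogonal to the axis: True
print(is_eigenvector(M, (1, 1), 1))    # generic vector: False
```

The tolerance `eps` only matters once irrational eigenvalues appear; over the integers the comparison is exact.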

A linear mapping

$$\varphi \colon \mathbb{R} \longrightarrow \mathbb{R}$$

is the multiplication with a fixed number $a \in \mathbb{R}$ (the *proportionality factor*). Therefore, every number $v \neq 0$ is an eigenvector for the eigenvalue $a$, and the eigenspace for this eigenvalue is the whole $\mathbb{R}$. Beside $a$, there are no other eigenvalues, and all eigenspaces for $\lambda \neq a$ are $0$.

A linear mapping from $\mathbb{R}^2$ to $\mathbb{R}^2$ is described by a $2 \times 2$-matrix with respect to the standard basis. We consider the eigenvalues for some elementary examples. A *homothety* is given as $v \mapsto a v$, with a scaling factor $a \in \mathbb{R}$. Every vector $v \neq 0$ is an eigenvector for the eigenvalue $a$, and the eigenspace for this eigenvalue is the whole $\mathbb{R}^2$. Beside $a$, there are no other eigenvalues, and all eigenspaces for $\lambda \neq a$ are $0$. The identity only has the eigenvalue $1$.

The reflection at the $x$-axis is described by the matrix $\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$. The eigenspace for the eigenvalue $1$ is the $x$-axis, the eigenspace for the eigenvalue $-1$ is the $y$-axis. A vector $\begin{pmatrix} s \\ t \end{pmatrix}$ with $s, t \neq 0$ is not an eigenvector, since the equation

$$\begin{pmatrix} s \\ -t \end{pmatrix} = \lambda \begin{pmatrix} s \\ t \end{pmatrix}$$

does not have a solution.

A plane rotation is described by a rotation matrix $\begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix}$ for the rotation angle $\alpha \in [0, 2\pi)$. For $\alpha = 0$, this is the identity; for $\alpha = \pi$, this is a half rotation, which is the reflection at the origin, or the homothety with factor $-1$. For all other rotation angles, there is no line sent to itself, so that these rotations have no eigenvalue and no eigenvector (and all eigenspaces are $0$).
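Eliminating the vector entries from the eigenvalue equation of the rotation matrix (in the same way as in the explicit example further below) yields the condition $\lambda^2 - 2 \cos(\alpha) \lambda + 1 = 0$. A small sketch (the function name is ours) shows that this quadratic has real solutions only for $\alpha = 0$ and $\alpha = \pi$:

```python
import math

def real_eigenvalues_of_rotation(alpha, eps=1e-12):
    """Real solutions of lambda^2 - 2*cos(alpha)*lambda + 1 = 0,
    the eigenvalue condition of the rotation matrix for angle alpha."""
    c = math.cos(alpha)
    disc = 4 * c * c - 4          # discriminant; negative unless cos(alpha) = +-1
    if disc < -eps:
        return []                 # no real eigenvalue: no line is sent to itself
    root = math.sqrt(max(disc, 0.0))
    return sorted({(2 * c - root) / 2, (2 * c + root) / 2})

print(real_eigenvalues_of_rotation(0.0))          # identity: [1.0]
print(real_eigenvalues_of_rotation(math.pi))      # half rotation: [-1.0]
print(real_eigenvalues_of_rotation(math.pi / 3))  # generic angle: []
```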

Let $K$ be a field, $V$ a $K$-vector space and

$$\varphi \colon V \longrightarrow V$$

a linear mapping. Then the following statements hold.

- Every eigenspace $\operatorname{Eig}_{\lambda}(\varphi)$ is a linear subspace of $V$.
- $\lambda$ is an eigenvalue for $\varphi$, if and only if the eigenspace $\operatorname{Eig}_{\lambda}(\varphi)$ is not the nullspace.
- A vector $v \in V$, $v \neq 0$, is an eigenvector for $\lambda$, if and only if $v \in \operatorname{Eig}_{\lambda}(\varphi)$.

### Proof

For matrices, we use the same concepts. If

$$\varphi \colon K^n \longrightarrow K^n$$

is a linear mapping, and $M$ is a describing matrix with respect to a basis, then for an eigenvalue $\lambda$ and an eigenvector $v$ with corresponding coordinate tuple $x$ with respect to the basis, we have the relation

$$Mx = \lambda x.$$

The describing matrix $N$ with respect to another basis satisfies, due to Lemma 25.8, the relation $N = B M B^{-1}$, where $B$ is an invertible matrix. Let

$$y = Bx$$

denote the coordinate tuple with respect to the second basis. Then

$$Ny = B M B^{-1} B x = B M x = B \lambda x = \lambda B x = \lambda y,$$

i.e., the describing matrices have the same eigenvalues, but the coordinate tuples for the eigenvectors are different.
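This base-change behavior can be checked numerically. The sketch below (helper functions and the invertible matrix $B$ are arbitrary choices of ours) conjugates the reflection matrix and verifies that $y = Bx$ is an eigenvector of $N = B M B^{-1}$ for the same eigenvalue:

```python
# For a 2x2 matrix M with eigenvector x, check that N = B M B^{-1}
# has y = B x as an eigenvector for the same eigenvalue.
# Matrices are lists of rows; everything is plain Python.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

def inv2(B):
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    return [[ B[1][1] / det, -B[0][1] / det],
            [-B[1][0] / det,  B[0][0] / det]]

M = [[1, 0], [0, -1]]        # reflection at the x-axis
x = [3, 0]                   # eigenvector for the eigenvalue 1
lam = 1

B = [[1, 2], [1, 3]]         # some invertible base change matrix (det = 1)
N = matmul(matmul(B, M), inv2(B))
y = matvec(B, x)

print(matvec(N, y))          # equals lam * y = [3.0, 3.0]
```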

We consider the linear mapping

$$\varphi \colon K^n \longrightarrow K^n$$

given by the diagonal matrix

$$\begin{pmatrix} d_1 & 0 & \cdots & 0 \\ 0 & d_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n \end{pmatrix}.$$

The diagonal entries $d_i$ are the eigenvalues of $\varphi$, and the $i$-th standard vector $e_i$ is a corresponding eigenvector. The eigenspaces are

$$\operatorname{Eig}_{d}(\varphi) = \langle e_i \mid d_i = d \rangle.$$

These spaces are not $0$ if and only if $d$ equals one of the diagonal entries. The dimension of the eigenspace $\operatorname{Eig}_{d}(\varphi)$ is given by the number of times the value $d$ occurs in the diagonal. The sum of all these dimensions gives $n$.
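The dimension count is simply a multiplicity count along the diagonal; a short sketch (the diagonal entries below are an example of ours):

```python
from collections import Counter

# For a diagonal matrix, the eigenvalues are the diagonal entries and
# the dimension of each eigenspace is the multiplicity of the entry.
diagonal = [2, 7, 2, 5]              # diagonal of an example 4x4 matrix

dims = Counter(diagonal)             # eigenvalue -> dimension of its eigenspace
print(dims[2], dims[7], dims[5])     # 2 1 1
print(sum(dims.values()))            # dimensions sum to n = 4
```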

For an *orthogonal reflection* of $\mathbb{R}^n$, there exists an $(n-1)$-dimensional linear subspace

$$U \subseteq \mathbb{R}^n,$$

which is fixed by the mapping, and every vector orthogonal to $U$ is sent to its negative. If $u_1, \ldots, u_{n-1}$ is a basis of $U$ and $w$ is a vector orthogonal to $U$, then the reflection is described by the matrix

$$\begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & \ddots & & \vdots \\ \vdots & & 1 & 0 \\ 0 & \cdots & 0 & -1 \end{pmatrix}$$

with respect to this basis.

We consider the linear mapping

$$\varphi \colon \mathbb{Q}^2 \longrightarrow \mathbb{Q}^2$$

given by the matrix

$$M = \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix}.$$

The question whether this mapping has eigenvalues leads to the question whether there exists some $\lambda \in \mathbb{Q}$, such that the equation

$$\begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \lambda \begin{pmatrix} x \\ y \end{pmatrix}$$

has a nontrivial solution $\begin{pmatrix} x \\ y \end{pmatrix} \neq 0$. For a given $\lambda$, this is a linear problem and can be solved with the elimination algorithm. However, the question whether there exist eigenvalues at all leads, due to the variable "eigenvalue parameter" $\lambda$, to a nonlinear problem. The system of equations above is

$$5y = \lambda x, \quad x = \lambda y.$$

For $\lambda = 0$, we get $x = y = 0$, but the nullvector is not an eigenvector. Hence, suppose that $\lambda \neq 0$. Both equations combined yield the condition

$$\lambda^2 y = \lambda x = 5y$$

(note that $y \neq 0$, since otherwise $x = \lambda y = 0$), hence $\lambda^2 = 5$. But in $\mathbb{Q}$, the number $5$ does not have a square root, therefore there is no solution, and that means that $M$ has no eigenvalues and no eigenvectors.

Now we consider the matrix as a real matrix, and look at the corresponding mapping

$$\varphi \colon \mathbb{R}^2 \longrightarrow \mathbb{R}^2.$$

The same computations as above lead to the condition $\lambda^2 = 5$, and within the real numbers, we have the two solutions

$$\lambda_1 = \sqrt{5}, \quad \lambda_2 = -\sqrt{5}.$$

For both values, we have now to find the eigenvectors. First, we consider the case $\lambda = \sqrt{5}$, which yields the linear system

$$\begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \sqrt{5} \begin{pmatrix} x \\ y \end{pmatrix}.$$

We write this as

$$\begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} \sqrt{5} & 0 \\ 0 & \sqrt{5} \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix},$$

and as

$$\begin{pmatrix} -\sqrt{5} & 5 \\ 1 & -\sqrt{5} \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

This system can be solved easily; the solution space has dimension one, and

$$v_1 = \begin{pmatrix} \sqrt{5} \\ 1 \end{pmatrix}$$

is a basic solution.

For $\lambda = -\sqrt{5}$, we do the same steps, and the vector

$$v_2 = \begin{pmatrix} -\sqrt{5} \\ 1 \end{pmatrix}$$

is a basic solution. Thus over $\mathbb{R}$, the numbers $\sqrt{5}$ and $-\sqrt{5}$ are eigenvalues, and the corresponding eigenspaces are

$$\operatorname{Eig}_{\sqrt{5}}(\varphi) = \left\langle \begin{pmatrix} \sqrt{5} \\ 1 \end{pmatrix} \right\rangle \quad \text{and} \quad \operatorname{Eig}_{-\sqrt{5}}(\varphi) = \left\langle \begin{pmatrix} -\sqrt{5} \\ 1 \end{pmatrix} \right\rangle.$$
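The two basic solutions can be verified numerically (the helper function is our own; a floating-point tolerance is needed because $\sqrt{5}$ is irrational):

```python
import math

# Verify that v1 = (sqrt(5), 1) and v2 = (-sqrt(5), 1) are eigenvectors
# of M = [[0, 5], [1, 0]] for the eigenvalues sqrt(5) and -sqrt(5).
def apply(M, v):
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

M = [[0, 5], [1, 0]]
s = math.sqrt(5)

for lam, v in [(s, (s, 1)), (-s, (-s, 1))]:
    w = apply(M, v)
    ok = all(abs(w[i] - lam * v[i]) < 1e-12 for i in range(2))
    print(lam, ok)               # both lines report True
```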

*Eigenspaces*

Let $K$ be a field, $V$ a $K$-vector space and

$$\varphi \colon V \longrightarrow V$$

a linear mapping. Then $\lambda \in K$ is an eigenvalue of $\varphi$, if and only if $\lambda \operatorname{Id}_V - \varphi$ is not injective.

### Proof

More generally, we have the following characterization: for $\lambda \in K$,

$$\operatorname{Eig}_{\lambda}(\varphi) = \ker(\lambda \operatorname{Id}_V - \varphi).$$

Let $v \in V$. Then $v \in \operatorname{Eig}_{\lambda}(\varphi)$ if and only if $\varphi(v) = \lambda v$, and this is the case if and only if $\lambda v - \varphi(v) = 0$ holds, which means $(\lambda \operatorname{Id}_V - \varphi)(v) = 0$.

Beside the eigenspace for $\lambda = 0$, which is the kernel of the linear mapping, the eigenvalues $1$ and $-1$ are in particular interesting. The eigenspace for $1$ consists of all vectors which are sent to themselves. Restricted to this linear subspace, the mapping is just the identity; it is called the *fixspace*. The eigenspace for $-1$ consists of all vectors which are sent to their negatives. On this linear subspace, the mapping acts like the reflection at the origin.
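The description of the eigenspace as the kernel of $\lambda \operatorname{Id} - M$ can be made computational for $2 \times 2$ matrices; the following sketch (function name and tolerance are our own) reads off a kernel basis case by case:

```python
# Eigenspace as a kernel: Eig_lambda(M) = ker(lambda*Id - M).
def eigenspace_basis(M, lam, eps=1e-12):
    """Basis of ker(lam*Id - M) for a 2x2 matrix M (list of rows)."""
    a, b = lam - M[0][0], -M[0][1]       # first row of lam*Id - M
    c, d = -M[1][0], lam - M[1][1]       # second row
    det = a * d - b * c
    if abs(det) > eps:
        return []                        # kernel is 0: lam is no eigenvalue
    if abs(a) > eps or abs(b) > eps:
        return [(b, -a)]                 # solves a*x + b*y = 0
    if abs(c) > eps or abs(d) > eps:
        return [(d, -c)]                 # solves c*x + d*y = 0
    return [(1, 0), (0, 1)]              # lam*Id - M = 0: kernel is everything

M = [[1, 0], [0, -1]]                    # reflection at the x-axis
print(eigenspace_basis(M, 1))            # spans the x-axis
print(eigenspace_basis(M, -1))           # spans the y-axis
print(eigenspace_basis(M, 3))            # 3 is not an eigenvalue: []
```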

Let $K$ be a field, $V$ a $K$-vector space and

$$\varphi \colon V \longrightarrow V$$

a linear mapping. Let $v_1, \ldots, v_n$ be eigenvectors for (pairwise) different eigenvalues $\lambda_1, \ldots, \lambda_n \in K$. Then $v_1, \ldots, v_n$ are linearly independent.

We prove the statement by induction on $n$. For $n = 1$, the statement is true. Suppose now that the statement is true for less than $n$ vectors. We consider a representation of $0$, say

$$0 = a_1 v_1 + \cdots + a_n v_n.$$

We apply $\varphi$ to this and get, on one hand,

$$0 = a_1 \lambda_1 v_1 + \cdots + a_n \lambda_n v_n.$$

On the other hand, we multiply the equation with $\lambda_n$ and get

$$0 = a_1 \lambda_n v_1 + \cdots + a_n \lambda_n v_n.$$

We look at the difference of the two equations, and get

$$0 = a_1 (\lambda_n - \lambda_1) v_1 + \cdots + a_{n-1} (\lambda_n - \lambda_{n-1}) v_{n-1}.$$

By the induction hypothesis, we get for the coefficients $a_i (\lambda_n - \lambda_i) = 0$ for $i = 1, \ldots, n-1$. Because of $\lambda_n - \lambda_i \neq 0$, we get $a_i = 0$ for $i = 1, \ldots, n-1$, and because of $v_n \neq 0$, we also get $a_n = 0$.
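For two eigenvectors, linear independence simply means a nonzero determinant; we can check this for the eigenvectors of the matrix $\begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix}$ from the earlier example:

```python
import math

# v1 and v2 are eigenvectors of M = [[0, 5], [1, 0]] for the two
# different eigenvalues sqrt(5) and -sqrt(5); by the lemma they are
# linearly independent, i.e. the determinant with columns v1, v2 is nonzero.
s = math.sqrt(5)
v1 = (s, 1)       # eigenvector for  sqrt(5)
v2 = (-s, 1)      # eigenvector for -sqrt(5)

det = v1[0] * v2[1] - v2[0] * v1[1]   # = 2*sqrt(5)
print(det != 0)   # True
```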

Let $K$ be a field, $V$ a finite-dimensional $K$-vector space and

$$\varphi \colon V \longrightarrow V$$

a linear mapping. Then there exist at most $\dim(V)$ many eigenvalues for $\varphi$.

### Proof
