Linear algebra (Osnabrück 2024-2025)/Part I/Lecture 7/latex
\setcounter{section}{7}
\subtitle {Linear independence}
\inputdefinition
{ }
{
Let $K$ be a
field,
and let $V$ be a
$K$-vector space.
A family of vectors
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,}
\extrabracket {where $I$ denotes a finite index set} {} {}
is called \definitionword {linearly independent}{} if an equation of the form
\mathdisp {\sum_{i \in I} s_i v_i =0 \text{ with } s_i \in K} { }
is only possible when
\mathrelationchain
{\relationchain
{ s_i
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
for all
\mathl{i \in I}{.}
}
If a family is not
linearly independent,
then it is called \keyword {linearly dependent} {.} A linear combination
\mathrelationchain
{\relationchain
{ \sum_{i \in I} s_i v_i
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
is called a \keyword { representation of the null vector} {.} It is called the \keyword {trivial representation} {} if all coefficients $s_i$ equal $0$ and, if at least one coefficient is not $0$, a \keyword {nontrivial representation of the null vector} {.} A family of vectors is linearly independent if and only if the null vector can be represented by the family only in the trivial way. This is equivalent to the property that no vector of the family can be expressed as a linear combination of the others: if \mathl{s_j \neq 0}{} in a nontrivial representation of the null vector, then we can divide by $s_j$ and express $v_j$ by the other vectors, and conversely.
\inputexample{}
{
The
standard vectors
in $K^n$ are
linearly independent.
A representation
\mathrelationchaindisplay
{\relationchain
{\sum_{i = 1}^n s_i e_i
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
just means
\mathrelationchaindisplay
{\relationchain
{ s_1 \begin{pmatrix} 1 \\0\\ \vdots\\0 \end{pmatrix} + s_2 \begin{pmatrix} 0 \\1\\ \vdots\\0 \end{pmatrix} + \cdots + s_n \begin{pmatrix} 0 \\0\\ \vdots\\1 \end{pmatrix}
}
{ =} { \begin{pmatrix} 0 \\0\\ \vdots\\0 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
The $i$-th row yields directly
\mathrelationchain
{\relationchain
{ s_i
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
}
\inputexample{}
{
The three vectors
\mathlistdisplay {\begin{pmatrix} 3 \\3\\ 3 \end{pmatrix}} {,} {\begin{pmatrix} 0 \\4\\ 5 \end{pmatrix}} {and} {\begin{pmatrix} 4 \\8\\ 9 \end{pmatrix}} {}
are
linearly dependent.
The equation
\mathrelationchaindisplay
{\relationchain
{ 4 \begin{pmatrix} 3 \\3\\ 3 \end{pmatrix} + 3 \begin{pmatrix} 0 \\4\\ 5 \end{pmatrix} -3 \begin{pmatrix} 4 \\8\\ 9 \end{pmatrix}
}
{ =} { \begin{pmatrix} 0 \\0\\ 0 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is a nontrivial representation of the null vector.
}
\inputremark {}
{
The vectors \mathl{v_1 = \begin{pmatrix} a_{11} \\\vdots\\ a_{m1} \end{pmatrix} , \ldots , v_n = \begin{pmatrix} a_{1n} \\\vdots\\ a_{mn} \end{pmatrix} \in K^m}{} are
linearly dependent
if and only if the
homogeneous linear system
\mathdisp {\begin{matrix} a _{ 1 1 } x _1 + a _{ 1 2 } x _2 + \cdots + a _{ 1 n } x _{ n } & = & 0 \\ a _{ 2 1 } x _1 + a _{ 2 2 } x _2 + \cdots + a _{ 2 n } x _{ n } & = & 0 \\ \vdots & \vdots & \vdots \\ a _{ m 1 } x _1 + a _{ m 2 } x _2 + \cdots + a _{ m n } x _{ n } & = & 0 \end{matrix}} { }
has a nontrivial solution.
}
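This criterion can also be checked by machine. The following sketch, assuming the Python library sympy (exact rational arithmetic), applies it to the three vectors of Example 7.3: the homogeneous system has a nontrivial solution exactly when the null space of the matrix having the vectors as columns is nonzero.

# A computational check of the criterion above. The vectors of
# Example 7.3 are the columns of A; the homogeneous system A*x = 0
# has a nontrivial solution if and only if the null space is nonzero.
from sympy import Matrix

A = Matrix([[3, 0, 4],
            [3, 4, 8],
            [3, 5, 9]])

kernel = A.nullspace()   # basis of the solution space of A*x = 0
if kernel:
    print("linearly dependent, sample coefficients:", list(kernel[0]))
else:
    print("linearly independent")

Up to scaling, the printed solution is the coefficient tuple \mathl{(4,3,-3)}{} from Example 7.3.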
For an infinite family of vectors, there is also a concept of linear independence.
\inputdefinition
{ }
{
Let $K$ be a
field,
and let $V$ be a
$K$-vector space.
Then a family of vectors
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,}
in $V$ is called \definitionword {linearly independent}{} if an equation
\mathdisp {\sum_{i \in J} s_i v_i =0 \text{ with } s_i \in K \text{ for a finite subset } J \subseteq I} { }
is only possible if
\mathrelationchain
{\relationchain
{ s_i
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
for all
\mathl{i \in J}{.}
}
By this definition, linear independence of an arbitrary family is reduced to the finite case. Keep in mind that in a vector space, there are no infinite sums. An expression like
\mathrelationchain
{\relationchain
{ \sum_{n \in \N} s_n v_n
}
{ = }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
is not meaningful.
\inputfactproof
{Linearly independent/Simple properties/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ be a
field,
let $V$ be a
$K$-vector space,
and let
\mathcond {v_{ i }} {}
{i \in I} {}
{} {} {} {,}
be a family of vectors in $V$.}
\factsegue {Then the following statements hold.}
\factconclusion {\enumerationsix {If the family is linearly independent, then for each subset
\mathrelationchain
{\relationchain
{ J
}
{ \subseteq }{ I
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the family
\mathcond {v_i} {,}
{i \in J} {}
{} {} {} {,}
is also linearly independent.
} {The empty family is linearly independent.
} {If the family contains the null vector, then it is not linearly independent.
} {If a vector appears several times in the family, then the family is not linearly independent.
} {A single vector $v$ is linearly independent if and only if
\mathrelationchain
{\relationchain
{ v
}
{ \neq }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
} {Two vectors
\mathcor {} {v} {and} {u} {}
are linearly independent if and only if neither vector is a scalar multiple of the other.
}}
\factextra {}
{See Exercise 7.14 .}
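For concrete vectors, property (6) can be tested mechanically: two vectors in $K^m$ are linearly independent exactly when the matrix having them as columns has rank $2$, a condition that computer algebra systems can test directly. A minimal sketch, again assuming the Python library sympy, with made-up example vectors:

# Property (6): two vectors are linearly independent if and only if
# neither is a scalar multiple of the other, i.e. if and only if the
# matrix having them as columns has rank 2.
from sympy import Matrix

def independent(v, u):
    return Matrix.hstack(Matrix(v), Matrix(u)).rank() == 2

print(independent([1, 2, 0], [2, 4, 0]))   # False: the second is 2 times the first
print(independent([1, 2, 0], [0, 1, 1]))   # True

The same rank test works for any finite family: the vectors are linearly independent exactly when the rank equals the number of vectors.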
\subtitle {Bases}
\inputdefinition
{ }
{
Let $K$ be a
field,
and let $V$ be a
$K$-vector space.
Then a
linearly independent
generating system
\mathcond {v_i \in V} {}
{i \in I} {}
{} {} {} {,}
of $V$ is called a \definitionword {basis}{} of $V$.
}
\inputexample{}
{
The
standard vectors
in $K^n$ form a
basis.
The
linear independence
was shown in
Example 7.2
.
To show that they also form a
generating system,
let
\mathrelationchaindisplay
{\relationchain
{v
}
{ =} { \begin{pmatrix} b_1 \\b_2\\ \vdots\\b_n \end{pmatrix}
}
{ \in} { K^n
}
{ } {
}
{ } {
}
}
{}{}{}
be an arbitrary vector. Then we have immediately
\mathrelationchaindisplay
{\relationchain
{v
}
{ =} { \sum_{i = 1}^n b_i e_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
Hence, we have a basis, which is called the \keyword {standard basis} {} of $K^n$.
}
\inputexample{}
{
We consider the
$K$-linear subspace
\mathrelationchain
{\relationchain
{U
}
{ \subset }{K^n
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
given by
\mathrelationchaindisplay
{\relationchain
{U
}
{ =} { { \left\{ v \in K^n \mid \sum_{i =1 }^n v_i = 0 \right\} }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
A
basis
for $U$ is given by the \mathl{n-1}{} vectors
\mathdisp {u_1=(1,-1,0 ,0 , \ldots , 0),\, u_2 = (0, 1,-1, 0 , \ldots , 0) , \ldots , u_{n-1} = (0 ,0 , \ldots , 0,1,-1)} { . }
These vectors evidently belong to $U$. The
linear independence
can be checked in $K^n$. From an equation
\mathrelationchaindisplay
{\relationchain
{\sum_{i = 1}^{n-1} a_iu_i
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
we can deduce step by step
\mathrelationchain
{\relationchain
{ a_1
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
\mathrelationchain
{\relationchain
{ a_2
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
etc. That the system is a
generating system
follows from
\mathrelationchaindisplay
{\relationchain
{ \begin{pmatrix} v_1 \\v_2\\ v_3\\\vdots\\ v_{n-1}\\ v_n \end{pmatrix}
}
{ =} { v_1 \begin{pmatrix} 1 \\-1\\ 0\\\vdots\\ 0\\ 0 \end{pmatrix} + (v_1+v_2) \begin{pmatrix} 0 \\1\\ -1\\\vdots\\ 0\\ 0 \end{pmatrix} + (v_1+v_2+v_3) \begin{pmatrix} 0 \\0\\ 1\\-1\\ \vdots\\ 0 \end{pmatrix} + \cdots + (v_1+v_2+v_3 + \cdots + v_{n-1} ) \begin{pmatrix} 0 \\0\\ 0\\\vdots\\ 1\\ -1 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
where the equality in the last row rests on the condition
\mathrelationchaindisplay
{\relationchain
{ \sum_{i = 1 }^n v_i
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
}
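The claims of this example can be verified by machine for any specific $n$. The following sketch, assuming the Python library sympy, checks them for \mathl{n = 4}{} with a made-up vector of $U$:

# Example 7.10 for n = 4: the vectors u_1, u_2, u_3 lie in U (their
# entries sum to 0), they are linearly independent, and the partial
# sums v_1, v_1+v_2, v_1+v_2+v_3 are the coefficients representing v.
from sympy import Matrix

u = [Matrix([1, -1, 0, 0]),
     Matrix([0, 1, -1, 0]),
     Matrix([0, 0, 1, -1])]

assert all(sum(ui) == 0 for ui in u)     # each u_i belongs to U
assert Matrix.hstack(*u).rank() == 3     # linear independence

v = Matrix([2, 5, -3, -4])               # entries sum to 0, so v is in U
coeffs = [v[0], v[0] + v[1], v[0] + v[1] + v[2]]
total = sum((c * ui for c, ui in zip(coeffs, u)), Matrix.zeros(4, 1))
assert total == v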
For the complex numbers, the elements \mathl{1, { \mathrm i}}{} form a real basis, that is, a basis of the complex numbers considered as a real vector space. In the space of all $m \times n$-matrices, that is, \mathl{\operatorname{Mat}_{ m \times n } (K)}{,} those matrices where exactly one entry is $1$ and all other entries are $0$ form a basis; see Exercise 7.27 .
\inputexample{}
{
In the polynomial ring \mathl{K[X]}{} over a field $K$, the powers
\mathcond {X^n} {}
{n \in \N} {}
{} {} {} {,}
form a
basis.
By definition, every polynomial
\mathdisp {a_nX^n+a_{n-1}X^{n-1} + \cdots + a_2X^2+a_1X+a_0} { }
is a
linear combination
of the powers \mathl{X^0=1,X^1=X , \ldots , X^n}{.} Moreover, these powers are
linearly independent.
For if
\mathrelationchaindisplay
{\relationchain
{ a_nX^n+a_{n-1}X^{n-1} + \cdots + a_2X^2+a_1X+a_0
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
then all coefficients equal $0$
\extrabracket {this is part of the concept of a polynomial} {} {.}
}
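The uniquely determined coefficients of a polynomial with respect to the basis of powers can also be read off by machine. A minimal illustration, assuming the Python library sympy and a made-up polynomial:

# The coefficients of a polynomial are its uniquely determined
# coefficients with respect to the basis of powers X^n.
from sympy import Poly, symbols

X = symbols('X')
p = Poly(4*X**3 + 2*X - 7, X)
print(p.all_coeffs())   # [4, 0, 2, -7]: coefficients of X^3, X^2, X, 1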
\subtitle {Characterization of a basis}
The following theorem gives an important characterization of when a family of vectors is a basis.
\inputfactproof
{Vector space/Characterizations of basis/Maximal/Minimal/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ be a
field,
and let $V$ be a
$K$-vector space.
Let
\mathrelationchain
{\relationchain
{ v_1 , \ldots , v_n
}
{ \in }{ V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
be a family of vectors.}
\factsegue {Then the following statements are equivalent.}
\factconclusion {\enumerationfour {The family is a
basis
of $V$.
} {The family is a minimal
generating system;
that is, as soon as we remove one vector $v_i$, the remaining family is not a generating system any more.
} {For every vector
\mathrelationchain
{\relationchain
{u
}
{ \in }{V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
there is exactly one representation
\mathrelationchaindisplay
{\relationchain
{u
}
{ =} { s_1 v_1 + \cdots + s_n v_n
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
} {The family is maximally
linearly independent;
that is, as soon as some vector is added, the family is not linearly independent any more.
}}
\factextra {}
}
{
We prove the statements by a cyclic chain of implications. $(1) \Rightarrow (2)$. The family is a generating system. Let us remove a vector, say $v_1$, from the family. We have to show that the remaining family, that is, \mathl{v_2 , \ldots , v_n}{,} is not a generating system anymore. So suppose that it is still a generating system. Then, in particular, $v_1$ can be written as a
linear combination
of the remaining vectors, and we have
\mathrelationchaindisplay
{\relationchain
{ v_1
}
{ =} { \sum_{i = 2}^n s_i v_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
But then
\mathrelationchaindisplay
{\relationchain
{ v_1- \sum_{i = 2}^n s_i v_i
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is a nontrivial representation of $0$, contradicting the linear independence of the family.
$(2) \Rightarrow (3)$. Due to the condition, the family is a generating system, hence every vector can be represented as a linear combination.
Suppose that for some
\mathrelationchain
{\relationchain
{ u
}
{ \in }{V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
there is more than one representation, say
\mathrelationchaindisplay
{\relationchain
{ u
}
{ =} { \sum_{i = 1}^n s_i v_i
}
{ =} { \sum_{i = 1}^n t_i v_i
}
{ } {
}
{ } {
}
}
{}{}{,}
where at least one coefficient is different. Without loss of generality, we may assume
\mathrelationchain
{\relationchain
{ s_1
}
{ \neq }{ t_1
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Then we get the relation
\mathrelationchaindisplay
{\relationchain
{ { \left( s_1 - t_1 \right) } v_1
}
{ =} { \sum_{i = 2}^n { \left( t_i- s_i \right) } v_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
Because of
\mathrelationchain
{\relationchain
{ s_1 - t_1
}
{ \neq }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we can divide by this number and obtain a representation of $v_1$ using the other vectors. In this situation, due to
Exercise 6.25
,
the family without $v_1$ is also a generating system of $V$, contradicting the minimality.
$(3) \Rightarrow (4)$. Because of the unique representability, the zero vector has only the trivial representation. This means that the vectors are
linearly independent.
If we add a vector $u$, then it has a representation
\mathrelationchaindisplay
{\relationchain
{ u
}
{ =} { \sum_{i = 1}^n s_i v_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
and, therefore,
\mathrelationchaindisplay
{\relationchain
{ 0
}
{ =} { u- \sum_{i = 1}^n s_i v_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is a nontrivial representation of $0$, so that the extended family \mathl{u,v_1 , \ldots , v_n}{} is not linearly independent.
$(4) \Rightarrow (1)$. The family is linearly independent; we have to show that it is also a generating system. Let
\mathrelationchain
{\relationchain
{ u
}
{ \in }{V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Due to the condition, the family \mathl{u,v_1 , \ldots , v_n}{} is not linearly independent. This means that there exists a nontrivial representation
\mathrelationchaindisplay
{\relationchain
{ 0
}
{ =} { s u + \sum_{i = 1}^n s_iv_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
Here
\mathrelationchain
{\relationchain
{ s
}
{ \neq }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
because otherwise this would be a nontrivial representation of $0$ with the original family \mathl{v_1 , \ldots , v_n}{.} Hence, we can write
\mathrelationchaindisplay
{\relationchain
{ u
}
{ =} { - \sum_{i = 1}^n \frac{ s_i}{ s } v_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
yielding a representation for $u$.
\inputremark {}
{
Let a
basis
\mathrelationchain
{\relationchain
{ \mathfrak{ v }
}
{ = }{ v_1 , \ldots , v_n
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
of a
$K$-vector space
$V$ be given. Due to
Theorem 7.11
(3),
this means that for every vector
\mathrelationchain
{\relationchain
{ u
}
{ \in }{ V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
there exists a unique representation
\extrabracket {a
linear combination} {} {}
\mathrelationchaindisplay
{\relationchain
{u
}
{ =} { s_1 v_1 + s_2 v_2 + \cdots + s_n v_n
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
Here, the uniquely determined elements
\mathrelationchain
{\relationchain
{ s_i
}
{ \in }{ K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
\extrabracket {scalars} {} {}
are called the \keyword {coordinates} {} of $u$ with respect to the given basis. This means that for a given basis, there is a correspondence between vectors and coordinate tuples
\mathrelationchain
{\relationchain
{ (s_1,s_2 , \ldots , s_n)
}
{ \in }{ K^n
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
We say that a basis determines a \keyword {linear coordinate system} {\extrafootnote {Linear coordinates give a bijective relation between points and number tuples. Due to linearity, such a bijection respects addition and scalar multiplication. In many different contexts, also nonlinear
\extrabracket {curvilinear} {} {}
coordinates are important. These put points of a space and number tuples into a bijective relation. Examples are polar coordinates, cylindrical coordinates, and spherical coordinates. By choosing suitable coordinates, mathematical problems, like the computation of volumes, can be simplified.} {} {}} of $V$. To paraphrase, a basis gives, in particular, a bijective mapping
\mathdisp {\Psi_ \mathfrak{ v } \colon K^n \longrightarrow V
, \begin{pmatrix} s_1 \\\vdots\\ s_n \end{pmatrix} \longmapsto s_1 v_1 + s_2 v_2 + \cdots + s_n v_n} { . }
The
inverse mapping
\mathdisp {{ \left( \Psi_ \mathfrak{ v } \right) }^{-1} \colon V \longrightarrow K^n} { }
is also called the \keyword {coordinate mapping} {.}
}
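For \mathl{V = K^n}{,} the coordinate mapping can be evaluated by solving a linear system: the coordinates of $u$ form the unique solution $s$ of \mathl{Bs = u}{,} where the columns of $B$ are the basis vectors. A minimal sketch, assuming the Python library sympy, with a made-up basis (rational entries):

# Coordinates with respect to a basis of K^3: the coordinate tuple
# of u is the unique solution s of B*s = u, where the columns of B
# are the basis vectors.
from sympy import Matrix

v1, v2, v3 = Matrix([1, 1, 1]), Matrix([0, 1, 1]), Matrix([0, 0, 1])
B = Matrix.hstack(v1, v2, v3)   # invertible, since v1, v2, v3 is a basis
u = Matrix([2, 3, 5])

s = B.solve(u)                  # the coordinate tuple of u
print(list(s))                  # [2, 1, 2], i.e. u = 2*v1 + v2 + 2*v3

This computation is exactly the coordinate mapping \mathl{{ \left( \Psi_ \mathfrak{ v } \right) }^{-1}}{} evaluated at $u$.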
\inputfactproof
{vector space/Finitely generated/Basis/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ be a
field,
and let $V$ be a
$K$-vector space
with a finite
generating system.}
\factconclusion {Then $V$ has a finite
basis.}
\factextra {}
}
{
Let
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,}
be a finite generating system of $V$ with a
finite
index set $I$. We argue using the characterization from
Theorem 7.11
(2).
If the family is minimal, then we have a basis. If not, then there exists some
\mathrelationchain
{\relationchain
{k
}
{ \in }{I
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
such that the remaining family, where $v_k$ is removed, that is,
\mathcond {v_i} {}
{i \in I \setminus \{k\}} {}
{} {} {} {,}
is also a generating system. In this case, we can go on with this smaller index set. Since the index set is finite, this process stops after finitely many steps, and we arrive at a subset
\mathrelationchain
{\relationchain
{J
}
{ \subseteq }{I
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
such that
\mathcond {v_i} {}
{i \in J} {}
{} {} {} {,}
is a minimal generating set, hence a basis.
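The proof is constructive and can be read as an algorithm: repeatedly discard a vector whose removal does not shrink the span, until the remaining family is a minimal generating system. The following sketch, assuming the Python library sympy and working in \mathl{K^m}{,} tests whether the span shrinks via the rank:

# Shrinking a finite generating system to a basis, following the proof:
# a vector may be dropped when the rank of the remaining family (which
# measures the span) is unchanged; a minimal generating system remains.
from sympy import Matrix

def shrink_to_basis(vectors):
    vectors = list(vectors)
    rank = Matrix.hstack(*vectors).rank()
    k = 0
    while k < len(vectors):
        rest = vectors[:k] + vectors[k+1:]
        if rest and Matrix.hstack(*rest).rank() == rank:
            vectors = rest   # v_k is redundant, remove it
        else:
            k += 1           # v_k is needed, keep it
    return vectors

gens = [Matrix([3, 3, 3]), Matrix([0, 4, 5]), Matrix([4, 8, 9])]
print(len(shrink_to_basis(gens)))   # 2: the dependent family of Example 7.3 shrinks

Note that the result is a basis of the linear subspace generated by the family, which may be smaller than the ambient space.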
\inputremark {}
{
In general, Hamel's theorem says that every vector space has a basis. The proof of this theorem uses strong set-theoretical methods, in particular the axiom of choice and Zorn's lemma. Since bases always exist, many statements about finite-dimensional vector spaces carry over to vector spaces of infinite dimension. In this course, however, we will concentrate on the finite-dimensional case.
}