Linear subspaces/Sum/Section
For a $K$-vector space $V$ and a family of linear subspaces $U_1 , \ldots , U_n \subseteq V$, we define the sum of these linear subspaces by
$$U_1 + U_2 + \cdots + U_n := \{ u_1 + u_2 + \cdots + u_n \mid u_i \in U_i \} .$$
This sum is again a linear subspace. In case
$$V = U_1 + U_2 + \cdots + U_n ,$$
we say that $V$ is the sum of the linear subspaces $U_1 , \ldots , U_n$.
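For instance, with $e_1 , e_2 , e_3$ denoting the standard basis of $K^3$, the two coordinate lines $U_1 = K e_1$ and $U_2 = K e_2$ have the plane
$$U_1 + U_2 = \{ a e_1 + b e_2 \mid a , b \in K \}$$
as their sum, and $K^3 = K e_1 + K e_2 + K e_3$ is the sum of the three coordinate axes.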
The following theorem describes an important relation between the dimension of the sum of two linear subspaces and the dimension of their intersection.

Let $K$ denote a field, and let $V$ denote a $K$-vector space of finite dimension. Let $U_1 , U_2 \subseteq V$ denote linear subspaces. Then
$$\dim (U_1) + \dim (U_2) = \dim (U_1 \cap U_2) + \dim (U_1 + U_2) .$$
Let $w_1 , \ldots , w_k$ be a basis of $U_1 \cap U_2$. On one hand, we can extend this basis, according to the basis extension theorem, to a basis $w_1 , \ldots , w_k , u_1 , \ldots , u_m$ of $U_1$; on the other hand, we can extend it to a basis $w_1 , \ldots , w_k , v_1 , \ldots , v_n$ of $U_2$. Then
$$w_1 , \ldots , w_k , u_1 , \ldots , u_m , v_1 , \ldots , v_n$$
is a generating system of $U_1 + U_2$. We claim that it is even a basis. To see this, let
$$\sum_{i = 1}^k a_i w_i + \sum_{j = 1}^m b_j u_j + \sum_{l = 1}^n c_l v_l = 0 .$$
This implies that the element
$$\sum_{l = 1}^n c_l v_l = - \sum_{i = 1}^k a_i w_i - \sum_{j = 1}^m b_j u_j$$
belongs to $U_1 \cap U_2$ and therefore lies in the span of $w_1 , \ldots , w_k$. Since $w_1 , \ldots , w_k , v_1 , \ldots , v_n$ and $w_1 , \ldots , w_k , u_1 , \ldots , u_m$ are both linearly independent, we get directly $c_l = 0$ for $l = 1 , \ldots , n$, and $b_j = 0$ for $j = 1 , \ldots , m$. From the equation before, we can then infer that $a_i = 0$ also holds for all $i = 1 , \ldots , k$. Hence, we have linear independence. This gives altogether
$$\dim (U_1) + \dim (U_2) = (k + m) + (k + n) = k + (k + m + n) = \dim (U_1 \cap U_2) + \dim (U_1 + U_2) .$$
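For instance, for the two coordinate planes $U_1 = \langle e_1 , e_2 \rangle$ and $U_2 = \langle e_2 , e_3 \rangle$ in $K^3$, we have $U_1 \cap U_2 = K e_2$ and $U_1 + U_2 = K^3$, and indeed
$$\dim (U_1) + \dim (U_2) = 2 + 2 = 1 + 3 = \dim (U_1 \cap U_2) + \dim (U_1 + U_2) .$$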
The intersection of two planes (through the origin) in $\mathbb{R}^3$ is "usually" a line; it is the plane itself if the same plane is taken twice, but it is never just a point. This observation is generalized in the following statement.
Let $K$ be a field, and let $V$ be a $K$-vector space of dimension $n$. Let $U_1 , U_2 \subseteq V$ denote linear subspaces of dimensions $n_1$ and $n_2$. Then
$$\dim (U_1 \cap U_2) \geq n_1 + n_2 - n .$$
Due to the dimension formula above, we have
$$\dim (U_1 \cap U_2) = \dim (U_1) + \dim (U_2) - \dim (U_1 + U_2) \geq n_1 + n_2 - n ,$$
since $U_1 + U_2$ is a linear subspace of $V$ and therefore $\dim (U_1 + U_2) \leq n$.
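For instance, two linear subspaces of dimensions $4$ and $5$ inside a space of dimension $7$ always intersect in a subspace of dimension at least $4 + 5 - 7 = 2$.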
Recall that, for a linear subspace $U \subseteq V$, the difference $\dim (V) - \dim (U)$ is called the codimension of $U$ in $V$. With this concept, we can paraphrase the statement above by saying that the codimension of an intersection of linear subspaces is at most the sum of their codimensions.
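Writing $\operatorname{codim} (U) = \dim (V) - \dim (U)$ (a notation used here only for this reformulation), the statement above reads
$$\operatorname{codim} (U_1 \cap U_2) \leq \operatorname{codim} (U_1) + \operatorname{codim} (U_2) = (n - n_1) + (n - n_2) .$$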
Let a homogeneous system of $m$ linear equations in $n$ variables be given. Then the dimension of the solution space of the system is at least $n - m$.

The solution space of one linear equation in $n$ variables has dimension $n - 1$ or $n$. The solution space of the system is the intersection of the solution spaces of the individual equations. Therefore, the statement follows by applying the corollary above repeatedly to the individual solution spaces.
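The bound can also be checked computationally; the following is a minimal sketch using the sympy library, where the matrix $A$ is an arbitrarily chosen illustration of a homogeneous system with $m = 2$ equations in $n = 5$ variables.

```python
# Minimal sketch: check dim(solution space of A x = 0) >= n - m
# for a homogeneous system with m equations in n variables.
# The matrix A is an arbitrary illustrative example.
from sympy import Matrix

A = Matrix([
    [1, 2, 0, -1, 3],   # m = 2 equations ...
    [0, 1, 1,  4, 0],   # ... in n = 5 variables
])

m, n = A.shape
solution_dim = len(A.nullspace())   # number of basis vectors of the kernel
print(solution_dim, ">=", n - m)    # prints: 3 >= 3
assert solution_dim >= n - m
```

Here the two equations are linearly independent, so the solution space has dimension exactly $n - 2$ and the bound is attained.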