Linear algebra/Matrices

Matrix

In mathematics, a matrix (plural matrices) is a rectangular array of numbers, symbols, or expressions, arranged in rows and columns. An m × n matrix A has m rows and n columns:

$$A = \begin{bmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,n} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m,1} & a_{m,2} & \cdots & a_{m,n} \end{bmatrix}$$

The individual items in an m × n matrix A, often denoted by $a_{i,j}$, where $1 \le i \le m$ and $1 \le j \le n$, are called its elements or entries.

Provided that they have the same size (each matrix has the same number of rows and the same number of columns as the other), two matrices can be added or subtracted element by element (see Conformable matrix). The rule for matrix multiplication, however, is that two matrices can be multiplied only when the number of columns in the first equals the number of rows in the second (i.e., the inner dimensions are the same, n for $A_{m,n} \times B_{n,p}$). Any matrix can be multiplied element-wise by a scalar from its associated field.

A major application of matrices is to represent linear transformations, that is, generalizations of linear functions such as f(x) = 4x. For example, the rotation of vectors in three-dimensional space is a linear transformation, which can be represented by a rotation matrix R: if v is a column vector (a matrix with only one column) describing the position of a point in space, the product Rv is a column vector describing the position of that point after a rotation. The product of two transformation matrices is a matrix that represents the composition of two linear transformations. Another application of matrices is in the solution of systems of linear equations. If the matrix is square, it is possible to deduce some of its properties by computing its determinant. For example, a square matrix has an inverse if and only if its determinant is not zero. Insight into the geometry of a linear transformation is obtainable (along with other information) from the matrix's eigenvalues and eigenvectors.
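As a small illustration of the rotation example above, the following sketch builds a rotation matrix R for a rotation about the z-axis and applies it to a column vector v. Python with numpy is assumed here purely for illustration, and the helper name rotation_z is hypothetical; the last check uses the fact that two rotations about the same axis compose into a single rotation by the summed angle.

    import numpy as np

    def rotation_z(theta):
        """Rotation matrix R for a rotation by angle theta (radians) about the z-axis."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    v = np.array([1.0, 0.0, 0.0])      # column vector: position of a point in space
    R = rotation_z(np.pi / 2)          # rotate 90 degrees about the z-axis
    print(R @ v)                       # approximately [0, 1, 0]: the rotated position

    # The product of two rotation matrices represents the composed rotation.
    print(np.allclose(rotation_z(np.pi / 4) @ rotation_z(np.pi / 4),
                      rotation_z(np.pi / 2)))   # True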

Types of Matrices

Common types of matrices that we encounter in finite elements are:


  • a row vector that has one row and n columns.
  • a column vector that has n rows and one column.
  • a square matrix that has an equal number of rows and columns.
  • a diagonal matrix, which is a square matrix with only the diagonal elements ($a_{ii}$) nonzero.
  • the identity matrix (I), which is a diagonal matrix with each of its nonzero elements ($a_{ii}$) equal to 1.
  • a symmetric matrix, which is a square matrix with elements such that $a_{ij} = a_{ji}$.
  • a skew-symmetric matrix, which is a square matrix with elements such that $a_{ij} = -a_{ji}$.

Note that the diagonal elements of a skew-symmetric matrix have to be zero: $a_{ii} = -a_{ii} \Rightarrow a_{ii} = 0$.
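As a quick numerical check of the last two definitions and of the note above, here is a minimal sketch assuming Python with numpy; the matrices S and K are arbitrary examples chosen for illustration.

    import numpy as np

    S = np.array([[1.0, 2.0],
                  [2.0, 3.0]])      # symmetric: a_ij = a_ji
    K = np.array([[0.0, 5.0],
                  [-5.0, 0.0]])     # skew-symmetric: a_ij = -a_ji

    print(np.array_equal(S, S.T))   # True
    print(np.array_equal(K, -K.T))  # True
    print(np.diag(K))               # [0. 0.]: the diagonal of a skew-symmetric matrix is zero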

Matrix Operations

Determinant of a matrix

The determinant of a matrix is defined only for square matrices.

For a 2 × 2 matrix A, we have

$$\det(A) = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix} = a_{11} a_{22} - a_{12} a_{21}$$

For a 3 × 3 matrix, the determinant is calculated by expanding into minors as

$$\det(A) = \begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix}
= a_{11} \begin{vmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{vmatrix}
- a_{12} \begin{vmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{vmatrix}
+ a_{13} \begin{vmatrix} a_{21} & a_{22} \\ a_{31} & a_{32} \end{vmatrix}$$

In short, the determinant of a n × n matrix A has the value

$$\det(A) = \sum_{j=1}^{n} (-1)^{i+j}\, a_{ij}\, M_{ij}$$

where $M_{ij}$ is the determinant of the submatrix of A formed by eliminating row i and column j from A, and the expansion may be taken along any fixed row i.
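The cofactor expansion above can be implemented directly and compared against a library routine. A minimal sketch, assuming Python with numpy; det_cofactor is a hypothetical helper written for this page that expands along the first row (i = 1).

    import numpy as np

    def det_cofactor(A):
        """Determinant by cofactor expansion along the first row (illustration only)."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        if n == 1:
            return A[0, 0]
        total = 0.0
        for j in range(n):
            # Minor M_{1j}: delete row 0 and column j.
            minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
            total += (-1) ** j * A[0, j] * det_cofactor(minor)
        return total

    A = [[1.0, 2.0, 3.0],
         [4.0, 5.0, 6.0],
         [7.0, 8.0, 10.0]]
    print(det_cofactor(A))       # -3.0
    print(np.linalg.det(A))      # -3.0 (up to floating-point rounding)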

Some useful identities involving the determinant are given below.


  • If A is a n × n matrix, then $\det(A^T) = \det(A)$.
  • If c is a constant and A is a n × n matrix, then $\det(cA) = c^n \det(A)$.
  • If A and B are two n × n matrices, then $\det(AB) = \det(A)\,\det(B)$.

If you think you understand determinants, take the quiz.
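The three identities above are also easy to check numerically. A minimal sketch, assuming Python with numpy and randomly generated 4 × 4 matrices:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 4
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    c = 2.5

    print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))             # det(A^T) = det(A)
    print(np.isclose(np.linalg.det(c * A), c**n * np.linalg.det(A)))    # det(cA) = c^n det(A)
    print(np.isclose(np.linalg.det(A @ B),
                     np.linalg.det(A) * np.linalg.det(B)))              # det(AB) = det(A) det(B)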

Matrix addition

Let A and B be two m × n matrices with components $a_{ij}$ and $b_{ij}$, respectively. Then

$$A + B = [a_{ij} + b_{ij}]$$

Matrix Multiplication

Multiplication by a scalar

Let A be a m × n matrix with components $a_{ij}$ and let $\lambda$ be a scalar quantity. Then,

$$\lambda A = [\lambda\, a_{ij}]$$

Multiplication of matrices

Let A be a m × n matrix with components $a_{ij}$. Let B be a p × q matrix with components $b_{ij}$.

The product AB is defined only if n = p. The matrix AB is a m × q matrix with components $c_{ij}$. Thus,

$$c_{ij} = \sum_{k=1}^{n} a_{ik}\, b_{kj}$$

Similarly, the product BA is defined only if q = m. The matrix BA is a p × n matrix with components $d_{ij}$. We have

$$d_{ij} = \sum_{k=1}^{q} b_{ik}\, a_{kj}$$

Clearly, $AB \ne BA$ in general, i.e., the matrix product is not commutative.
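A short numerical illustration of the operations defined so far (addition, multiplication by a scalar, and the two products), assuming Python with numpy and two arbitrarily chosen 2 × 2 matrices:

    import numpy as np

    # Arbitrary 2 x 2 matrices chosen for illustration.
    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    B = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

    print(A + B)       # element-by-element sum
    print(2.5 * A)     # multiplication by a scalar
    print(A @ B)       # the product AB
    print(B @ A)       # the product BA
    print(np.array_equal(A @ B, B @ A))   # False: the product is not commutative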

However, matrix multiplication is distributive. That means

$$A(B + C) = AB + AC \qquad \text{and} \qquad (A + B)C = AC + BC$$

The product is also associative. That means

$$(AB)C = A(BC)$$

Transpose of a matrix

Let A be a m × n matrix with components $a_{ij}$. Then the transpose of the matrix is defined as the n × m matrix $A^T$ with components $a_{ji}$. That is,

$$A^T = [a_{ij}]^T = [a_{ji}]$$

An important identity involving the transpose of matrices is

$$(AB)^T = B^T A^T$$
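A quick check of the transpose and of the identity above, assuming Python with numpy and arbitrary rectangular matrices:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((2, 3))    # a 2 x 3 matrix
    B = rng.standard_normal((3, 4))    # a 3 x 4 matrix

    print(A.T.shape)                           # (3, 2): the transpose swaps rows and columns
    print(np.allclose((A @ B).T, B.T @ A.T))   # True: (AB)^T = B^T A^T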

Inverse of a matrix

Let A be a n × n matrix. The inverse of A is denoted by $A^{-1}$ and is defined such that

$$A A^{-1} = A^{-1} A = I$$

where I is the n × n identity matrix.

The inverse exists only if $\det(A) \ne 0$. A singular matrix does not have an inverse.

An important identity involving the inverse is

$$(AB)^{-1} = B^{-1} A^{-1}$$

since this leads to: $(AB)^{-1}(AB) = B^{-1} A^{-1} A B = B^{-1} I B = B^{-1} B = I$.
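This identity, and the defining property of the inverse, can be verified numerically. A minimal sketch, assuming Python with numpy and randomly generated 3 × 3 matrices (which are invertible with probability one):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))

    print(np.allclose(A @ np.linalg.inv(A), np.eye(3)))      # A A^{-1} = I
    print(np.allclose(np.linalg.inv(A @ B),
                      np.linalg.inv(B) @ np.linalg.inv(A)))  # (AB)^{-1} = B^{-1} A^{-1}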

Some other identities involving the inverse of a matrix are given below.


  • The determinant of a matrix is equal to the multiplicative inverse of the determinant of its inverse: $\det(A) = \dfrac{1}{\det(A^{-1})}$.
  • The determinant of a similarity transformation of a matrix is equal to the determinant of the original matrix: $\det(B A B^{-1}) = \det(A)$.

We usually use numerical methods such as Gaussian elimination to compute the inverse of a matrix.
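As a concrete illustration of that approach, here is a minimal Gauss-Jordan sketch, assuming Python with numpy; inverse_gauss_jordan is a hypothetical helper written for this page, not a routine from any library. Partial pivoting is included only to keep the elimination numerically stable; in practice one would simply call np.linalg.inv or, better, solve the relevant linear system directly.

    import numpy as np

    def inverse_gauss_jordan(A):
        """Invert a square matrix by Gauss-Jordan elimination with partial pivoting."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        # Augment A with the identity matrix: [A | I].
        aug = np.hstack([A, np.eye(n)])
        for col in range(n):
            # Partial pivoting: bring the largest remaining pivot into place.
            pivot = col + np.argmax(np.abs(aug[col:, col]))
            if np.isclose(aug[pivot, col], 0.0):
                raise ValueError("matrix is singular")
            aug[[col, pivot]] = aug[[pivot, col]]
            aug[col] /= aug[col, col]
            # Eliminate the current column from every other row.
            for row in range(n):
                if row != col:
                    aug[row] -= aug[row, col] * aug[col]
        return aug[:, n:]    # the right half is now A^{-1}

    A = np.array([[4.0, 7.0],
                  [2.0, 6.0]])
    print(inverse_gauss_jordan(A))    # [[ 0.6 -0.7], [-0.2  0.4]]
    print(np.linalg.inv(A))           # same result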

Eigenvalues and eigenvectors

A thorough explanation of this material can be found at Eigenvalue, eigenvector and eigenspace. Recall that a nonzero vector v is an eigenvector of a square matrix A if $A v = \lambda v$ for some scalar $\lambda$, called the corresponding eigenvalue. However, for further study, let us consider the following examples:

  • Let : 

Which vector is an eigenvector for   ?

We have   , and  

Thus,   is an eigenvector.

  • Is   an eigenvector for   ?

We have that since   ,   is not an eigenvector for  
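Since the specific matrix and candidate vectors from the examples above are not reproduced here, the sketch below uses an arbitrary illustrative matrix (an assumption, not the one intended above) to show how such a check is carried out, assuming Python with numpy:

    import numpy as np

    # Illustrative matrix and candidate vectors (not the ones from the original examples).
    A = np.array([[2.0, 0.0],
                  [0.0, 3.0]])
    v1 = np.array([1.0, 0.0])
    v2 = np.array([1.0, 1.0])

    # v is an eigenvector of A if A v is a scalar multiple of v.
    print(A @ v1)    # [2. 0.] = 2 * v1, so v1 is an eigenvector with eigenvalue 2
    print(A @ v2)    # [2. 3.], not a multiple of v2, so v2 is not an eigenvector

    # The eigenvalues and eigenvectors can also be computed directly.
    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)    # [2. 3.]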
