A matrix is defined as a rectangular array of elements (usually the elements are real or complex numbers). An algebra of matrices is developed by defining addition of matrices, multiplication of matrices, multiplication of a matrix by a scalar (real or complex number), differentiation of matrices, etc. The definitions chosen for the above-mentioned operations will be such as to make the calculus of matrices highly applicable. A matrix may be denoted as follows:

$$A = \|a_{ij}\| = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}$$

The matrix $A$ has $m$ rows and $n$ columns; such an array is called an $m \times n$ matrix.

If $m = n$, we say that $A$ is a square matrix of order $n$. If $B = \|b_{ij}\|$ is the matrix of elements $b_{ij}$, then $A$ is said to be equal to $B$, written $A = B$ or $B = A$, if and only if $a_{ij} = b_{ij}$ for the complete range of values of $i$ and $j$.

Two matrices can be compared for equality if and only if they are comparable in the sense that they have the same number of rows and the same number of columns.

The sum of two comparable matrices $A$, $B$ is defined as a new matrix $C$ whose elements $c_{ij}$ are obtained by adding the corresponding elements of $A$ and $B$. Thus

$$C = A + B = \|a_{ij} + b_{ij}\|$$

We note that $A + B = B + A$.
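
For instance, with the illustrative $2 \times 2$ matrices

$$A = \begin{pmatrix} 1 & 3 \\ 0 & 2 \end{pmatrix}, \qquad B = \begin{pmatrix} 4 & -1 \\ 2 & 5 \end{pmatrix}, \qquad A + B = B + A = \begin{pmatrix} 5 & 2 \\ 2 & 7 \end{pmatrix},$$

each element of the sum being obtained by adding the corresponding elements of $A$ and $B$.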

We call $A$ a zero matrix if and only if each element of $A$ is equal to the real number zero.

The product of a matrix $A$ by a number $k$ (real or complex) is defined as the matrix whose elements are each $k$ times those of $A$, that is,

$$kA = Ak = \|k\,a_{ij}\|$$

Every matrix $A$ can be associated with a negative matrix $-A = (-1)A$ such that $A + (-A) = 0$ (zero matrix).

The rule for multiplying a matrix by a scalar $k$ should not be confused with the rule for multiplying a determinant by $k$, for in this latter case the elements of only one row or only one column are multiplied by $k$.
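
To illustrate the distinction, take $k = 3$ and the illustrative matrix $A = \begin{pmatrix} 1 & 2 \\ 0 & 4 \end{pmatrix}$. Multiplying the matrix by $3$ multiplies every element:

$$3A = \begin{pmatrix} 3 & 6 \\ 0 & 12 \end{pmatrix}, \qquad |3A| = 36 = 3^{2}|A|,$$

whereas multiplying the determinant $|A| = 4$ by $3$ multiplies the elements of only one row, giving $\begin{vmatrix} 3 & 6 \\ 0 & 4 \end{vmatrix} = 12 = 3|A|$. In general, for an $n \times n$ matrix, $|kA| = k^{n}|A|$.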

Before defining the product of two matrices let us consider the following sets of linear transformations:

$$y_i = \sum_{j=1}^{n} a_{ij} x_j, \qquad i = 1, 2, \ldots, m \qquad (1)$$

$$x_j = \sum_{k=1}^{p} b_{jk} z_k, \qquad j = 1, 2, \ldots, n \qquad (2)$$

Since the $y$'s depend on the $x$'s, which in turn depend on the $z$'s, we can solve for the $y$'s in terms of the $z$'s. We write this transformation as follows:

$$y_i = \sum_{j=1}^{n} a_{ij} \sum_{k=1}^{p} b_{jk} z_k = \sum_{k=1}^{p} \left( \sum_{j=1}^{n} a_{ij} b_{jk} \right) z_k, \qquad i = 1, 2, \ldots, m \qquad (3)$$

This suggests a method for defining multiplication of the matrices $A = \|a_{ij}\|$, $B = \|b_{jk}\|$.

If $C = AB$, then $C = \|c_{ik}\|$ is defined as the matrix such that

$$c_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk}, \qquad i = 1, 2, \ldots, m; \quad k = 1, 2, \ldots, p$$

Let us note that the number of columns of the matrix $A$ must equal the number of rows of $B$. The matrix $C$ of (3) is an $m \times p$ matrix. In the case of square matrices the definition for multiplication of matrices corresponds to that for multiplication of determinants. This implies that $|AB| = |A|\,|B|$, where $|A|$ denotes the determinant of the set of elements comprising the square matrix $A$.
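
As a numerical illustration with the arbitrarily chosen $2 \times 2$ matrices

$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & 1 \\ 5 & 2 \end{pmatrix}, \qquad AB = \begin{pmatrix} 1\cdot 0 + 2\cdot 5 & 1\cdot 1 + 2\cdot 2 \\ 3\cdot 0 + 4\cdot 5 & 3\cdot 1 + 4\cdot 2 \end{pmatrix} = \begin{pmatrix} 10 & 5 \\ 20 & 11 \end{pmatrix},$$

we have $|AB| = 10 \cdot 11 - 5 \cdot 20 = 10 = (-2)(-5) = |A|\,|B|$. Note also that $BA = \begin{pmatrix} 3 & 4 \\ 11 & 18 \end{pmatrix} \neq AB$, so that matrix multiplication is not, in general, commutative.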

A square matrix $A = \|a_{ij}\|$ is said to be a symmetric matrix if and only if $a_{ij} = a_{ji}$. If $a_{ij} = -a_{ji}$, we say that $A$ is a skew-symmetric matrix. We now exhibit a symmetric matrix $A$ and a skew-symmetric matrix $B$:

$$A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{12} & a_{22} & a_{23} \\ a_{13} & a_{23} & a_{33} \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & b_{12} & b_{13} \\ -b_{12} & 0 & b_{23} \\ -b_{13} & -b_{23} & 0 \end{pmatrix}$$

Note that the diagonal elements of a skew-symmetric matrix are necessarily zero, since $b_{ii} = -b_{ii}$.

We let the reader verify that $AA^{T}$ is a symmetric matrix if $A$ is a square matrix. Let the reader first prove that

$$(AB)^{T} = B^{T}A^{T},$$

where $A^{T}$, the transpose of $A$, is the matrix obtained from $A$ by interchanging its rows and columns, so that the element of $A^{T}$ in the $i$th row and $j$th column is $a_{ji}$.
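
As a check on this rule with the same illustrative matrices $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, $B = \begin{pmatrix} 0 & 1 \\ 5 & 2 \end{pmatrix}$ used above,

$$(AB)^{T} = \begin{pmatrix} 10 & 20 \\ 5 & 11 \end{pmatrix} = \begin{pmatrix} 0 & 5 \\ 1 & 2 \end{pmatrix}\begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix} = B^{T}A^{T},$$

and $AA^{T} = \begin{pmatrix} 5 & 11 \\ 11 & 25 \end{pmatrix}$ is indeed symmetric.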

It is easily seen that $\tfrac{1}{2}\left(A + A^{T}\right)$ is a symmetric matrix. Any square matrix $A$ can obviously be written as

$$A = \frac{A + A^{T}}{2} + \frac{A - A^{T}}{2}$$

The second term, $\tfrac{1}{2}\left(A - A^{T}\right)$, is skew-symmetric. Hence every square matrix can be written as the sum of a symmetric and a skew-symmetric matrix.
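
For example, with the illustrative matrix $A = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}$, the decomposition reads

$$A = \frac{A + A^{T}}{2} + \frac{A - A^{T}}{2} = \begin{pmatrix} 1 & 3 \\ 3 & 3 \end{pmatrix} + \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix},$$

the first matrix being symmetric and the second skew-symmetric.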

References

[1] Lass, Harry. Elements of Pure and Applied Mathematics. New York: McGraw-Hill, 1957.

This entry is a derivative of the public domain work [1].