The rank of a matrix is the number of linearly independent rows (or columns) in it; thus, $\mathrm{rank}(C) \leq \min\{M, N\}$. A square $r \times r$ matrix all of whose off-diagonal entries are zero is called a diagonal matrix; its rank is equal to the number of non-zero diagonal entries. If all $r$ diagonal entries of such a diagonal matrix are $1$, it is called the identity matrix of dimension $r$ and represented by $I_r$.
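These definitions are easy to check numerically. The following minimal sketch (assuming NumPy, which the text itself does not use; the matrices are illustrative) computes the rank of a matrix with one linearly dependent row, of a diagonal matrix, and of an identity matrix:

```python
import numpy as np

# A 3 x 4 matrix whose third row is the sum of the first two, so only
# two rows are linearly independent: rank = 2 <= min(3, 4).
C = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0],
              [1.0, 1.0, 3.0, 4.0]])
print(np.linalg.matrix_rank(C))   # 2

# The rank of a diagonal matrix equals its number of non-zero
# diagonal entries.
D = np.diag([5.0, 3.0, 0.0])
print(np.linalg.matrix_rank(D))   # 2

# The identity matrix I_3 is a diagonal matrix with all diagonal
# entries equal to 1; its rank is 3.
I3 = np.eye(3)
print(np.linalg.matrix_rank(I3))  # 3
```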
For a square $M \times M$ matrix $C$ and a vector $\vec{x}$ that is not all zeros, the values of $\lambda$ satisfying

$$C\vec{x} = \lambda\vec{x} \qquad (213)$$

are called the eigenvalues of $C$; a vector $\vec{x}$ satisfying Equation 213 for an eigenvalue $\lambda$ is a corresponding eigenvector. The eigenvalues of a matrix are found by solving the characteristic equation, which is obtained by rewriting Equation 213 in the form $(C - \lambda I_M)\vec{x} = 0$. The eigenvalues of $C$ are then the solutions of $\left|C - \lambda I_M\right| = 0$, where $\left|S\right|$ denotes the determinant of a square matrix $S$.
The equation $\left|C - \lambda I_M\right| = 0$ is an $M$th order polynomial equation in $\lambda$ and can have at most $M$ roots, which are the eigenvalues of $C$. These eigenvalues can in general be complex, even if all entries of $C$ are real.
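To make the characteristic-equation recipe concrete, here is a small sketch (again assuming NumPy; the matrices are illustrative, not from the text) that recovers eigenvalues as polynomial roots, and that exhibits a real matrix with complex eigenvalues:

```python
import numpy as np

# For this matrix the characteristic polynomial |C - lambda*I| is
# (2 - lambda)(3 - lambda) = lambda^2 - 5*lambda + 6, with roots 3 and 2.
C = np.array([[2.0, 0.0],
              [1.0, 3.0]])
print(np.roots([1.0, -5.0, 6.0]))  # [3. 2.]
print(np.linalg.eigvals(C))        # the same eigenvalues

# A real matrix can nonetheless have complex eigenvalues: a 90-degree
# rotation matrix has eigenvalues +i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(R))        # [0.+1.j 0.-1.j]
```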
We now examine some further properties of eigenvalues and eigenvectors, to set up the central idea of singular value decompositions in Section 18.2 below. First, we look at the relationship between matrix-vector multiplication and eigenvalues.
Worked example.
Consider the matrix

$$S = \begin{pmatrix} 30 & 0 & 0 \\ 0 & 20 & 0 \\ 0 & 0 & 1 \end{pmatrix} \qquad (215)$$

Clearly the matrix has rank 3, and has 3 non-zero eigenvalues $\lambda_1 = 30$, $\lambda_2 = 20$ and $\lambda_3 = 1$, with the three corresponding eigenvectors

$$\vec{x}_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \quad \vec{x}_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad \vec{x}_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}. \qquad (216)$$

For each of the eigenvectors, multiplication by $S$ acts as if we were multiplying the eigenvector by a multiple of the identity matrix; the multiple is different for each eigenvector. Now, consider an arbitrary vector, such as $\vec{v} = (2, 4, 6)^T$. We can always express $\vec{v}$ as a linear combination of the three eigenvectors of $S$; in the current example we have

$$\vec{v} = \begin{pmatrix} 2 \\ 4 \\ 6 \end{pmatrix} = 2\vec{x}_1 + 4\vec{x}_2 + 6\vec{x}_3. \qquad (217)$$

Suppose we multiply $\vec{v}$ by $S$:

$$\begin{aligned} S\vec{v} &= S(2\vec{x}_1 + 4\vec{x}_2 + 6\vec{x}_3) \\ &= 2S\vec{x}_1 + 4S\vec{x}_2 + 6S\vec{x}_3 \\ &= 2\lambda_1\vec{x}_1 + 4\lambda_2\vec{x}_2 + 6\lambda_3\vec{x}_3 \\ &= 60\vec{x}_1 + 80\vec{x}_2 + 6\vec{x}_3. \end{aligned} \qquad (221)$$
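The arithmetic of the example can be checked directly; a quick sketch, assuming NumPy:

```python
import numpy as np

# The matrix S of the worked example and the arbitrary vector v.
S = np.diag([30.0, 20.0, 1.0])
v = np.array([2.0, 4.0, 6.0])

# The standard basis vectors are the eigenvectors of S, so multiplying
# v by S scales each coordinate by the corresponding eigenvalue:
# S v = 60*x1 + 80*x2 + 6*x3.
print(S @ v)  # [60. 80.  6.]
```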
Example 18.1 shows that even though $\vec{v}$ is an arbitrary vector, the effect of multiplication by $S$ is determined by the eigenvalues and eigenvectors of $S$. Furthermore, it is intuitively apparent from Equation 221 that the product $S\vec{v}$ is relatively unaffected by terms arising from the small eigenvalues of $S$; in our example, since $\lambda_3 = 1$, the contribution of the third term on the right hand side of Equation 221 is small. In fact, if we were to completely ignore the contribution in Equation 221 from the third eigenvector corresponding to $\lambda_3 = 1$, then the product $S\vec{v}$ would be computed to be $(60, 80, 0)^T$ rather than the correct product, which is $(60, 80, 6)^T$; these two vectors are relatively close to each other by any of various metrics one could apply (such as the length of their vector difference).
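The claim that the two products are close can be quantified with the same tools; a sketch, assuming NumPy:

```python
import numpy as np

S = np.diag([30.0, 20.0, 1.0])
v = np.array([2.0, 4.0, 6.0])

exact = S @ v                           # [60. 80. 6.]

# Ignore the contribution of the smallest eigenvalue (lambda_3 = 1)
# by zeroing it out before multiplying.
S_truncated = np.diag([30.0, 20.0, 0.0])
approx = S_truncated @ v                # [60. 80. 0.]

# The length of the vector difference is 6, small relative to the
# length of the exact product (about 100.2).
print(np.linalg.norm(exact - approx))   # 6.0
print(np.linalg.norm(exact))            # 100.17...
```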
This suggests that the effect of small eigenvalues (and their eigenvectors) on a matrix-vector product is small. We will carry forward this intuition when studying matrix decompositions and low-rank approximations in Section 18.2. Before doing so, we examine the eigenvectors and eigenvalues of special forms of matrices that will be of particular interest to us.
For a symmetric matrix $S$, the eigenvectors corresponding to distinct eigenvalues are orthogonal. Further, if $S$ is both real and symmetric, the eigenvalues are all real.
Worked example.
Consider the real, symmetric matrix

$$S = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}. \qquad (222)$$
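The two properties just stated, real eigenvalues and orthogonal eigenvectors, can be checked numerically; a minimal sketch, assuming NumPy:

```python
import numpy as np

# A small real, symmetric matrix.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's routine for symmetric (Hermitian) matrices; it
# returns real eigenvalues in ascending order, with orthonormal
# eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eigh(S)
print(eigenvalues)  # [1. 3.] (all real)

# Eigenvectors for the distinct eigenvalues 1 and 3 are orthogonal:
# their dot product is (numerically) zero.
print(eigenvectors[:, 0] @ eigenvectors[:, 1])  # 0.0
```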