From A First Course in Linear Algebra
Version 2.90
© 2004.
Licensed under the GNU Free Documentation License.
http://linear.ups.edu/
Summary: Square matrix of size 5. Singular, with nullity 2. Two distinct eigenvalues, each of “high” multiplicity.
A matrix:
Matrix brought to reduced row-echelon form:
Analysis of the row-reduced matrix (Notation RREFA):
Is the matrix (as a coefficient matrix) nonsingular or singular? (Theorem NMRRI) At the same time, examine the size of the set
F
above. Notice that this property does not apply to matrices that are not
square.
Singular.
This is the null space of the matrix. The set of vectors used in the
span construction is a linearly independent set of column vectors that
spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve
the homogeneous system with this matrix as the coefficient matrix and
write the solutions in vector form (Theorem VFSLS) to see these vectors
arise.
\left\langle\left\{\left[\array{ -1\cr 2\cr -2\cr 1\cr 0 }\right],\ \left[\array{ 2\cr -2\cr 1\cr 0\cr 1 }\right]\right\}\right\rangle
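As an informal check, sketched here with NumPy (an addition, not part of the original text), we can confirm that the two vectors in this span form a linearly independent set, as Theorem BNS promises:

```python
import numpy as np

# Null space basis vectors, copied from the span above.
n1 = np.array([-1, 2, -2, 1, 0])
n2 = np.array([2, -2, 1, 0, 1])

# A set of vectors is linearly independent exactly when the matrix
# having them as columns has rank equal to the number of vectors.
rank = np.linalg.matrix_rank(np.column_stack([n1, n2]))
print(rank)  # 2, so the set is linearly independent
```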
Column space of the matrix, expressed as the span of a set of linearly independent
vectors that are also columns of the matrix. These columns have indices that form
the set D
above. (Theorem BCS)
\left\langle\left\{\left[\array{ -2\cr -6\cr 10\cr -7\cr -4 }\right],\ \left[\array{ -1\cr -5\cr 7\cr -5\cr -3 }\right],\ \left[\array{ -2\cr -4\cr 7\cr -6\cr -4 }\right]\right\}\right\rangle
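A quick sanity check, sketched with NumPy (not part of the original text), confirms that these three columns are linearly independent and so form a basis of the column space:

```python
import numpy as np

# The three columns of the matrix listed in the span above.
C = np.column_stack([[-2, -6, 10, -7, -4],
                     [-1, -5, 7, -5, -3],
                     [-2, -4, 7, -6, -4]])

# Rank 3 means the three columns form a linearly independent set,
# so they are a basis for the 3-dimensional column space.
rank = np.linalg.matrix_rank(C)
print(rank)  # 3
```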
The column space of the matrix, as it arises from the extended echelon form of the matrix.
The matrix L
is computed as described in Definition EEF. This is followed by the column space
described by a set of linearly independent vectors that span the null space of
L,
computed according to Theorem FS and Theorem BNS. When
r = m, the matrix
L has no rows and the
column space is all of {ℂ}^{m}.
L = \left[\array{ 1&0&-2&-6&5\cr 0&1&4&10&-9 }\right]
\left\langle\left\{\left[\array{ -5\cr 9\cr 0\cr 0\cr 1 }\right],\ \left[\array{ 6\cr -10\cr 0\cr 1\cr 0 }\right],\ \left[\array{ 2\cr -4\cr 1\cr 0\cr 0 }\right]\right\}\right\rangle
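Per Theorem FS, every vector in this span should lie in the null space of L. A short NumPy sketch (an addition, not from the text) verifies Lv = 0 for each spanning vector:

```python
import numpy as np

# The matrix L from Definition EEF, as given above.
L = np.array([[1, 0, -2, -6, 5],
              [0, 1, 4, 10, -9]])

# The three spanning vectors from the span above.
vectors = [np.array([-5, 9, 0, 0, 1]),
           np.array([6, -10, 0, 1, 0]),
           np.array([2, -4, 1, 0, 0])]

# Theorem FS identifies the column space with the null space of L,
# so each vector must satisfy L v = 0.
checks = [bool(np.all(L @ v == 0)) for v in vectors]
print(checks)  # [True, True, True]
```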
Column space of the matrix, expressed as the span of a set of linearly
independent vectors. These vectors are computed by row-reducing the
transpose of the matrix into reduced row-echelon form, tossing out the
zero rows, and writing the remaining nonzero rows as column vectors. By
Theorem CSRST and Theorem BRS, and in the style of Example CSROI,
this yields a linearly independent set of vectors that spans the column space.
\left\langle\left\{\left[\array{ 1\cr 0\cr 0\cr {9\over 4}\cr {5\over 2} }\right],\ \left[\array{ 0\cr 1\cr 0\cr {5\over 4}\cr {3\over 2} }\right],\ \left[\array{ 0\cr 0\cr 1\cr {1\over 2}\cr 1 }\right]\right\}\right\rangle
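Since this span and the previous one both describe the column space, they should generate the same 3-dimensional subspace. A NumPy sketch (added here, not part of the text) checks that stacking both spanning sets does not increase the rank beyond 3:

```python
import numpy as np

# Spanning set obtained from the extended echelon form (Theorem FS).
fs_basis = np.array([[-5, 9, 0, 0, 1],
                     [6, -10, 0, 1, 0],
                     [2, -4, 1, 0, 0]], dtype=float)

# Spanning set obtained by row-reducing the transpose (Theorem CSRST).
rref_basis = np.array([[1, 0, 0, 9/4, 5/2],
                       [0, 1, 0, 5/4, 3/2],
                       [0, 0, 1, 1/2, 1]])

# Each set has rank 3; if their union still has rank 3,
# the two sets span the same 3-dimensional subspace.
combined_rank = np.linalg.matrix_rank(np.vstack([fs_basis, rref_basis]))
print(combined_rank)  # 3
```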
Row space of the matrix, expressed as a span of a set of linearly independent
vectors, obtained from the nonzero rows of the equivalent matrix in reduced
row-echelon form. (Theorem BRS)
\left\langle\left\{\left[\array{ 1\cr 0\cr 0\cr 1\cr -2 }\right],\ \left[\array{ 0\cr 1\cr 0\cr -2\cr 2 }\right],\ \left[\array{ 0\cr 0\cr 1\cr 2\cr -1 }\right]\right\}\right\rangle
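For this real matrix, every row space vector is orthogonal to every null space vector, since Ax = 0 says exactly that each row of A has zero dot product with x. A NumPy sketch (an addition, not from the text) confirms this for the two bases listed on this page:

```python
import numpy as np

# Basis for the row space (nonzero rows of the reduced row-echelon form).
rows = np.array([[1, 0, 0, 1, -2],
                 [0, 1, 0, -2, 2],
                 [0, 0, 1, 2, -1]])

# Basis for the null space, from earlier on this page.
nulls = np.array([[-1, 2, -2, 1, 0],
                  [2, -2, 1, 0, 1]])

# Every pairwise dot product should vanish.
dots = rows @ nulls.T
print(bool(np.all(dots == 0)))  # True
```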
Inverse matrix, if it exists. The inverse is not defined for matrices that are not
square, and a square matrix must be nonsingular to have an inverse.
(Definition MI, Theorem NI)
This matrix is singular, so it has no inverse.
Subspace dimensions associated with the matrix. (Definition NOM, Definition ROM) Here the rank is 3 and the nullity is 2; verify Theorem RPNC: their sum, 3 + 2 = 5, equals the number of columns.
Determinant of the matrix, which is only defined for square matrices. The
matrix is nonsingular if and only if the determinant is nonzero (Theorem SMZD).
The determinant is also the product of all the eigenvalues, counted with
algebraic multiplicity.
\text{Determinant } =\ 0
Eigenvalues, and bases for eigenspaces. (Definition EEM, Definition EM)
Geometric and algebraic multiplicities. (Definition GME, Definition AME)
Diagonalizable? (Definition DZM)
Yes, the eigenspaces are full (geometric multiplicity equals algebraic multiplicity for each eigenvalue), Theorem DMFE.
The diagonalization. (Theorem DC)