The Rank of a Matrix
Francis J. Narcowich
Department of Mathematics
Texas A&M University
January 2005
1 Rank and Solutions to Linear Systems
The rank of a matrix A is the number of leading entries in a row reduced form
R for A. This also equals the number of nonzero rows in R. For any system
with A as a coefficient matrix, rank[A] is the number of leading variables. Now,
two systems of equations are equivalent if they have exactly the same solution
set. When we discussed the row-reduction algorithm, we also mentioned that
row-equivalent augmented matrices correspond to equivalent systems:
Theorem 1.1 If [A|b] and [A′|b′] are augmented matrices for two linear systems
of equations, and if [A|b] and [A′|b′] are row equivalent, then the corresponding
linear systems are equivalent.
By examining the possible row-reduced matrices corresponding to the aug-
mented matrix, one can use Theorem 1.1 to obtain the following result, which we
state without proof.
Theorem 1.2 Consider the system Ax = b, with coefficient matrix A and aug-
mented matrix [A|b]. As above, the sizes of b, A, and [A|b] are m × 1, m × n, and
m × (n + 1), respectively; in addition, the number of unknowns is n. Below, we
summarize the possibilities for solving the system.
i. Ax = b is inconsistent (i.e., no solution exists) if and only if rank[A] <
rank[A|b].
ii. Ax = b has a unique solution if and only if rank[A] = rank[A|b] = n.
iii. Ax = b has infinitely many solutions if and only if rank[A] = rank[A|b] < n.
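Theorem 1.2's rank test is easy to run numerically. The sketch below, a NumPy addition not part of the original note (the function name `classify` is hypothetical), classifies a system by comparing rank[A], rank[A|b], and n:

```python
import numpy as np

def classify(A, b):
    """Classify Ax = b via the rank criteria of Theorem 1.2."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    n = A.shape[1]                                      # number of unknowns
    rank_A = np.linalg.matrix_rank(A)                   # rank[A]
    rank_Ab = np.linalg.matrix_rank(np.hstack([A, b]))  # rank[A|b]
    if rank_A < rank_Ab:
        return "inconsistent"                           # case (i)
    if rank_A == n:
        return "unique solution"                        # case (ii)
    return "infinitely many solutions"                  # case (iii)
```

For instance, `classify([[1, 2], [3, 1]], [1, -2])` returns `"unique solution"`, matching case (ii).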
To illustrate this theorem, let’s look at the simple systems below.
\[
\begin{cases} x_1 + 2x_2 = 1 \\ 3x_1 + x_2 = -2 \end{cases}
\qquad
\begin{cases} 3x_1 + 2x_2 = 3 \\ -6x_1 - 4x_2 = 0 \end{cases}
\qquad
\begin{cases} 3x_1 + 2x_2 = 3 \\ -6x_1 - 4x_2 = -6 \end{cases}
\]
The augmented matrices for these systems are, respectively,
\[
\left[\begin{array}{cc|c} 1 & 2 & 1 \\ 3 & 1 & -2 \end{array}\right], \quad
\left[\begin{array}{cc|c} 3 & 2 & 3 \\ -6 & -4 & 0 \end{array}\right], \quad
\left[\begin{array}{cc|c} 3 & 2 & 3 \\ -6 & -4 & -6 \end{array}\right].
\]
Applying the row-reduction algorithm yields the row-reduced form of each of these
augmented matrices. The results are, again respectively,
\[
\left[\begin{array}{cc|c} 1 & 0 & -1 \\ 0 & 1 & 1 \end{array}\right], \quad
\left[\begin{array}{cc|c} 1 & \frac{2}{3} & 0 \\ 0 & 0 & 1 \end{array}\right], \quad
\left[\begin{array}{cc|c} 1 & \frac{2}{3} & 1 \\ 0 & 0 & 0 \end{array}\right].
\]
From each of these row-reduced versions of the augmented matrices, one can read
off the rank of the coefficient matrix as well as the rank of the augmented matrix.
Applying Theorem 1.2 to each of these tells us the number of solutions to expect
for each of the corresponding systems. We summarize our findings in the table
below.
System rank[A] rank[A|b] n # of solutions
First 2 2 2 1
Second 1 2 2 0 (inconsistent)
Third 1 1 2 ∞
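The table entries can be reproduced numerically. The following NumPy sketch, an addition rather than part of the original note, computes rank[A] and rank[A|b] for each of the three systems:

```python
import numpy as np

# The three example systems, as (coefficient matrix, right-hand side).
systems = {
    "First":  ([[1, 2], [3, 1]],   [1, -2]),
    "Second": ([[3, 2], [-6, -4]], [3, 0]),
    "Third":  ([[3, 2], [-6, -4]], [3, -6]),
}

ranks = {}
for name, (A, b) in systems.items():
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float).reshape(-1, 1)
    rank_A = np.linalg.matrix_rank(A)                   # rank[A]
    rank_Ab = np.linalg.matrix_rank(np.hstack([A, b]))  # rank[A|b]
    ranks[name] = (int(rank_A), int(rank_Ab))
    print(name, ranks[name])
```

The printed pairs agree with the rank[A] and rank[A|b] columns of the table.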
Homogeneous systems. A homogeneous system is one in which the vector
b = 0. By simply plugging x = 0 into the equation Ax = 0, we see that every
homogeneous system has at least one solution, the trivial solution x = 0. Are
there any others? Theorem 1.2 provides the answer.
Corollary 1.3 Let A be an m × n matrix. A homogeneous system of equations
Ax = 0 will have a unique solution, the trivial solution x = 0, if and only if
rank[A] = n. In all other cases, it will have infinitely many solutions. As a
consequence, if n > m (i.e., if the number of unknowns is larger than the number
of equations), then the system will have infinitely many solutions.
Proof: Since x = 0 is always a solution, case (i) of Theorem 1.2 is eliminated
as a possibility. Therefore, we must always have rank[A] = rank[A|0] ≤ n. By
Theorem 1.2, case (ii), equality will hold if and only if x = 0 is the only solution.
When it does not hold, we are always in case (iii) of Theorem 1.2; there are thus
infinitely many solutions for the system. If n > m, then we need only note that
rank[A] ≤ m < n to see that the system has to have infinitely many solutions. ∎
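To illustrate the last statement of Corollary 1.3, here is a NumPy sketch (an illustrative addition; the matrix chosen is arbitrary) with n > m. One standard numerical way to exhibit a nontrivial solution of Ax = 0 is to read the null space off the SVD:

```python
import numpy as np

# Two equations in three unknowns (n > m), so by Corollary 1.3 the
# homogeneous system Ax = 0 must have infinitely many solutions.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
m, n = A.shape

# The inequality used in the proof: rank[A] <= m < n.
assert np.linalg.matrix_rank(A) <= m < n

# The rows of Vt beyond the rank form an orthonormal basis
# for the null space of A.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
x = Vt[rank]                  # a nontrivial solution of Ax = 0
print(np.allclose(A @ x, 0))  # True
```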
2 Linear Independence and Dependence
A set of k vectors {u1, u2, . . . , uk} is linearly independent (LI) if the equation
\[
\sum_{j=1}^{k} c_j u_j = 0,
\]
where the cj’s are scalars, has only c1 = c2 = · · · = ck = 0 as a solution. Otherwise,
the vectors are linearly dependent (LD). Let’s assume the vectors are all m × 1
column vectors. If they are rows, just transpose them. Now, use the basic matrix
trick (BMT) to put the equation in matrix form:
\[
Ac = \sum_{j=1}^{k} c_j u_j = 0, \quad \text{where } A = [u_1\ u_2\ \cdots\ u_k] \text{ and } c = (c_1\ c_2\ \cdots\ c_k)^T.
\]
The question of whether the vectors are LI or LD is now a question of whether the
homogeneous system Ac = 0 has a nontrivial solution. Combining this observation
with Corollary 1.3 gives us a handy way to check for LI/LD by finding the rank of
a matrix.
Corollary 2.1 A set of k column vectors {u1, u2, . . . , uk} is linearly independent
if and only if the associated matrix A = [u1 u2 · · · uk] has rank[A] = k.
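Corollary 2.1 translates directly into a short numerical test. This NumPy helper (an illustrative sketch; the name `linearly_independent` is hypothetical) stacks the vectors as columns and compares the rank to k:

```python
import numpy as np

def linearly_independent(*vectors):
    """Corollary 2.1: u1, ..., uk are LI iff rank[u1 u2 ... uk] = k."""
    # Stack the vectors as the columns of A = [u1 u2 ... uk].
    A = np.column_stack([np.asarray(v, dtype=float) for v in vectors])
    return bool(np.linalg.matrix_rank(A) == A.shape[1])
```

For example, `linearly_independent([1, 0], [0, 1])` is `True`, while `linearly_independent([1, 2], [2, 4])` is `False`.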
Example 2.2 Determine whether the vectors below are linearly independent or
linearly dependent.
\[
u_1 = \begin{bmatrix} 1 \\ -1 \\ 2 \\ 1 \end{bmatrix}, \quad
u_2 = \begin{bmatrix} 2 \\ 1 \\ 1 \\ -1 \end{bmatrix}, \quad
u_3 = \begin{bmatrix} 0 \\ 1 \\ -1 \\ -1 \end{bmatrix}.
\]
Solution. Form the associated matrix A and perform row reduction on it:
\[
A = \begin{bmatrix} 1 & 2 & 0 \\ -1 & 1 & 1 \\ 2 & 1 & -1 \\ 1 & -1 & -1 \end{bmatrix}
\iff
\begin{bmatrix} 1 & 0 & -\frac{2}{3} \\ 0 & 1 & \frac{1}{3} \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}.
\]
The rank of A is 2 < k = 3, so the vectors are LD. The row-reduced form of
A gives us additional information; namely, we can read off the cj’s for which
\(\sum_{j=1}^{k} c_j u_j = 0\). We have c1 = (2/3)t, c2 = −(1/3)t, and c3 = t. As usual,
t is a parameter.
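As a check on the example, the dependence relation can be verified directly. This NumPy sketch is an addition, not part of the original solution; taking t = 3 clears the fractions and gives c = (2, −1, 3):

```python
import numpy as np

u1 = np.array([1, -1, 2, 1], dtype=float)
u2 = np.array([2, 1, 1, -1], dtype=float)
u3 = np.array([0, 1, -1, -1], dtype=float)

A = np.column_stack([u1, u2, u3])
print(np.linalg.matrix_rank(A))   # 2 < k = 3, so the vectors are LD

# The dependence relation read off from the row-reduced form:
# c1 = (2/3)t, c2 = -(1/3)t, c3 = t.  With t = 3, c = (2, -1, 3).
c = np.array([2.0, -1.0, 3.0])
print(np.allclose(A @ c, 0))      # True
```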