In this section we completely characterize when a linear system of equations has solutions, and we do so using the notion of rank. Roughly speaking, the rank of a linear system is the number of leading coefficients in the reduced row echelon form of the augmented matrix of the given linear system.

## Row Equivalence

Recall a system of linear equations is called consistent if it has at least one solution and is called inconsistent if it has no solutions. Our first goal will be to show the notion of rank is well-defined; that is, we wish to show that every matrix has a unique reduced row echelon form. Thus the rank of a linear system will be a unique number.

**Exercise**. Find a $2\times 3$ linear system whose augmented matrix has two different row echelon forms.

**Definition**. If a matrix $A$ can be obtained from another matrix $B$ by a finite number of elementary row operations, then we say $B$ is **row equivalent** to $A.$

**Theorem**. Every matrix is row equivalent to a matrix in row echelon form.

**Proof**. Let $A$ be an $m\times n$ nonzero matrix, with entries $a_{ij}$, say $$ \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} $$ Either every entry in the first column of $A$ is zero or not. If every entry is zero, then the first column of $A$ already satisfies the conditions of row echelon form and we pass to the next column. Otherwise, let $i$ be the least index such that $a_{i1}$ is nonzero. Interchanging rows $1$ and $i$ (if needed), we obtain a matrix whose first column has a nonzero first entry. Now we have a matrix of the following form, where the original first row occupies the $i$-th position. $$ \begin{bmatrix} a_{i1} & a_{i2} & \cdots & a_{in} \\ a_{21} & a_{22} & \cdots & a_{2n}\\ & & \vdots \\ a_{11} & a_{12} & \cdots & a_{1n} \\ & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} $$ Next we multiply the first row by the multiplicative inverse of its leading entry $a_{i1}$, obtaining a leading $1.$ Then, for each remaining row $k>1$ whose first entry $a_{k1}$ is nonzero, we add $-a_{k1}$ times the first row to the $k$-th row. These steps lead to all zero entries in the first column below the leading coefficient. Our matrix is now in the form $$ \begin{bmatrix} 1 & a'_{12} & \cdots & a'_{1n} \\ 0 & a'_{22} & \cdots & a'_{2n}\\ & & \vdots \\ 0 & a'_{m2} & \cdots & a'_{mn} \end{bmatrix} $$ where the $a'_{ij}$ are the scalars obtained from completing the row operations. We repeat this process on the remaining rows and columns, noting that the subsequent row operations do not change the fact that the previously handled columns continue to satisfy the conditions of row echelon form.
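The procedure in this proof is an algorithm, and it can be sketched directly as code. The following Python function is a minimal version of the pivot-swap-scale-clear steps over the rationals; the function name `row_echelon` and the sample matrix are our own illustrative choices, not from the text.

```python
from fractions import Fraction

def row_echelon(rows):
    """Reduce a matrix (given as a list of rows) to row echelon form
    with leading 1s, following the proof: find a nonzero pivot entry,
    swap it up, scale its row, and clear the entries below it."""
    A = [[Fraction(x) for x in row] for row in rows]
    m, n = len(A), len(A[0]) if A else 0
    r = 0  # row where the next leading 1 will go
    for c in range(n):
        # least row index at or below r with a nonzero entry in column c
        pivot = next((i for i in range(r, m) if A[i][c] != 0), None)
        if pivot is None:
            continue  # column is zero below row r: pass to the next column
        A[r], A[pivot] = A[pivot], A[r]           # interchange rows
        A[r] = [x / A[r][c] for x in A[r]]        # scale so the pivot is 1
        for k in range(r + 1, m):                 # clear below the pivot
            factor = A[k][c]
            A[k] = [a - factor * b for a, b in zip(A[k], A[r])]
        r += 1
    return A

# Example: reduce a 3x3 matrix; the nonzero rows carry the leading 1s.
E = row_echelon([[0, 2, 4], [1, 1, 1], [2, 2, 2]])
```

Counting the nonzero rows of the result gives the number of leading coefficients produced by the reduction.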

**Theorem**. Every matrix is row equivalent to a unique matrix in reduced row echelon form.

**Proof**. Let $A$ be an $m\times n$ matrix. We apply mathematical induction on $n$ for an arbitrary $m.$ Let $n=1.$ Now $A$ is just a matrix with one column and is row equivalent to exactly one of the following matrices. $$ \begin{bmatrix}1\\ 0\\ \vdots\\ 0\end{bmatrix} \quad \text{or} \quad \begin{bmatrix}0 \\ 0 \\ \vdots\\ 0\end{bmatrix} $$ So the case for $n=1$ is clear. Now assume $n>1$, assume the claim holds for matrices with $n-1$ columns, and let $A$ and $A'$ denote the following matrices. $$ A= \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \quad \text{and} \quad A'= \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1,n-1} \\ a_{21} & a_{22} & \cdots & a_{2,n-1} \\ & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{m,n-1} \\ \end{bmatrix} $$ Notice $A'$ is just $A$ with the $n$-th column deleted. By the way $A'$ is defined, any sequence of elementary row operations that takes $A$ into reduced row echelon form also takes $A'$ into reduced row echelon form. By the induction hypothesis, any two matrices $B$ and $C$ that are reduced row echelon forms of $A$ can differ only in the $n$-th column. Assume, for a contradiction, that $B\neq C.$ Then there exists an integer $j$ such that the $j$-th row of $B$ is not equal to the $j$-th row of $C$, and these rows differ only in their $n$-th entries, so $b_{jn}\neq c_{jn}.$ Since $B$ and $C$ are both row equivalent to $A$, the homogeneous systems $B{x}={0}$ and $C{x}={0}$ have the same solutions. Subtracting the $j$-th equation of one system from that of the other shows $(b_{jn}-c_{jn})x_n=0$, and hence $x_n=0$, for every common solution ${x}.$ So $x_n$ is a leading variable, and the $n$-th column of each of $B$ and $C$ contains a leading $1.$ Since $B$ and $C$ agree in their first $n-1$ columns, they have the same number $r$ of leading $1$'s among those columns, so in each matrix the leading $1$ of the $n$-th column lies in row $r+1.$ In reduced row echelon form, a column containing a leading $1$ is zero in every other entry, so the $n$-th columns of $B$ and $C$ are equal, contradicting $B\neq C.$ Hence the reduced row echelon form is unique.

## The Rank of a Linear System

Due to the above theorem, the following definition is well-defined: every matrix $A$ reduces to a unique matrix in reduced row echelon form, which we denote by $\operatorname{rref}(A).$

**Definition**. The **rank** of a matrix $A$ is the number of leading coefficients in $\operatorname{rref}(A).$

**Example**. Let $a, d, f$ be nonzero constants and let $b, c, e$ be arbitrary constants. Find the rank of the following matrices. $$ A=\begin{bmatrix} 0 & 0 & a \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \qquad B=\begin{bmatrix} 0 & a & 0 \\ 0 & 0 & d \\ 0 & 0 & 0 \end{bmatrix} \qquad C=\begin{bmatrix} a & b & c \\ 0 & d & e \\ 0 & 0 & f \end{bmatrix} $$ Since $a$, $f$, and $d$ are all nonzero, $$ \operatorname{rref}(A)=\begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \quad \operatorname{rref}(B)=\begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \quad \operatorname{rref}(C)=\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}. $$ Thus the rank of $A$, $B$, and $C$ is 1, 2, and 3, respectively.
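These ranks are easy to confirm with a computer algebra system. Here is a quick SymPy check; the concrete values below (nonzero $a$, $d$, $f$ and arbitrary $b$, $c$, $e$) are our own sample choices.

```python
from sympy import Matrix

# Sample values: a, d, f nonzero; b, c, e arbitrary.
a, b, c, d, e, f = 2, 7, -1, 3, 4, 5

A = Matrix([[0, 0, a], [0, 0, 0], [0, 0, 0]])
B = Matrix([[0, a, 0], [0, 0, d], [0, 0, 0]])
C = Matrix([[a, b, c], [0, d, e], [0, 0, f]])

# rank() counts the leading 1s in the reduced row echelon form
ranks = (A.rank(), B.rank(), C.rank())
```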

**Lemma**. Let $A{x}={b}$ and $C{x}={d}$ be two linear systems of the same size. If the augmented matrices $\begin{bmatrix} A \, | \, {b} \end{bmatrix}$ and $\begin{bmatrix}C \, | \, {d} \end{bmatrix}$ are row equivalent, then the linear systems are equivalent.

**Proof**. The proof follows immediately from the above theorem.

**Theorem**. (**Fundamental Theorem of Linear Systems**) Let $A$ be the coefficient matrix of a linear system of $n$ equations in $m$ variables. Then

(1) If $\operatorname{rank}(A)=n$, then the system is consistent.

(2) If $\operatorname{rank}(A)=m$, then the system has at most one solution.

(3) If $\operatorname{rank}(A)<m$, then the system has either infinitely many solutions or none.

**Proof**. First let's make two observations. By definition of the reduced row echelon form of a matrix, there is at most one leading $1$ in each of the $m$ columns and at most one in each of the $n$ rows. So if $A$ is the coefficient matrix of a system of $n$ equations in $m$ variables, then $$ \operatorname{rank}(A) \leq n \qquad \text{and}\qquad \operatorname{rank}(A)\leq m. $$ A linear system is inconsistent if and only if the reduced row echelon form of its augmented matrix has a row of the form \begin{equation} \label{inconsirow} \left[ \begin{array}{cccc|c} 0 & 0 & \cdots & 0 & 1 \end{array} \right]. \end{equation} Moreover, for any linear system with $m$ variables, \begin{equation*} \begin{pmatrix} \text{ number }\\ \text{ of free } \\ \text{ variables } \end{pmatrix} = \begin{pmatrix} \text{ total } \\ \text{ number }\\ \text{ of variables } \end{pmatrix} - \begin{pmatrix} \text{ number } \\ \text{ of leading }\\ \text{ variables } \end{pmatrix} = m-\operatorname{rank}(A) \end{equation*}

- If $\operatorname{rank}(A)=n$, then each of the $n$ rows of $\operatorname{rref}(A)$ contains a leading 1. Thus there is no row of the form in \eqref{inconsirow} and so the system is consistent.
- If $\operatorname{rank}(A)=m$, then there are no free variables. So the only possibilities are no solutions or exactly one solution.
- If $\operatorname{rank}(A)=t<m$, then there are $m-t$ free variables. So the only possibilities are no solutions or infinitely many solutions.

## Solutions to Linear Systems

The following two corollaries are immediate consequences of the Fundamental Theorem of Linear Systems.

**Corollary**. A linear system with fewer equations than unknowns has either no solutions or infinitely many solutions.

**Corollary**. A linear system of $n$ equations in $n$ variables has a unique solution if and only if the rank of its coefficient matrix $A$ is $n$, and in this case $\operatorname{rref}(A)=I_n.$

**Example**. Find the rank of the coefficient matrix and solve the linear system of equations $$ \begin{cases} x_1-x_2+2x_3=4\\ 3x_1+4x_2-x_3=8\\ 5x_1+9x_2-4x_3=13. \end{cases} $$ We use Gaussian elimination with the augmented matrix to find the rank of the coefficient matrix.

\begin{align*} & \begin{bmatrix} \begin{array}{ccc|c} 1 & -1 & 2 & 4\\ 3 & 4 & -1 & 8\\ 5 & 9 & -4 & 13 \end{array} \end{bmatrix} \begin{array}{c} \stackrel{\longrightarrow}{-3R_1+R_2} \\ \stackrel{\longrightarrow}{-5R_1+R_3} \end{array} \begin{bmatrix} \begin{array}{ccc|c} 1 & -1 & 2 & 4 \\ 0 & 7 & -7 & -4 \\ 0 & 14 & -14 & -7 \end{array} \end{bmatrix} \\ & \begin{array}{c} \stackrel{\longrightarrow}{\frac{1}{7}R_2} \\ \stackrel{\longrightarrow}{-14R_2+R_3} \end{array} \begin{bmatrix} \begin{array}{ccc|c} 1 & -1 & 2 & 4 \\ 0 & 1 & -1 & -\frac{4}{7} \\ 0 & 0 & 0 & 1 \end{array} \end{bmatrix} \begin{array}{c} \stackrel{\longrightarrow}{\frac{4}{7}R_3+R_2} \\ \stackrel{\longrightarrow}{-4R_3+R_1} \\ \stackrel{\longrightarrow}{R_2+R_1} \end{array} \begin{bmatrix} \begin{array}{ccc|c} 1 & 0 & 1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 1 \end{array} \end{bmatrix} \end{align*}

The coefficient part of the reduced row echelon form has two leading 1's, so $\operatorname{rank}(A)=2.$ Hence the system has either no solutions or infinitely many solutions. The last row represents $0=1$, which means that the system has no solutions.
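The elimination above can be checked in SymPy; the variable names in this sketch are our own.

```python
from sympy import Matrix

# Augmented matrix of the system in the example above.
M = Matrix([[1, -1, 2, 4],
            [3, 4, -1, 8],
            [5, 9, -4, 13]])

R, pivot_cols = M.rref()        # unique reduced row echelon form and pivot columns
coeff_rank = M[:, :3].rank()    # rank of the coefficient matrix only
```

The pivot in the last (augmented) column corresponds to the row $[\,0 \ 0\ 0 \ | \ 1\,]$, i.e. to inconsistency.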

**Example**. Consider the system $$ \begin{cases} y+2k z =0 \\ x+2y+6z =2 \\ k x+2 z =1 \end{cases} $$ where $k$ is a constant.

- For which values of $k$ does the system have a unique solution?
- When are there no solutions?
- When are there infinitely many solutions?

Using Gaussian elimination:

\begin{align*} \stackrel{\longrightarrow}{R_1 \leftrightarrow R_2} \left[ \begin{array}{ccc|c} 1 & 2 & 6 & 2 \\ 0 & 1 & 2k & 0 \\ k & 0 & 2 & 1 \end{array} \right] & \stackrel{\longrightarrow}{-k R_1+R_3} \left[ \begin{array}{ccc|c} 1 & 2 & 6 & 2 \\ 0 & 1 & 2k & 0 \\ 0 & -2k & -6k+2 & -2k+1 \end{array} \right] \\ & \begin{array}{c} \stackrel{\longrightarrow}{-2R_2+R_1} \\ \stackrel{\longrightarrow}{2kR_2+R_3} \end{array} \left[ \begin{array}{ccc|c} 1 & 0 & -4k+6 & 2 \\ 0 & 1 & 2k & 0 \\ 0 & 0 & 4k^2-6k+2 & -2k+1 \end{array} \right] \end{align*}

Notice $ 4k^2-6k+2=2(2k-1)(k-1)=0 $ when $k=1/2$ or $k=1.$ (a) When $k\neq 1/2$ and $k\neq 1$, the last row determines $z$ uniquely, so there is a unique solution. (b) When $k=1$, the last row reads $0=-1$, so the system has no solutions. (c) When $k=1/2$, the last row reads $0=0$, so the system has infinitely many solutions.
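The three cases can be sanity-checked by solving the system for sample values of $k$ with SymPy's `linsolve`; the helper name `solutions` is our own.

```python
from sympy import FiniteSet, Rational, S, linsolve, symbols

x, y, z = symbols('x y z')

def solutions(k):
    """Solve the system above for a concrete value of k.
    Each entry is an expression understood as expression = 0."""
    eqs = [y + 2*k*z, x + 2*y + 6*z - 2, k*x + 2*z - 1]
    return linsolve(eqs, x, y, z)

no_sol   = solutions(1)                # k = 1: inconsistent
infinite = solutions(Rational(1, 2))   # k = 1/2: a free variable remains
unique   = solutions(0)                # k = 0 (neither 1 nor 1/2): one solution
```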

**Example**. Solve the linear system of equations $$ \begin{cases} (3+i)x_1+(1+i)x_2=4+4i\\ x_1-x_2=2i \end{cases} $$ where $x_1$ and $x_2$ are complex variables. We convert the system into a linear system with real variables. Let $x_1=y_1+i z_1$ and $x_2=y_2+i z_2.$ Substitution into the original system leads to the system $$ \begin{cases} 3y_1-z_1+(y_1+3z_1)i+(y_2-z_2)+(y_2+z_2)i=4+4i \\ (y_1-y_2)+(z_1-z_2)i=0+2i \end{cases} $$ Equating real and imaginary parts leads to the system $$ \begin{cases} 3y_1+y_2-z_1-z_2=4\\ y_1+y_2+3z_1+z_2 =4\\ y_1-y_2=0\\ z_1-z_2=2 \end{cases} $$ The solutions are $y_1=1$, $y_2=1$, $z_1=1$, and $z_2=-1.$ Thus the solution to the original system is $x_1=1+i$ and $x_2=1-i.$
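As a quick check of this answer, SymPy can also solve the complex system directly, without splitting into real and imaginary parts; the variable names in this sketch are our own.

```python
from sympy import FiniteSet, I, linsolve, symbols

x1, x2 = symbols('x1 x2')

# The two complex equations, each written as expression = 0.
eqs = [(3 + I)*x1 + (1 + I)*x2 - (4 + 4*I),
       x1 - x2 - 2*I]
sol = linsolve(eqs, x1, x2)
```

Gaussian elimination works over any field, so `linsolve` handles complex coefficients exactly as it does real ones.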

## Homogeneous Linear Systems

**Definition**. A linear system of the form $A x = 0$, that is, one in which all of the constant terms are zero, is called a **homogeneous** linear system.

In particular, if $A$ and $B$ are row equivalent $m \times n$ matrices, then the homogeneous systems $A{x}={0}$ and $B{x}={0}$ are equivalent.

**Lemma**. If $A$ is an $n\times n$ matrix and the system $A {x}={0}$ has no nontrivial solution, then $A$ is row equivalent to $I_n.$

**Proof**. If $A{x}={0}$ has no nontrivial solutions, then the trivial solution is its unique solution. By the corollary above, $\operatorname{rref}(A)=I_n.$ Since $A$ is row equivalent to $\operatorname{rref}(A)$, it follows that $A$ is row equivalent to $I_n.$

**Lemma**. Let $A {x} = {0}$ be a linear homogeneous system.

(1) All homogeneous systems are consistent.

(2) A homogeneous system with fewer equations than unknowns has infinitely many solutions.

(3) If ${x}_1$ and ${x}_2$ are solutions, then ${x}_1+{x}_2$ is also a solution.

(4) If ${x}_1$ is a solution and $k$ is any scalar, then $k {x}_1$ is also a solution.

There is a strong relationship between the solutions to a linear system $A{x} = {b}$ and the solutions to the corresponding homogeneous system, $A{x}= {0}.$

**Theorem**. If ${u}$ is any specific solution to the linear system $A{x}={b}$, then the entire solution set of $A{x}={b}$ can be described as $$ \{{u}+{v} \, | \, {v} \text{ is any solution to } A {x}={0}\}. $$

**Proof**. Let ${u}$ be any particular solution to the system $A{x}={b}$ and let ${v}$ be any solution to the system $A{x}={0}.$ Then $$ A({u}+{v}) =A{u}+A{v} ={b}+{0} ={b}, $$ which shows that every vector of the form ${u}+{v}$ is a solution to the system $A{x}={b}.$ Conversely, let ${w}$ be an arbitrary solution to the system $A{x}={b}.$ Notice $$ A({w}-{u}) =A{w} - A{u} ={b}-{b} ={0}, $$ which shows ${w}-{u}$ is a solution to the system $A{x}={0}.$ Setting ${v}={w}-{u}$, we have ${w}={u}+({w}-{u})={u}+{v}$ where ${v}$ is a solution to the system $A{x}={0}.$
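The theorem can be illustrated numerically. In the SymPy sketch below (the matrix, vectors, and names are our own choices), $u$ is a particular solution, $v$ is a nullspace vector, and $u + t v$ solves $A x = b$ for every value of $t$.

```python
from sympy import Matrix, symbols

t = symbols('t')

A = Matrix([[1, -1, 2],
            [2, -2, 4]])    # second equation is twice the first, so rank 1
b = Matrix([3, 6])

u = Matrix([3, 0, 0])       # a particular solution: A*u equals b
v = A.nullspace()[0]        # a nonzero solution of A*x = 0

general = A * (u + t*v)     # simplifies to b for every value of t
```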

**Corollary**. If $A$ is an $n\times n$ matrix and $A{x}={0}$ has no nontrivial solutions, then the system $A{x}={b}$ has a unique solution.

## Exercises on Solving Linear Equations

**Exercise**. Find an inconsistent system of two linear equations in three unknowns. Describe the situation geometrically.

**Exercise**. Find the rank of the matrix.

- $ \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 3\\ 0 & 0 & 1 \end{bmatrix} $
- $ \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 3 \\ 0 & 0 & 3 \end{bmatrix} $
- $ \begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 3 \\ 0 & 0 & 1 \end{bmatrix} $
- $ \begin{bmatrix} 2 & 2 & 2 \\ 2 & 2 & 2 \\ 2 & 2 & 2 \\ 2 & 2 & 2 \end{bmatrix} $
- $ \begin{bmatrix} 1 & 4 & 7 \\ 2 & 5 & 8 \\ 3 & 6 & 9 \end{bmatrix} $
- $ \begin{bmatrix} 1 & 4 & 7 \\ 2 & 5 & 8 \\ -3 & -6 & -9 \end{bmatrix} $

**Exercise**. (a) If the rank of a $4\times 4$ matrix $A$ is 4, what is $\operatorname{rref}(A)$? (b) If the rank of a $5\times 3$ matrix $A$ is 3, what is $\operatorname{rref}(A)$? (c) If the rank of a $2\times 5$ matrix $A$ is 1, what is $\operatorname{rref}(A)$?

**Exercise**. Find the rank of the following matrices.

- $ \begin{bmatrix} a & 0 & 0 \\ 0 & d & 0\\ 0 & 0 & f \end{bmatrix}$
- $ \begin{bmatrix} a & b & c \\ 0 & d & e \\ 0 & 0 & f \end{bmatrix} $
- $ \begin{bmatrix} a & 0 & 0 \\ b & d & 0\\ c & e & f \end{bmatrix} $

where $a$, $d$, $f$ are nonzero and $b$, $c,$ and $e$ are arbitrary scalars.

**Exercise**. Find the rank of the system of equations.

- $ \begin{cases} x+2y+3z=0 \\ 2x+3y+4z=0 \\ 3x+4y+6z=0 \end{cases} $
- $ \begin{cases} x+2y+3z=2\\ 2x+3y+4z=-2\\ 3x+4y+6z=2 \end{cases} $
- $ \begin{cases} x+2y+3z=0 \\ 2x+3y+4z=1\\ 3x+4y+6z=3 \end{cases} $
- $ \begin{cases} x+2y+3z=a\\ 2x+3y+4z=b \\ 3x+4y+6z=c \end{cases} $

**Exercise**. Find all solutions to the homogeneous system.

- $\begin{cases}4x_1-x_2=0 \\7x_1+3x_2=0\\-8x_1+6x_2=0 \end{cases}$
- $\begin{cases} x_1-2x_2+x_3=0\\ 3x_1+2x_3+x_4=0\\ 4x_2-x_3-x_4=0\\ 5x_1+3x_3-x_4=0 \end{cases}$
- $\begin{cases} x_1-3x_2=0\\ -2x_1+6x_2=0\\ 4x_1-12x_2=0 \end{cases}$
- $\begin{cases} x_1+x_2-x_3=0\\ 4x_1-x_2+5x_3=0\\ 2x_1-x_2-2x_3=0\\ 3x_1+2x_2-x_3=0 \end{cases} $

**Exercise**. Show that the homogeneous system of linear equations $$ \begin{cases} a x+by=0 \\ cx+dy =0 \end{cases} $$ has an infinite number of solutions if and only if $ad-bc=0.$

**Exercise**. For any positive integer $n$, find a system of $n$ equations in two variables that has infinitely many solutions.

**Exercise**. If possible, find a condition on the coefficients of the homogeneous system of linear equations so that (i) the system only has the zero solution, and (ii) the system has infinitely many solutions.

- $ \begin{cases} a_{11}x_1+a_{12}x_2+a_{13}x_3=0 \\ a_{21}x_1+a_{22}x_2+a_{23}x_3=0 \end{cases} $
- $ \begin{cases} a_{11}x_1+a_{12}x_2+a_{13}x_3=0 \\ a_{21}x_1+a_{22}x_2+a_{23}x_3=0 \\ a_{31}x_1+a_{32}x_2+a_{33}x_3=0 \end{cases} $

**Exercise**. Show that if a system of linear equations is inconsistent, then its reduced row echelon form has a row of the form $$ \left[ \begin{array}{cccc|c}0 & 0 & \cdots & 0 & 1 \end{array} \right]. $$

**Exercise**. Determine the values of $k$ for which the system has nontrivial solutions.

- $ \begin{cases} 2x_1-3x_2+5x_3=0\\ -x_1+7x_2-x_3=0\\ 4x_1-11x_2+k x_3=0 \end{cases} $
- $ \begin{cases} x_1-2x_2-5x_3=0\\ 2x_1+3x_2+x_3=0\\ x_1-7x_2-k x_3=0 \end{cases} $

**Exercise**. Show that if $AX=0$ is a homogeneous system with an infinite number of solutions and the system $AX=B$ has a solution, then $AX=B$ must have an infinite number of solutions.

**Exercise**. Find an example, where possible, for each of the following.

- A linear system of four equations in four unknowns that has a line as a solution set.
- A linear system of four equations in four unknowns that has a plane as a solution set.
- A linear system of four equations in three unknowns that has a line as a solution set.
- A linear system of four equations in two unknowns that has a plane as a solution set.

**Exercise**. Determine whether or not the following system is consistent. $$ \begin{bmatrix} 0 & i & 1-i \\ -i & 0 & i\\ 1-i & -i & 0 \end{bmatrix} \cdot \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} $$

**Exercise**. Show that if $AX=0$ for all vectors $X$, then $A=0.$

**Exercise**. Under what conditions will $k$ planes $a_j x +b_j y+c_j z=d_j$ for $j=1, 2, …, k$ intersect in exactly one point?

**Exercise**. Given that $AX=B$ is consistent and of rank $r$, for what sets of $r$ unknowns can one solve?

**Exercise**. If possible, write the matrix $A$ as a linear combination of the matrices $$ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \qquad \text{and} \qquad \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}. $$

- $ A= \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix} $
- $ A= \begin{bmatrix} 4 & 1 \\ 0 & -3 \end{bmatrix} $
- $ A= \begin{bmatrix} 11 & 2 \\ 0 & -4 \end{bmatrix} $

**Exercise**. Let $A Z=B$ be a given system of linear equations where $A_{n\times n}$ and $B_{n\times 1}$ are complex matrices. Write $A=C+iD$, $Z=X+iY$, and $B=S+iT$, where $C$, $D$, $X$, $Y$, $S$, and $T$ are real. Show that the original complex $n\times n$ system is equivalent to the real system $$ \begin{cases} CX-DY=S \\ DX+CY=T \end{cases} $$ of $2n$ equations in $2n$ real unknowns.