Invertible Matrices and Their Properties

Okay, so you know what a linear transformation is, but what exactly is an invertible linear transformation? In this article, I cover invertible matrices and work through several examples.

Invertible Linear Transformations

An $n\times n$ matrix $A$ is called invertible if and only if there exists a matrix $B$ such that $A B=I_n$ and $BA=I_n.$ Using the inverse of a matrix, we also define the inverse of a linear transformation. Let $T(x)=Ax$ be a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^n.$ If the matrix $A$ has inverse $A^{-1}$, then the linear transformation defined by $A^{-1} x$ is called the inverse transformation of $T$, and we write $T^{-1}(x)=A^{-1} x.$

A function $T$ from $X$ to $Y$ is called invertible if the equation $T(x)=y$ has a unique solution $x\in X$ for each $y\in Y.$ A square matrix $A$ is called invertible if the linear transformation $y=T(x)=Ax$ is invertible. In this case, the matrix of $T^{-1}$ is denoted by $A^{-1}.$ If the linear transformation is invertible, then its inverse is $x = T^{-1} (y)=A^{-1} y.$

Example. (Invertible Matrix) Find the inverse transformation of the following linear transformation: $$ \begin{array}{rl} y_1 = & x_1+3x_2+3x_3 \\ y_2 = & x_1+4x_2+8x_3 \\ y_3 = & 2x_1+7x_2+12x_3 \end{array}. $$ To find the inverse transformation, we solve for $x_1, x_2, x_3$ in terms of $y_1, y_2, y_3.$ To do this we find the inverse of the matrix $ A= \begin{bmatrix} 1 & 3 & 3 \\ 1 & 4 & 8 \\ 2 & 7 & 12 \end{bmatrix}.$ Applying elementary row operations,

$$ \begin{bmatrix} 1 & 3 & 3 & 1 & 0 & 0 \\ 1 & 4 & 8 & 0 & 1 & 0 \\ 2 & 7 & 12 & 0 & 0 & 1 \end{bmatrix}\begin{array}{c} \stackrel{\longrightarrow}{R_2-R_1} \\ \stackrel{\longrightarrow}{-2R_1+R_3} \end{array} \begin{bmatrix}1 & 3 & 3 & 1 & 0 & 0 \\ 0 & 1 & 5 & -1 & 1 & 0 \\ 0 & 1 & 6 & -2 & 0 & 1 \end{bmatrix} $$ $$ \stackrel{\longrightarrow}{-R_2+R_3} \begin{bmatrix} 1 & 3 & 3 & 1 & 0 & 0 \\ 0 & 1 & 5 & -1 & 1 & 0 \\ 0 & 0 & 1 & -1 & -1 & 1 \end{bmatrix}\stackrel{\longrightarrow}{-5R_3+R_2} \begin{bmatrix}1 & 3 & 3 & 1 & 0 & 0 \\ 0 & 1 & 0 & 4 & 6 & -5 \\ 0 & 0 & 1 & -1 & -1 & 1 \end{bmatrix} $$ $$ \stackrel{\longrightarrow}{-3R_3+R_1} \begin{bmatrix} 1 & 3 & 0 & 4 & 3 & -3 \\ 0 & 1 & 0 & 4 & 6 &-5 \\ 0 & 0 & 1 & -1 & -1 & 1 \end{bmatrix}\stackrel{\longrightarrow}{-3R_2+R_1} \begin{bmatrix} 1 & 0 & 0 & -8 & -15 & 12 \\ 0 & 1 & 0 & 4 & 6 & -5 \\ 0 & 0 & 1 & -1 & -1 & 1 \end{bmatrix}$$ we find $ A^{-1}= \begin{bmatrix} -8 & -15 & 12 \\ 4 & 6 & -5 \\ -1 & -1 & 1 \end{bmatrix}. $

Therefore, the inverse transformation is $$ \begin{array}{rl} x_1 = & -8y_1-15y_2+12y_3 \\ x_2 = & 4y_1+6y_2-5y_3 \\ x_3 = & -y_1-y_2+y_3. \end{array} $$
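The computed inverse can be double-checked numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# The coefficient matrix of the transformation above.
A = np.array([[1, 3, 3],
              [1, 4, 8],
              [2, 7, 12]])

# The inverse found by row reduction.
A_inv = np.array([[-8, -15, 12],
                  [4, 6, -5],
                  [-1, -1, 1]])

# Both products should be the 3x3 identity matrix.
print(np.allclose(A @ A_inv, np.eye(3)))  # True
print(np.allclose(A_inv @ A, np.eye(3)))  # True
```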

Invertible Matrices

Of course, inverse transformations make sense in terms of inverse functions; that is, if $T^{-1}$ is the inverse transformation of $T$, then $(T\circ T^{-1})(x)=x$ and $(T^{-1 }\circ T)(x)=x.$ For example, for the $T$ given above, $$ (T^{-1}\circ T)\begin{bmatrix} 1 \\ 2 \\ 3\end{bmatrix} = T^{-1}\begin{bmatrix}16 \\ 33 \\ 52 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 3\end{bmatrix} $$ as one can verify.

Theorem. (Invertible Matrix) Let $A$ be an $n\times n$ matrix. Then

(1) $A$ is invertible if and only if rref($A$)$=I_n$,

(2) $A$ is invertible if and only if $\mathop{rank}(A)=n$, and

(3) $A$ is invertible if and only if there exists an $n\times n$ matrix $B$ with $AB= I_n$ and $B A=I_n$, in which case $B=A^{-1}.$
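Criterion (2) is easy to test numerically. A minimal sketch assuming NumPy is available, reusing the matrix inverted earlier:

```python
import numpy as np

# The 3x3 matrix inverted earlier has full rank ...
A = np.array([[1, 3, 3],
              [1, 4, 8],
              [2, 7, 12]])
# ... while this matrix has a second row equal to 3 times the first.
C = np.array([[2, 3],
              [6, 9]])

print(np.linalg.matrix_rank(A))  # 3 = n, so A is invertible
print(np.linalg.matrix_rank(C))  # 1 < 2, so C is not invertible
```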

Example. Find the inverse of the linear transformation \begin{align*} & y_1 = 3x_1 +5x_2 \\ & y_2 =3x_1+4x_2. \end{align*} Solving the system $$ \begin{array}{rl} 3x_1+5x_2& =y_1 \\ 3x_1+4x_2 & =y_2 \end{array} $$ for $x_1$ and $x_2$, we obtain $$ \begin{array}{rl} x_1 & =-\frac{4}{3}y_1+\frac{5}{3}y_2 \\ x_2 & = y_1-y_2. \end{array} $$

To find the inverse of an $n \times n$ matrix $A$, form the augmented matrix $[ \, A \, | \, I_n \, ]$ and compute $\mathop{rref}(\, [ \, A \, | \, I_n \, ] \, ).$ If $\mathop{rref}(\, [ \, A \, | \, I_n \, ] \, ) $ is of the form $[ \, I_n \, | \, B \, ] $, then $A$ is invertible and $A^{-1}=B.$ Otherwise $A$ is not invertible. For example,

\begin{equation} \label{invtranseq} \mathop{rref}\left( \begin{bmatrix} 1 & -1 & 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 & 1 & 0 \\ -1 & -2 & 0 & 0 & 0 & 1 \end{bmatrix} \right) = \begin{bmatrix} 1 & 0 & 0 & 2 & -2 & -1 \\ 0 & 1 & 0 & -1 & 1 & 0 \\ 0 & 0 & 1 & -2 & 3 & 1 \end{bmatrix} = [ \, I_3 \, | \, B \, ] \end{equation}

shows $B=A^{-1}$ where

$$B= \begin{bmatrix} 2 & -2 & -1 \\ -1 & 1 & 0 \\ -2 & 3 & 1 \end{bmatrix} \quad \text{and} \quad A= \begin{bmatrix} 1 & -1 & 1 \\ 1 & 0 & 1 \\ -1 & -2 & 0 \end{bmatrix} $$

as one can verify, by showing $AB=I_3$ and $BA=I_3.$
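The $[ \, A \, | \, I_n \, ]$ procedure is mechanical enough to sketch in code. The following Gauss-Jordan routine is a minimal illustration (the name `inverse_by_rref` is my own, and this is not a numerically robust implementation):

```python
import numpy as np

def inverse_by_rref(A, tol=1e-12):
    """Row-reduce [A | I] to [I | B]; return B = A^(-1), or None if A is singular."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])  # the augmented matrix [A | I]
    for col in range(n):
        # Pick the row at or below `col` with the largest entry in this column.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if abs(M[pivot, col]) < tol:
            return None  # rref(A) != I, so A is not invertible
        M[[col, pivot]] = M[[pivot, col]]  # swap rows
        M[col] /= M[col, col]              # scale the pivot to 1
        for r in range(n):                 # clear the rest of the column
            if r != col:
                M[r] -= M[r, col] * M[col]
    return M[:, n:]  # the right half is now A^(-1)

A = np.array([[1, -1, 1], [1, 0, 1], [-1, -2, 0]])
result = inverse_by_rref(A)
print(result)  # should recover the matrix B verified above
```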

Theorem. Let $A$ and $B$ be $n \times n$ matrices. Then

(1) if $A$ and $B$ are invertible matrices, then $B A$ is invertible as well and $$ (B A)^{-1}= A^{-1}B^{-1} $$

(2) if $B A= I_n$, then $A$ and $B$ are both invertible, $$ A^{-1}=B, \qquad B^{-1}=A, \qquad \text{ and } \qquad AB = I_n. $$
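Part (1) of this theorem is easy to spot-check numerically; a minimal sketch assuming NumPy, with two arbitrarily chosen invertible matrices:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])  # invertible, since 1*4 - 2*3 = -2
B = np.array([[0., 1.], [1., 1.]])  # invertible, since 0*1 - 1*1 = -1

# Part (1): the inverse of BA is A^(-1) B^(-1), in that order.
lhs = np.linalg.inv(B @ A)
rhs = np.linalg.inv(A) @ np.linalg.inv(B)
print(np.allclose(lhs, rhs))  # True
```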

Examples of Invertible Matrices

Example. Find the inverse matrices of $ A= \begin{bmatrix} 2 & 3 \\ 6 & 9 \end{bmatrix} $ and $ B= \begin{bmatrix} 1 & 2 \\ 3 & 9 \end{bmatrix} .$ Since $\text{rref}(A)=\begin{bmatrix} 1 & 3/2 \\ 0 & 0 \end{bmatrix}\neq I_2$, $A^{-1}$ does not exist. The inverse of $B$ does exist and $B^{-1}=\begin{bmatrix} 3 & -2/3 \\ -1 & 1/3 \end{bmatrix}$ since $B^{-1}B=I_2$ and $B B^{-1}=I_2.$

Example. Show that $A=\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is invertible if and only if $a d- b c \neq 0$, in which case

\begin{equation} A^{-1} = \frac{1}{a d - b c} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}. \label{eq:twodet} \end{equation}

We proceed to find the inverse, assuming $a\neq 0$ and $c\neq 0$ (the remaining cases are handled similarly):

$$ \begin{bmatrix} a & b & 1 & 0 \\ c & d & 0 & 1 \end{bmatrix}\begin{array}{c} \stackrel{\longrightarrow}{\frac{1}{a} R_1} \\ \stackrel{\longrightarrow}{\frac{1}{c} R_2} \end{array} \begin{bmatrix} 1 & \frac{b}{a} & \frac{1}{a} & 0 \\ 1 & \frac{d}{c} & 0 & \frac{1}{c} \end{bmatrix}\begin{array}{c} \stackrel{\longrightarrow}{-R_1+R_2} \end{array} \begin{bmatrix} 1 & \frac{b}{a} & \frac{1}{a} & 0 \\ 0 & \frac{ad-bc}{ac} & \frac{-1}{a} & \frac{1}{c} \end{bmatrix} $$ $$ \begin{array}{c} \stackrel{\longrightarrow}{\frac{ac}{ad-bc} R_2} \end{array} \begin{bmatrix} 1 & \frac{b}{a} & \frac{1}{a} & 0 \\ 0 & 1 & \frac{-c}{ad-bc} & \frac{a}{ad-bc} \end{bmatrix}\begin{array}{c} \stackrel{\longrightarrow}{\frac{-b}{a}R_2+R_1} \end{array} \begin{bmatrix} 1 & 0 & \frac{d}{ad-bc} & \frac{-b}{ad-bc} \\ 0 & 1 & \frac{-c}{ad-bc} & \frac{a}{ad-bc} \end{bmatrix} $$

Therefore, $A$ is an invertible matrix if and only if $a d- b c \neq 0$ and \eqref{eq:twodet} holds.
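The $2\times 2$ formula translates directly into a small helper; a minimal sketch (the function name `inverse_2x2` is my own):

```python
def inverse_2x2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the ad - bc formula; None if singular."""
    det = a * d - b * c
    if det == 0:
        return None  # not invertible when ad - bc = 0
    return [[d / det, -b / det],
            [-c / det, a / det]]

print(inverse_2x2(3, 5, 3, 4))  # the earlier example: [[-4/3, 5/3], [1, -1]]
print(inverse_2x2(2, 3, 6, 9))  # None, since 2*9 - 3*6 = 0
```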

Example. For which values of the constants $a, b, c$ is the matrix $$ A= \begin{bmatrix} 0 & a & b \\ -a & 0 & c \\ -b & -c & 0 \end{bmatrix} $$ invertible? Suppose $a\neq 0.$ Applying row operations,

$$ \begin{bmatrix} 0 & a & b & 1 & 0 & 0\\ -a & 0 & c & 0 & 1 & 0 \\ -b & -c & 0 & 0 & 0 & 1 \end{bmatrix}\begin{array}{c} \stackrel{\longrightarrow}{R_2\leftrightarrow R_1} \end{array} \begin{bmatrix} -a & 0 & c & 0 & 1 & 0 \\ 0 & a & b & 1 & 0 & 0\\ -b & -c & 0 & 0 & 0 & 1 \end{bmatrix} $$ $$ \stackrel{\longrightarrow}{-\frac{1}{a}R_1} \begin{bmatrix} 1 & 0 & -\frac{c}{a} & 0 & -\frac{1}{a} & 0 \\ 0 & a & b & 1 & 0 & 0\\ -b & -c & 0 & 0 & 0 & 1 \end{bmatrix}\stackrel{\longrightarrow}{bR_1+R_3} \begin{bmatrix} 1 & 0 & -\frac{c}{a} & 0 & -\frac{1}{a} & 0 \\ 0 & a & b & 1 & 0 & 0\\ 0 & -c & \frac{-bc}{a} & 0 & \frac{-b}{a} & 1 \end{bmatrix} $$ $$ \stackrel{\longrightarrow}{\frac{1}{a}R_2} \begin{bmatrix} 1 & 0 & -\frac{c}{a} & 0 & -\frac{1}{a} & 0 \\ 0 & 1 & \frac{b}{a} & \frac{1}{a} & 0 & 0\\ 0 & -c & \frac{-bc}{a} & 0 & \frac{-b}{a} & 1 \end{bmatrix}\stackrel{\longrightarrow}{cR_2+R_3} \begin{bmatrix} 1 & 0 & -\frac{c}{a} & 0 & -\frac{1}{a} & 0 \\ 0 & 1 & \frac{b}{a} & \frac{1}{a} & 0 & 0\\ 0 & 0 & 0 & \frac{c}{a} & \frac{-b}{a} & 1 \end{bmatrix} $$

Thus, if $a\neq 0$ then $A$ is not invertible, since $\mathop{rref}{(A)}\neq I_3.$ If $a=0$, then clearly, $\mathop{rref}{(A)}\neq I_3$, and so $A$ is not invertible in either case. Therefore, there are no constants $a, b, c$ for which $A$ is an invertible matrix.
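A numerical spot check agrees, using the rank criterion from the theorem above (NumPy assumed):

```python
import numpy as np

# No choice of a, b, c gives full rank; spot check a few triples.
for a, b, c in [(1, 2, 3), (5, -1, 0), (0.5, 0.5, 0.5)]:
    A = np.array([[0, a, b], [-a, 0, c], [-b, -c, 0]])
    print(np.linalg.matrix_rank(A))  # never 3, so never invertible
```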

Corollary. Let $A$ be an $n \times n$ matrix.

Consider a vector $b$ in $\mathbb{R}^n.$ If $A$ is invertible, then the system $A x = b$ has the unique solution $x = A^{-1} b.$ If $A$ is not invertible, then the system $A x = b$ has either infinitely many solutions or none. The system $A x = 0$ always has $x = 0$ as a solution. If $A$ is invertible, then this is the only solution; if $A$ is not invertible, then the system $A x= 0$ has infinitely many solutions.
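The formula $x = A^{-1} b$ can be checked numerically; note that in practice one usually solves the system directly rather than forming $A^{-1}.$ A minimal sketch assuming NumPy is available:

```python
import numpy as np

A = np.array([[3., 5.], [3., 4.]])  # invertible, from an earlier example
b = np.array([1., 2.])

x1 = np.linalg.inv(A) @ b   # the corollary's formula x = A^(-1) b
x2 = np.linalg.solve(A, b)  # standard practice: solve without forming A^(-1)

print(np.allclose(x1, x2))     # True: both give the unique solution
print(np.allclose(A @ x2, b))  # True
```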

Example. Find all invertible matrices $A$ such that $A^2=A.$ Since $A$ is invertible, we multiply by $A^{-1}$ to obtain $$ A=IA=(A^{-1}A)A=A^{-1}(A^2)=A^{-1}A=I_n, $$ and therefore $A$ must be the identity matrix.

Example. For which values of the constants $b$ and $c$ is the matrix $$ B= \begin{bmatrix}0 & 1 & b \\ -1 & 0 & c \\ -b & -c & 0 \end{bmatrix} $$ invertible? The matrix $B$ is not invertible for any $b$ and $c$, since $$ \text{rref}(B)= \begin{bmatrix}1 & 0 & -c \\ 0 & 1 & b \\ 0 & 0 & 0 \end{bmatrix}\neq I_3 $$ for all $b$ and $c.$

Example. Find the matrix $A$ satisfying the equation $$ \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} A \begin{bmatrix} 2 & 0 \\ 0 & -2 \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} .$$ Let $B=\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$ and $C=\begin{bmatrix} 2 & 0 \\ 0 & -2 \end{bmatrix}.$ Then $$ B^{-1}=\begin{bmatrix} 1& 0 \\ 0 &-1\end{bmatrix} \qquad \text{and}\qquad C^{-1}=\begin{bmatrix} 1/2 & 0 \\ 0 & -1/2 \end{bmatrix}. $$ Multiplying on the left by $B^{-1}$ and on the right by $C^{-1}$, we find $$ A=B^{-1}\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}C^{-1} =\begin{bmatrix} 1/2 & -1/2 \\ -1/2 & 1/2\end{bmatrix}. $$


Example. Suppose that $A$, $B$, and $C$ are $n\times n$ matrices and that both $A$ and $B$ commute with $C.$ Show that $AB$ commutes with $C.$ To show that $AB$ commutes with $C$ we need to show $(AB)C=C(AB).$ This is easy since $$ (AB)C=A(BC)=A(CB)=(AC)B=(CA)B=C(AB). $$ Can you justify each step?

Example. Show that $AB=BA$ if and only if $(A-B)(A+B)=A^2-B^2.$ Suppose $AB=BA$; we will show $(A-B)(A+B)=A^2-B^2.$ Starting with the left-hand side, we obtain \begin{align} (A-B)(A+B) & =(A-B)A+(A-B)B =A^2-BA+AB-B^2 \\ & =A^2-BA+BA-B^2 =A^2-B^2. \end{align} Now suppose $(A-B)(A+B)=A^2-B^2$; we will show $AB=BA.$ This is easy since $$ (A-B)(A+B) =(A-B)A+(A-B)B =A^2-BA+AB-B^2 =A^2-B^2 $$ implies $-BA+AB=0$, that is, $AB=BA$, as desired.
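The "only if" direction is easy to see numerically: for a pair of matrices that do not commute, $(A-B)(A+B)$ and $A^2-B^2$ differ. A minimal sketch assuming NumPy is available:

```python
import numpy as np

# A classic noncommuting pair.
A = np.array([[0, 1], [0, 0]])
B = np.array([[0, 0], [1, 0]])

print(np.array_equal(A @ B, B @ A))                      # False: they do not commute
print(np.array_equal((A - B) @ (A + B), A @ A - B @ B))  # False, as the identity predicts
```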

David A. Smith at Dave4Math

David Smith (Dave) has a B.S. and M.S. in Mathematics and has enjoyed teaching precalculus, calculus, linear algebra, and number theory at both the junior college and university levels for over 20 years. David is the founder and CEO of Dave4Math.
