The Kernel of a Matrix (and Image)


We discuss the kernel of a (matrix) linear transformation and its basic properties. The dimension of the kernel can often be calculated, and in doing so, we gain information about the linear transformation. Similarly, we discuss the image of a linear transformation and its basic properties. We then investigate the Rank-Nullity Theorem (sometimes called the Fundamental Theorem of Linear Algebra), which combines the dimension of the image space (rank) and the dimension of the kernel space (nullity) into a single beautiful equation.

The Kernel and Image of a Matrix

We discuss the kernel and image of a linear transformation. We show how they can be realized as geometric objects and demonstrate how to find spanning sets for them.

Definition. Let $T$ be a linear transformation from $\mathbb{R}^m$ to $\mathbb{R}^n$ with $n\times m$ matrix $A.$ The kernel of $T$, denoted by $\ker(T)$, is the set of all vectors $x$ in $\mathbb{R}^m$ such that $T(x)=A x = 0.$ The image of $T$, denoted by $\operatorname{im}(T)$, is the set of all vectors in $\mathbb{R}^n$ of the form $T (x)=A x.$ The kernel and image of the matrix $A$ of $T$ are defined as the kernel and image of $T.$
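As a concrete illustration (not part of the original text), spanning sets for both subspaces can be computed with SymPy; the sample matrix below is our own choice, and the `nullspace`/`columnspace` methods are assumed from that library.

```python
# Sketch (our own example matrix, assuming SymPy): computing the
# kernel and image of a matrix linear transformation.
from sympy import Matrix

A = Matrix([[1, 2], [2, 4]])   # a 2x2 matrix of rank 1
ker_basis = A.nullspace()      # basis of ker(A): vectors x with A x = 0
im_basis = A.columnspace()     # basis of im(A): span of the columns of A
print(ker_basis)               # [Matrix([[-2], [1]])]
print(im_basis)                # [Matrix([[1], [2]])]
```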

Now to adequately describe the kernel and image of a linear transformation we need the concept of the span of a collection of vectors. We will see that one of the best ways to describe the kernel and image of a linear transformation is to describe them in terms of collections of linear combinations of vectors.

Properties of the Kernel of a Matrix

The next two lemmas describe basic properties of the kernel.

Lemma. The kernel of any linear transformation $T$ has the following properties: $0 \in \ker(T)$, if $v, w\in \ker(T)$, then $v+w\in \ker(T)$, and if $v\in \ker(T)$ and $k\in \mathbb{R}$, then $kv\in \ker(T).$

Proof. Let $A$ be the matrix for $T.$ Then $A0=0$, which shows $0 \in \ker(T).$ If $v, w\in \ker(T)$, then $Av=0$ and $Aw=0.$ Thus, $A(v+w)=Av+Aw=0$, implying $v+w\in \ker(T).$ If $v\in \ker(T)$ and $k\in \mathbb{R}$, then $A(kv)=k(Av)=k0=0.$ Thus, $kv\in \ker(T).$

Lemma. Let $T$ be a linear transformation with matrix $A.$

If $A$ is an $n\times n$ matrix, then $\ker(A)=\{0\}$ if and only if $\operatorname{rank}(A)=n.$

If $A$ is an $n\times m$ matrix, then $\ker(A)=\{0\}$ implies $m\leq n$, and $m>n$ implies $\ker(A)$ contains non-zero vectors.

If $A$ is a square matrix, then $\ker(A)=\{0\}$ if and only if $A$ is invertible.

Proof. If $T$ is a linear transformation $T(x)=Ax$ from $\mathbb{R}^m$ to $\mathbb{R}^n$ where $m>n$, then the equation $T(x)=Ax=0$ has free variables; that is, the system has infinitely many solutions. Therefore, the kernel of $T$ consists of infinitely many vectors, so $\ker(A)=\{0\}$ forces $m\leq n.$ If $m=n$ and $A$ is an invertible $n\times n$ matrix, how do we find $\ker(A)$? Since $A$ is invertible, $Ax=0$ can be solved by $A^{-1}(A x)=A^{-1}0$, showing $x=0$; that is, the only solution to the system $A x=0$ is $0$, so $\ker(A)=\{0\}$ whenever $A$ is invertible.
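The invertibility criterion can be sanity-checked numerically; a minimal SymPy sketch with matrices of our own choosing:

```python
# Sketch: an invertible matrix has trivial kernel; a singular one does not.
# (SymPy usage and the sample matrices are our assumptions.)
from sympy import Matrix

A = Matrix([[2, 1], [1, 1]])   # det = 1, invertible
B = Matrix([[1, 2], [2, 4]])   # det = 0, singular
print(A.nullspace())           # [] -> ker(A) = {0}
print(B.nullspace())           # one nonzero vector spans ker(B)
```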

Properties of the Image

Lemma. The image of any linear transformation $T$ has the following properties: $0 \in \operatorname{im}(T)$, if $v, w\in \operatorname{im}(T)$, then $v+w\in \operatorname{im}(T)$, and if $v\in \operatorname{im}(T)$ and $k\in \mathbb{R}$, then $kv\in \operatorname{im}(T).$

Proof. Let $A$ be the matrix for $T.$ Then $A0=0$ which shows $0 \in \operatorname{im}(T).$ If $v, w\in \operatorname{im}(T)$, then there exists $x$ and $y$ such that $Ax=v$ and $Ay=w.$ Thus, $v+w=Ax+Ay=A(x+y)$ implying $v+w\in \operatorname{im}(T).$ If $v\in \operatorname{im}(T)$ and $k\in \mathbb{R}$, then there exists $u$ such that $v=Au$ implying $A(k u)=k v.$ Thus, $kv\in \operatorname{im}(T).$

Theorem. Let $T:\mathbb{R}^m\to \mathbb{R}^n$ be a linear transformation with matrix $A.$ Then $\operatorname{im}(T)=\operatorname{span}(v_1, \ldots,v_m)$ where $v_1, \ldots, v_m$ are the column vectors of $A.$

Proof. Since $v_1, \ldots,v_m$ are the column vectors of $A$, \begin{equation} \label{lineq} T(x)= A x = \begin{bmatrix} v_1 & \cdots & v_m \end{bmatrix} \begin{bmatrix}x_1 \\ \vdots \\ x_m\end{bmatrix} =x_1 v_1+\cdots + x_m v_m. \end{equation} If $u\in \operatorname{span}(v_1, \ldots,v_m)$, then there exist $x_1, \ldots, x_m$ such that $$ u=x_1 v_1+\cdots +x_m v_m. $$ Setting $x=(x_1,\ldots,x_m)\in \mathbb{R}^m$, equation \eqref{lineq} gives $u=T(x).$ Thus, $u\in \operatorname{im}(T).$ Conversely, assume $u\in\operatorname{im}(T).$ Then there exists $x$ such that $u=T(x)$, and by \eqref{lineq}, $u=x_1 v_1+\cdots +x_m v_m.$ Therefore, $u \in \operatorname{span}(v_1, \ldots,v_m)$, and so $\operatorname{im}(T)=\operatorname{span}(v_1, \ldots,v_m)$ follows.

Example. Find vectors that span $\ker(A)$ and $\operatorname{im}(A)$ given $$ A=\begin{bmatrix}2 & 1 & 3 \\ 3 & 4 & 2 \\ 6 & 5 & 7 \end{bmatrix}.$$ Describe $\operatorname{im}(A)$ geometrically.

Solution. To find a spanning set of $\ker(A)$ we first solve the system $Ax=0.$ Using the augmented matrix and elementary row operations, we find the reduced row-echelon form $$ \operatorname{rref}([\,A \mid 0\,])= \begin{bmatrix} 1 & 0 & 2 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}. $$ Thus the solution set is $$ \begin{bmatrix}x_1 \\ x_2 \\ x_3\end{bmatrix} = t\begin{bmatrix}-2 \\ 1 \\ 1\end{bmatrix}:=t{w} $$ where $t\in \mathbb{R}.$ Therefore $\ker(A)=\operatorname{span}({w}).$ Next we find $\operatorname{im}(A).$ To do so let $v_1, v_2, v_3 $ be the column vectors of the matrix $A.$ Since $\operatorname{im}(A)$ is spanned by the columns of $A$ and $v_3=2v_1+(-1)v_2$, we find $\operatorname{im}(A)=\operatorname{span}\left(v_1, v_2\right).$ Therefore the image of $A$ is a plane in $\mathbb{R}^3$ that passes through the origin.
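The hand computation above can be double-checked with SymPy (our assumption; the original works the system by hand):

```python
# Verifying the worked example: nullspace gives ker(A), rank gives dim(im A).
from sympy import Matrix

A = Matrix([[2, 1, 3], [3, 4, 2], [6, 5, 7]])
print(A.nullspace())   # one vector (-2, 1, 1), matching w above
print(A.rank())        # 2, so im(A) is a plane through the origin in R^3
```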

Example. Give an example of a matrix $A$ such that $\operatorname{im}(A)$ is the plane with normal vector $w=\begin{bmatrix}1 \\ 3 \\ 2 \end{bmatrix}$ in $\mathbb{R}^3.$

Solution. The desired plane consists of all vectors $\begin{bmatrix}x \\ y \\ z\end{bmatrix}$ satisfying $x+3y+2z=0$, since $w$ is a normal vector; equivalently, it is the kernel of the $1\times 3$ matrix $\begin{bmatrix} 1& 3 & 2\end{bmatrix}.$ Let $z=t$ and $y=s$; then $x=-3s-2t$, so the vectors in the plane are $$\begin{bmatrix}x \\ y \\ z\end{bmatrix} = \begin{bmatrix}-3s-2t \\ s \\ t\end{bmatrix} = s\begin{bmatrix}-3 \\ 1 \\ 0\end{bmatrix} +t \begin{bmatrix}-2 \\ 0 \\ 1\end{bmatrix} :=s{u}+t {v} $$ where $s$ and $t$ are real numbers. Thus the plane is $\operatorname{span}({u}, {v}).$ Since the image of a matrix is the span of its columns, the matrix $$ A=\begin{bmatrix} -3 & -2 \\ 1 & 0 \\ 0 & 1 \end{bmatrix} $$ satisfies $\operatorname{im}(A)=\operatorname{span}({u},{v})$, the plane in $\mathbb{R}^3$ with normal vector ${w}.$
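One way to check a candidate answer: every column of the matrix must be orthogonal to the normal vector $w$, and the columns must have rank $2$ so the image is the whole plane. A SymPy sketch (the candidate matrix below is one hypothetical choice whose columns satisfy $x+3y+2z=0$):

```python
# Checking a candidate matrix whose image should be the plane with normal w.
from sympy import Matrix

w = Matrix([1, 3, 2])                    # normal vector of the plane
A = Matrix([[-3, -2], [1, 0], [0, 1]])   # candidate: columns satisfy x+3y+2z=0
print(w.T * A)    # the 1x2 zero matrix -> both columns lie in the plane
print(A.rank())   # 2 -> im(A) is the whole plane, not just a line
```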

Nullity is the Dimension of the Kernel of a Matrix

The kernel of a matrix is a subspace, and its dimension has a special name: the dimension of $\ker A$ is called the nullity of $A.$

Example. Find vectors that span $\ker(A)$ and $\operatorname{im}(A)$ given $$ A= \begin{bmatrix} 1 & -1 & -1 & 1 & 1 \\ -1 & 1 & 0 & -2 & 2 \\ 1 & -1 & -2 & 0 & 3 \\ 2 & -2 & -1 & 3 & 4 \end{bmatrix}. $$

Solution. We will solve the system $Ax=0$ to find $\ker(A).$ Using the augmented matrix and elementary row operations, we find $$ \operatorname{rref}([\,A \mid 0\,]) = \begin{bmatrix} 1 & -1 & 0 & 2 & 0 & 0 \\ 0 & 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix}. $$ So the solutions to the system $Ax=0$ are $$ \begin{bmatrix}x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5\end{bmatrix} = \begin{bmatrix}s-2t \\ s \\ -t \\ t \\ 0\end{bmatrix} = s\begin{bmatrix}1 \\ 1 \\ 0 \\ 0 \\ 0\end{bmatrix} + t\begin{bmatrix}-2 \\ 0 \\ -1 \\ 1 \\ 0\end{bmatrix}:=s{u}+t{v} $$ where $s,t\in \mathbb{R}.$ Therefore $\ker(A)=\operatorname{span}({u},{v}).$ Since $\operatorname{im}(A)$ is the span of the column vectors of $A$, we let ${v}_1,{v}_2,{v}_3,{v}_4,{v}_5$ be the column vectors of $A.$ From above, $\operatorname{im}(A) = \operatorname{span}({v}_1,{v}_2,{v}_3,{v}_4,{v}_5).$ Using $\operatorname{rref}(A)$ as a guide, we notice ${v}_2=(-1){v}_1$, so we eliminate ${v}_2.$ Also, ${v}_4=(2){v}_1+{v}_3$, so we also eliminate ${v}_4.$ Therefore, $\operatorname{im}(A)=\operatorname{span}({v}_1,{v}_3,{v}_5).$
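A quick SymPy check of this example (the library is an assumption of this sketch; the matrix is the one given above):

```python
# Verifying nullity and rank for the 4x5 example matrix.
from sympy import Matrix, zeros

A = Matrix([[1, -1, -1, 1, 1],
            [-1, 1, 0, -2, 2],
            [1, -1, -2, 0, 3],
            [2, -2, -1, 3, 4]])
null_basis = A.nullspace()
print(len(null_basis))                                # 2 -> nullity is 2
print(all(A * v == zeros(4, 1) for v in null_basis))  # True: each is in ker(A)
print(A.rank())                                       # 3 pivot columns
```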

Example. Give an example of a linear transformation whose kernel is the line spanned by ${w}=\begin{bmatrix}-1 \\ 1 \\ 2\end{bmatrix}.$

Solution. Considering the intersection of the planes $x+y=0$ and $2x+z=0$, we try the linear transformation $$ T({x}) =T\begin{bmatrix}x \\ y \\ z\end{bmatrix} = \begin{bmatrix}x+y \\ 2x+z\end{bmatrix} =\begin{bmatrix} 1 & 1 & 0\\ 2 & 0 & 1 \end{bmatrix}{x}:=A{x}. $$ To find the kernel of $T$ we solve $A x= 0.$ Since $$ \operatorname{rref}(A)=\begin{bmatrix} 1 & 0 & 1/2 \\ 0 & 1 & -1/2 \end{bmatrix} $$ the solutions are of the form $t {w}$ where $t$ is a real number. Therefore $\ker(T)=\operatorname{span}({w})$, and $T$ is one such linear transformation.

Example. Express the line $L$ in $\mathbb{R}^3$ spanned by the vector $$
{w}=\begin{bmatrix} 1 \\1 \\ 1 \end{bmatrix} $$ as the image of a matrix $A$ and as the kernel of a matrix $B.$

Solution. Let $A$ be the $3\times 1$ matrix whose only column is ${w}$; then $L=\operatorname{im}(A)=\operatorname{span}({w})$, so $A$ is the first requested matrix. Considering the intersection of the planes $x=y$ and $y=z$, we try to find $B$ using the linear transformation $$ T({x}) =T\begin{bmatrix}x \\ y \\ z\end{bmatrix}=\begin{bmatrix}x-y \\ y-z\end{bmatrix} =\begin{bmatrix} 1 & -1 & 0\\ 0 & 1 & -1 \end{bmatrix}{x} := B {x}. $$ To find the kernel of $T$ we solve $B x= 0.$ Since
$$ \operatorname{rref}(B)=\begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \end{bmatrix} $$ the solutions are of the form $t {w}$ where $t$ is a real number. Therefore $\ker(B)=\operatorname{span}({w})=L$, and $B$ is the other requested matrix.
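Both claims can be verified with SymPy (our assumption): the column space of the one-column matrix and the nullspace of $B$ should each be the span of $w.$

```python
# Checking that L = im(A) = ker(B) = span(w) for the matrices above.
from sympy import Matrix, zeros

w = Matrix([1, 1, 1])
A = w                                  # 3x1 matrix whose only column is w
B = Matrix([[1, -1, 0], [0, 1, -1]])
print(A.columnspace())                 # [w] -> im(A) = span(w)
print(B.nullspace())                   # [w] -> ker(B) = span(w)
print(B * w == zeros(2, 1))            # True: w lies in ker(B)
```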

Example. Find a basis for the kernel and the image of the linear transformation defined by $$ T_1 = \left \{ \begin{array}{rl} y_1 = & x_1+x_2+3x_3 \\ y_2 = & 2x_1+x_2+4x_3 \end{array} \right . \quad \text{with} \quad \text{rref}(T_1) = \begin{bmatrix} 1 & 0 &1 \\ 0 & 1 & 2 \end{bmatrix}. $$

Solution. Let $A$ be the matrix of $T_1.$ All solutions to the system $Ax=0$ are $$ \begin{bmatrix}x_1 \\ x_2 \\ x_3\end{bmatrix}=t\begin{bmatrix}-1 \\ -2 \\ 1\end{bmatrix} $$ where $t\in\mathbb{R}.$ Since $\begin{bmatrix}-1 \\ -2 \\ 1\end{bmatrix}$ is linearly independent and spans the kernel, it forms a basis of $\ker(A).$ Since the columns of $A$ span the image of $A$ and $$ \begin{bmatrix}3 \\ 4\end{bmatrix} =(1)\begin{bmatrix}1 \\ 2\end{bmatrix}+(2)\begin{bmatrix}1 \\ 1\end{bmatrix} $$ the vector $\begin{bmatrix}3 \\ 4\end{bmatrix}$ is redundant. Since the remaining vectors $\begin{bmatrix}1 \\ 2\end{bmatrix}$ and $\begin{bmatrix}1 \\ 1\end{bmatrix}$ are linearly independent and span $\operatorname{im}(A)$, they form a basis for $\operatorname{im}(A).$

Example. Find a basis for the kernel and the image of the linear transformation defined by $$ T_2 = \begin{cases} y_1= & x_1 +3x_2 +9x_3 \\
y_2= & 4x_1+5 x_2 +8x_3 \\ y_3= & 7x_1+6 x_2 +3x_3 \\ \end{cases} \quad \text{with} \quad \text{rref}(T_2) = \begin{bmatrix} 1 & 0 & -3 \\ 0 & 1 & 4 \\ 0 & 0 & 0 \end{bmatrix} $$

Solution. To find a basis of the kernel we solve $Ax = 0$ where $A$ is the matrix of the given transformation. Since $$ \operatorname{rref}(A) = \begin{bmatrix} 1 & 0 &-3 \\ 0 & 1 & 4 \\ 0 & 0 & 0 \end{bmatrix} $$ all solutions have the form $\begin{bmatrix}3t \\ -4t \\ t\end{bmatrix}.$ Therefore a basis of the kernel of $A$ is given by the vector $\begin{bmatrix}3 \\ -4 \\ 1\end{bmatrix}.$ In the original matrix $A$ the third column is redundant since $$ \begin{bmatrix}9 \\ 8 \\ 3\end{bmatrix} = (-3)\begin{bmatrix}1 \\ 4 \\ 7\end{bmatrix}+(4)\begin{bmatrix}3 \\ 5 \\ 6\end{bmatrix}, $$ and since the vectors $\begin{bmatrix}1 \\ 4 \\ 7\end{bmatrix}$ and $\begin{bmatrix}3 \\ 5 \\ 6\end{bmatrix}$ are linearly independent and span $\operatorname{im}(A)$, they form a basis of $\text{im}(A).$
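A quick SymPy confirmation of the basis computations for $T_2$ (SymPy is an assumption; the matrix is read off from the system above):

```python
# Checking rref, kernel basis, and image basis for the matrix of T_2.
from sympy import Matrix

A = Matrix([[1, 3, 9], [4, 5, 8], [7, 6, 3]])
print(A.rref()[0])      # the reduced row-echelon form quoted above
print(A.nullspace())    # [(3, -4, 1)] -> basis of ker(A)
print(A.columnspace())  # the first two columns -> basis of im(A)
```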

Example. Find a basis for the kernel and the image of the linear transformation defined by $$ T_3 = \begin{cases} y_1= & 4x_1+8 x_2 +x_3 +x_4+6x_5 \\ y_2= & 3x_1+6 x_2 +x_3 +2x_4+5x_5 \\ y_3= & 2x_1+4 x_2 +x_3 +9x_4+10x_5 \\ y_4= & x_1+ 2x_2 + 3x_3 +2x_4 \\ \end{cases} $$ with $$ \operatorname{rref}(T_3) = \begin{bmatrix} 1 & 2 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 &1 \end{bmatrix}. $$

Solution. The solutions to the system $Ax=0$ are $$ x=\begin{bmatrix}x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5\end{bmatrix}=t \begin{bmatrix}-2 \\ 1 \\ 0 \\ 0 \\ 0\end{bmatrix}=t v \qquad \text{where $t\in\mathbb{R}.$} $$ The vector $v$ is linearly independent and spans $\ker(A)$; thus it forms a basis for $\ker(A).$ Since $$ \begin{bmatrix}8 \\ 6 \\ 4 \\ 2\end{bmatrix} =2\begin{bmatrix}4 \\ 3 \\ 2 \\ 1\end{bmatrix} $$ the vector $\begin{bmatrix}8 \\ 6 \\ 4 \\ 2\end{bmatrix}$ is redundant, and since the remaining column vectors of $A$ are linearly independent and span $\operatorname{im}(A)$, the vectors $\begin{bmatrix}4 \\ 3 \\ 2 \\ 1\end{bmatrix}$, $\begin{bmatrix}1 \\ 1 \\ 1 \\ 3\end{bmatrix}$, $\begin{bmatrix}1 \\ 2 \\ 9 \\ 2\end{bmatrix}$, and $\begin{bmatrix}6 \\ 5 \\ 10 \\ 0\end{bmatrix}$ form a basis of $\operatorname{im}(A).$

Example. Show $\ker(A)\neq \ker(B)$ where $$ A= \begin{bmatrix} 1 & 0 & 2 & 0 & 4 & 0 \\ 0 & 1 & 3 & 0 & 5 & 0 \\ 0 & 0 & 0 & 1 & 6 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix} $$ $$ B= \begin{bmatrix} 1 & 0 & 2 & 0 & 0 & 4 \\
0 & 1 & 3 & 0 & 0 & 5 \\ 0 & 0 & 0 & 1 & 0 & 6 \\ 0 & 0 & 0 & 0 & 1 & 7 \end{bmatrix}. $$

Solution. We solve the system $Ax=0$; written out, the solutions are $$ \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \\ x_6 \end{bmatrix} = s \begin{bmatrix} -4 \\ -5 \\ 0 \\ -6 \\ 1 \\ 0 \end{bmatrix} +t \begin{bmatrix} -2 \\ -3 \\ 1\\ 0 \\ 0 \\ 0 \end{bmatrix} =sv_1+tv_2 \quad \text{ where $s,t\in \mathbb{R}$}. $$ Since $v_1$ and $v_2$ are linearly independent and span $\ker(A)$, they form a basis of the kernel of $A.$ By noticing $B v_1\neq 0$, we conclude $\ker(A)\neq \ker(B).$
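The comparison can also be made directly in SymPy (our assumption) by hunting for a kernel vector of $A$ that $B$ does not send to zero:

```python
# If some v in ker(A) has B v != 0, then ker(A) != ker(B).
from sympy import Matrix, zeros

A = Matrix([[1, 0, 2, 0, 4, 0],
            [0, 1, 3, 0, 5, 0],
            [0, 0, 0, 1, 6, 0],
            [0, 0, 0, 0, 0, 1]])
B = Matrix([[1, 0, 2, 0, 0, 4],
            [0, 1, 3, 0, 0, 5],
            [0, 0, 0, 1, 0, 6],
            [0, 0, 0, 0, 1, 7]])
witnesses = [v for v in A.nullspace() if B * v != zeros(4, 1)]
print(len(witnesses) > 0)   # True -> the kernels differ
```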

Theorem. For any matrix $A$, $\dim(\operatorname{im} A )=\operatorname{rank}(A).$

The Rank-Nullity Theorem

The Rank-Nullity Theorem is sometimes called the Fundamental Theorem of Linear Algebra.

Theorem. (Rank-Nullity) Let $T$ be a linear transformation from $\mathbb{R}^m$ to $ \mathbb{R}^n$ with $n\times m$ matrix $A.$ Then $$ \operatorname{rank}(A) + \operatorname{nullity}(A) =\dim(\operatorname{im} A)+\dim(\ker A)=m. $$

Proof. Let ${y}=A{x}$ be the corresponding system of linear equations. Recall that for any linear system with $m$ variables, \begin{equation*} \label{mvar} \begin{pmatrix} \text{ number } \\ \text{ of free } \\ \text{ variables } \end{pmatrix} = \begin{pmatrix} \text{ total } \\ \text{ number }\\ \text{ of variables } \end{pmatrix} - \begin{pmatrix} \text{ number } \\ \text{ of leading }\\ \text{ variables } \end{pmatrix} = m-\operatorname{rank}(A). \end{equation*} Moreover, the number of free variables equals the dimension of the kernel of $A$, since each free variable contributes one vector to a spanning set of $\ker(A).$ Thus $\operatorname{nullity}(A)=m-\operatorname{rank}(A)$, and we arrive at the conclusion $\operatorname{rank}(A)+\operatorname{nullity}(A)=m.$
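The identity is easy to sanity-check numerically; a SymPy sketch on a matrix of our own choosing:

```python
# rank(A) + nullity(A) should equal the number of columns m.
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 6]])   # a 2x3 matrix, so m = 3
rank = A.rank()                      # dimension of im(A)
nullity = len(A.nullspace())         # dimension of ker(A)
print(rank, nullity)                 # 2 1
print(rank + nullity == A.cols)      # True: rank + nullity = m
```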

Theorem. The vectors ${v}_1,\ldots,{v}_n$ in $\mathbb{R}^n$ form a basis of $\mathbb{R}^n$ if and only if the matrix whose columns consists of ${v}_1, \ldots,{v}_n$ is invertible.

For example, consider the matrix $$ A= \begin{bmatrix} 1 & 2 & 1 & 2 \\ 1 & 2 & 2 & 3 \\ 1 & 2 & 3 & 4 \end{bmatrix}. $$ What is the smallest number of vectors needed to span the image of $A$? Of course we know, $$ \operatorname{im}(A)=\operatorname{span}\left(\begin{bmatrix}1 \\ 1 \\ 1\end{bmatrix},\begin{bmatrix}2 \\ 2 \\ 2\end{bmatrix},\begin{bmatrix}1 \\ 2 \\ 3\end{bmatrix},\begin{bmatrix}2 \\ 3 \\ 4\end{bmatrix}\right). $$ However, it is easy to show that $\begin{bmatrix}2 \\ 3 \\ 4\end{bmatrix}$ and $\begin{bmatrix}2 \\ 2 \\ 2\end{bmatrix}$ are redundant; and that the remaining vectors are linearly independent. Thus, $$ \operatorname{im}(A)=\operatorname{span}\left(\begin{bmatrix}1 \\ 1 \\ 1\end{bmatrix},\begin{bmatrix}1 \\ 2 \\ 3\end{bmatrix}\right). $$ Clearly, the image of the linear transformation defined by $A$ is more easily understood by having a spanning set of linearly independent vectors.
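SymPy's `columnspace` returns exactly such a set: the pivot columns, which are linearly independent and span the image (a sketch, assuming SymPy):

```python
# The pivot columns of A form a minimal spanning set (a basis) of im(A).
from sympy import Matrix

A = Matrix([[1, 2, 1, 2], [1, 2, 2, 3], [1, 2, 3, 4]])
basis = A.columnspace()   # pivot columns of A
print(basis)              # the columns (1, 1, 1) and (1, 2, 3)
```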

Exercises on the Kernel of a Matrix

Exercise. Find the reduced row-echelon form of the matrix $$ A=
\begin{bmatrix} 1 & 2 & 3 & 2 & 1 \\ 3 & 6 & 9 & 6 & 3 \\ 1 & 2 & 4 & 1 & 2 \\ 2 & 4 & 9 & 1 & 2 \end{bmatrix}. $$ Find a basis and state the dimension for the image and kernel of $A.$

Exercise. If possible, find a $3\times 3$ matrix such that $\operatorname{im} A=\ker A.$

Exercise. If possible, find a $4\times 4$ matrix such that $\operatorname{im} A=\ker A.$

Exercise. Give an example of a $4\times 5$ matrix $A$ with $\dim(\ker A)=3.$

David A. Smith at Dave4Math

David Smith (Dave) has a B.S. and M.S. in Mathematics and has enjoyed teaching precalculus, calculus, linear algebra, and number theory at both the junior college and university levels for over 20 years. David is the founder and CEO of Dave4Math.
