Linear Transformation (and Characterization)

I discuss linear transformations between finite-dimensional vector spaces over the real numbers. In particular, I prove that the definition of a linear transformation (the existence of a matrix representation) is equivalent to the alternative definition (preserving the vector space operations).

Definition of Linear Transformation

A linear transformation is a function of the form $y =A x$ where $A$ is an $n\times m$ matrix. More specifically, a linear transformation is a function that assigns to each $x\in \mathbb{R}^m$ a unique $y\in \mathbb{R}^n$, and this assignment is determined by a matrix $A.$ When $A$ is the identity matrix and $T(x)=A x$, we call $T$ the identity transformation.

Definition. A function $T$ from $\mathbb{R}^m$ to $\mathbb{R}^n$ is called a linear transformation if there exists an $n\times m$ matrix $A$ such that $T(x)=A x$ for all $x$ in the vector space $\mathbb{R}^m.$

Lemma. Let $T$ be a linear transformation from $\mathbb{R}^m$ to $\mathbb{R}^n$, then the matrix of $T$ is \begin{equation} \label{trancol} A=\begin{bmatrix} | & & | \\ T(e_1) & \cdots & T(e_m) \\ | & & | \end{bmatrix} \end{equation} where $e_i$ (for $1\leq i \leq m$) are the standard vectors.

Proof. Suppose $T$ is a linear transformation from $\mathbb{R}^m$ to $\mathbb{R}^n$, then there exists an $n\times m$ matrix $A$ such that $T(x)=Ax$ for all $x\in \mathbb{R}^m.$ Let $e_1, \dots, e_m$ be the standard vectors of $\mathbb{R}^m$ and let $A=[a_{ij}]$, then

\begin{align} T(e_1) = A e_1 = \begin{bmatrix} a_{11} & \cdots & a_{1m} \\ \vdots & \ddots & \vdots \\ a_{n1} & \cdots & a_{nm} \end{bmatrix} \begin{bmatrix}1 \\ \vdots \\ 0 \end{bmatrix} = \begin{bmatrix}a_{11}\\ \vdots \\ a_{n1}\end{bmatrix} \end{align} $$ \vdots $$ $$ T(e_m)=A e_m= \begin{bmatrix} a_{11} & \cdots & a_{1m} \\ \vdots & \ddots & \vdots \\ a_{n1} & \cdots & a_{nm} \end{bmatrix} \begin{bmatrix}0\\ \vdots \\ 1\end{bmatrix} =\begin{bmatrix}a_{1m} \\ \vdots \\ a_{nm} \end{bmatrix} $$

which are the columns of the matrix $A.$
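As a quick numerical illustration of the lemma, here is a minimal sketch (assuming NumPy is available; the $3\times 2$ matrix is an arbitrary choice of mine) that checks each $T(e_i)$ against the corresponding column of $A$:

```python
import numpy as np

# An arbitrary 3x2 matrix chosen only for illustration; T(x) = A x.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

def T(x):
    return A @ x

# The standard vectors e_1, e_2 of R^2 are the columns of the identity matrix.
E = np.eye(2)
for i in range(2):
    # T(e_i) equals the i-th column of A.
    assert np.allclose(T(E[:, i]), A[:, i])
print("Each T(e_i) matches the i-th column of A.")
```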

Example. Determine the linear transformation $T$ given by the system of linear equations: $$ \begin{array}{l} y_1= 7x_1+3x_2-9x_3+8x_4 \\ y_2 = 6x_1+2x_2-8x_3+7x_4 \\ y_3 = 8x_1+4x_2+7x_4 \end{array} $$

Solution. The matrix of the linear transformation is $A= \begin{bmatrix} 7 & 3 & -9 & 8 \\ 6 & 2 & -8 & 7 \\ 8 & 4 & 0 & 7 \end{bmatrix} $ since $$ T(e_1)=\begin{bmatrix} 7\\ 6\\ 8\end{bmatrix}, \qquad T(e_2)=\begin{bmatrix} 3\\ 2\\ 4 \end{bmatrix}, \qquad T(e_3) = \begin{bmatrix}-9\\ -8\\ 0 \end{bmatrix}, \qquad T(e_4)=\begin{bmatrix}8\\ 7\\ 7\end{bmatrix}. $$ Notice $T$ is a linear transformation from $\mathbb{R}^4$ to $\mathbb{R}^3$ and $A$ is a $3\times 4$ matrix.
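As a sanity check, the following sketch (NumPy assumed; the test vector $x$ is arbitrary) verifies that multiplying this $A$ by $x$ reproduces the three equations:

```python
import numpy as np

A = np.array([[7, 3, -9, 8],
              [6, 2, -8, 7],
              [8, 4,  0, 7]])

x = np.array([1, 2, 3, 4])   # arbitrary test vector in R^4

# Evaluate the right-hand sides of the three equations directly ...
y_equations = np.array([7*x[0] + 3*x[1] - 9*x[2] + 8*x[3],
                        6*x[0] + 2*x[1] - 8*x[2] + 7*x[3],
                        8*x[0] + 4*x[1]           + 7*x[3]])

# ... and compare with the matrix-vector product T(x) = A x.
assert np.array_equal(A @ x, y_equations)
print(A @ x)   # [18 14 44]
```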

Example. Is the transformation $T({x})={v}\cdot {x}$ from $\mathbb{R}^3$ to $\mathbb{R}$ a linear transformation? If so, find the matrix of $T.$

Solution. Let ${v}=\begin{bmatrix}v_1 \\ v_2 \\ v_3\end{bmatrix}.$ Then $$ T({x})={v}\cdot {x}=\begin{bmatrix}v_1\\ v_2\\ v_3\end{bmatrix} \cdot \begin{bmatrix}x_1 \\ x_2 \\ x_3 \end{bmatrix} = v_1 x_1+v_2 x_2+v_3 x_3 = \begin{bmatrix}v_1 & v_2 & v_3 \end{bmatrix}{x}.$$ Therefore, $T$ is a linear transformation with matrix $$ \begin{bmatrix}v_1 & v_2 & v_3 \end{bmatrix}, $$ as desired.
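The same calculation can be illustrated numerically; in the sketch below (NumPy assumed; the particular $v$ and $x$ are arbitrary choices) the dot product agrees with the $1\times 3$ matrix acting on $x$:

```python
import numpy as np

v = np.array([2.0, -1.0, 4.0])   # arbitrary fixed vector
x = np.array([1.0,  3.0, 0.5])   # arbitrary input vector

A = v.reshape(1, 3)              # the 1x3 matrix [v1 v2 v3]

# The dot product v . x agrees with the matrix-vector product A x.
assert np.isclose(np.dot(v, x), (A @ x)[0])
print(np.dot(v, x))              # 1.0
```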

Example. Is the transformation $T({x})={v}\times {x}$ from $\mathbb{R}^3$ to $\mathbb{R}^3$ a linear transformation? If so, find the matrix of $T.$

Solution. Let ${v}=\begin{bmatrix}v_1 \\ v_2 \\ v_3 \end{bmatrix}.$ Then $$ T({x})={v}\times {x} = \begin{bmatrix}v_1\\ v_2 \\ v_3 \end{bmatrix} \times \begin{bmatrix} x_1 \\ x_2\\ x_3\end{bmatrix} =\begin{bmatrix}v_2x_3 -v_3x_2\\ v_3x_1-v_1x_3 \\ v_1 x_2-v_2x_1 \end{bmatrix} =\begin{bmatrix}0 & -v_3 & v_2 \\ v_3 & 0 & -v_1 \\ -v_2 & v_1 & 0 \end{bmatrix}{x}. $$ Therefore, $T$ is a linear transformation with matrix $$ \begin{bmatrix}0 & -v_3 & v_2 \\ v_3 & 0 & -v_1 \\ -v_2 & v_1 & 0 \end{bmatrix}, $$ as desired.
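A brief numerical check of the skew-symmetric matrix form (NumPy assumed; $v$ and $x$ are arbitrary choices):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])    # arbitrary fixed vector
x = np.array([4.0, 5.0, 6.0])    # arbitrary input vector

# The skew-symmetric matrix of v, as derived above.
A = np.array([[    0, -v[2],  v[1]],
              [ v[2],     0, -v[0]],
              [-v[1],  v[0],     0]])

# The cross product v x x agrees with the matrix-vector product A x.
assert np.allclose(np.cross(v, x), A @ x)
print(np.cross(v, x))            # [-3.  6. -3.]
```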

Characterization of a Linear Transformation

Theorem. A function $T$ from $\mathbb{R}^m$ to $\mathbb{R}^n$ is a linear transformation if and only if both of the following hold:

(1) $T(v+ w)=T(v)+T(w)$ for all vectors $v$ and $w$ in $\mathbb{R}^m$, and

(2) $T(k v)=k T(v)$ for all vectors $v$ in $\mathbb{R}^m$ and all scalars $k.$

Proof. Suppose $T$ is a linear transformation from $\mathbb{R}^m$ to $\mathbb{R}^n$, then there exists an $n\times m$ matrix $A$ such that $T(x)=Ax$ for all $x\in \mathbb{R}^m.$ The proof of each part follows.

  • Let $v, w\in \mathbb{R}^m.$ Then $T(v+w)=A (v+w)=A v+A w=T(v)+T(w).$
  • Let $v\in \mathbb{R}^m$ and $k\in \mathbb{R}.$ Then $T(k v)=A(k v)=k(A v)=k T(v).$

Now suppose both (1) and (2) hold. We need to find a matrix $A$ such that $T(x) =A x$ for all $x\in \mathbb{R}^m.$ We can use the standard vectors $e_1,\dots,e_m$ in $\mathbb{R}^m$: writing $x=x_1e_1+\cdots+x_me_m$ and using \eqref{trancol}, \begin{align*} T(x) & = T(x_1 e_1+\cdots + x_m e_m) =T(x_1e_1)+\cdots +T(x_me_m) \\ & = x_1 T(e_1)+\cdots +x_m T(e_m) = \begin{bmatrix} | & & | \\ T(e_1) & \cdots & T(e_m) \\ | & & | \end{bmatrix} \begin{bmatrix}x_1 \\ \vdots \\ x_m \end{bmatrix} = Ax \end{align*} as desired.
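The converse direction can be illustrated numerically: starting from a function that satisfies (1) and (2), building the matrix from $T(e_1),\dots,T(e_m)$ recovers $T.$ The sketch below (NumPy assumed; the 90-degree rotation of $\mathbb{R}^2$ is just an illustrative choice) does exactly that:

```python
import numpy as np

# A function satisfying (1) and (2): rotation of R^2 by 90 degrees,
# chosen only for illustration.
def T(x):
    return np.array([-x[1], x[0]])

m = 2
# Build A column by column from T(e_1), ..., T(e_m), as in the proof.
A = np.column_stack([T(e) for e in np.eye(m)])

x = np.array([3.0, -1.0])          # arbitrary test vector
assert np.allclose(T(x), A @ x)    # T(x) = A x
print(A)                           # [[ 0. -1.]
                                   #  [ 1.  0.]]
```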

Example. Let $T:\mathbb{R}^3\to\mathbb{R}$ be a linear transformation with $T\begin{bmatrix} 3\\ -1\\ 2\end{bmatrix}=5$ and $T\begin{bmatrix}1\\ 0\\ 1 \end{bmatrix}=2.$ Find $T\begin{bmatrix} -1\\ 1\\ 0 \end{bmatrix}.$

Solution. First write $\begin{bmatrix} -1\\ 1\\ 0 \end{bmatrix}$ as a linear combination of $\begin{bmatrix} 3 \\ -1 \\ 2 \end{bmatrix}$ and $\begin{bmatrix}1\\ 0\\ 1\end{bmatrix}$: $$ \begin{bmatrix} -1\\ 1\\ 0 \end{bmatrix}=
(-1)\begin{bmatrix} 3\\ -1\\ 2 \end{bmatrix} + 2\begin{bmatrix} 1\\ 0\\ 1 \end{bmatrix}. $$ Therefore, by linearity, $$ T\begin{bmatrix} -1\\ 1\\ 0 \end{bmatrix} =T\left((-1)\begin{bmatrix} 3 \\ -1\\ 2 \end{bmatrix}+2\begin{bmatrix} 1\\ 0\\ 1 \end{bmatrix}\right) =(-1)\, T\begin{bmatrix} 3\\ -1\\ 2 \end{bmatrix}+2 \, T\begin{bmatrix} 1\\ 0\\ 1\end{bmatrix} =(-1)(5)+2(2)=-1, $$ as desired.
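The coefficients $-1$ and $2$ can also be found by solving a small linear system; here is a sketch (NumPy assumed; the variable names are mine):

```python
import numpy as np

u1 = np.array([3.0, -1.0, 2.0])
u2 = np.array([1.0,  0.0, 1.0])
b  = np.array([-1.0, 1.0, 0.0])

# Solve c1*u1 + c2*u2 = b; the 3x2 system is solved by least squares,
# and a zero residual confirms the combination is exact.
M = np.column_stack([u1, u2])
coeffs, residual, *_ = np.linalg.lstsq(M, b, rcond=None)
print(coeffs)                          # [-1.  2.]

# Apply linearity: T(b) = c1*T(u1) + c2*T(u2), with T(u1) = 5 and T(u2) = 2.
print(coeffs @ np.array([5.0, 2.0]))   # -1.0
```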

Example. Let $T:\mathbb{R}^2\to\mathbb{R}^2$ be defined by $T\begin{bmatrix} x_1\\ x_2 \end{bmatrix} = \begin{bmatrix} 2x_1\\ x_2^2\end{bmatrix}.$ Is $T$ a linear transformation?

Solution. Let $\alpha=\begin{bmatrix}x_1\\ x_2\end{bmatrix}$ and $\beta=\begin{bmatrix} y_1\\ y_2\end{bmatrix}.$ Then

\begin{align} T(\alpha+\beta) & =T\left(\begin{bmatrix}x_1\\ x_2 \end{bmatrix}+\begin{bmatrix} y_1\\ y_2 \end{bmatrix}\right) =T\begin{bmatrix}x_1+y_1\\ x_2+y_2\end{bmatrix} =\begin{bmatrix}2(x_1+y_1)\\ (x_2+y_2)^2\end{bmatrix} \end{align}

On the other hand,

\begin{align} T(\alpha)+T(\beta) & =T\begin{bmatrix} x_1\\ x_2 \end{bmatrix}+T\begin{bmatrix} y_1\\ y_2\end{bmatrix} =\begin{bmatrix} 2x_1\\ x_2^2 \end{bmatrix} + \begin{bmatrix} 2y_1 \\ y_2^2 \end{bmatrix} = \begin{bmatrix}2(x_1+y_1)\\ x_2^2+y_2^2\end{bmatrix} \end{align}

Since $(x_2+y_2)^2\neq x_2^2+y_2^2$ in general (for example, when $x_2=y_2=1$), we have $T(\alpha+\beta)\neq T(\alpha)+T(\beta)$, and we conclude that $T$ is not a linear transformation.
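A single numerical counterexample already settles the question; in the sketch below (NumPy assumed; the particular $\alpha$ and $\beta$ are arbitrary) the two sides disagree in the second component:

```python
import numpy as np

def T(x):
    return np.array([2*x[0], x[1]**2])

a = np.array([1.0, 1.0])     # arbitrary choice of alpha
b = np.array([2.0, 1.0])     # arbitrary choice of beta

print(T(a + b))              # [6. 4.]
print(T(a) + T(b))           # [6. 2.]  -- the second components differ
```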

Theorem. Let $T$ be a linear transformation from $\mathbb{R}^m$ to $\mathbb{R}^n.$ Then each of the following holds:

(1) If ${0}_m$ is the zero vector in $\mathbb{R}^m$, then $T({0}_m)$ is the zero vector in $\mathbb{R}^n.$

(2) For all ${v}$ in $\mathbb{R}^m$, $T(-{v})=-T({v}).$

(3) For all ${u}, {v}$ in $\mathbb{R}^m$, $T({u}-{v})=T({u})-T({v}).$

(4) For all scalars $a_1,\dots,a_k\in \mathbb{R}$ and all vectors ${v}_1, \dots, {v}_k\in \mathbb{R}^m$, $$ T(a_1{v}_1+a_2{v}_2+\cdots + a_k{v}_k) =a_1T({v}_1)+a_2T({v}_2)+\cdots+a_kT({v}_k).$$
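These properties are easy to check numerically for any matrix transformation; a brief sketch (NumPy assumed; the matrix and vectors are arbitrary choices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])    # arbitrary matrix; T(x) = A x

def T(x):
    return A @ x

u = np.array([0.5,  4.0])
v = np.array([1.5, -2.0])

assert np.allclose(T(np.zeros(2)), np.zeros(2))    # T(0_m) = 0_n
assert np.allclose(T(-v), -T(v))                   # T(-v) = -T(v)
assert np.allclose(T(u - v), T(u) - T(v))          # T(u - v) = T(u) - T(v)
print("All three properties hold for this A, u, and v.")
```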

