Matrices and Vectors (and their Linear Combinations)

Okay, so you have some familiarity with matrices. In this article, I go more in-depth. I discuss matrix operations and work through several proofs concerning their basic properties. After that, I explain linear combinations of vectors and provide many examples and exercises.

We discuss several types of matrices and matrix operations. Perhaps the most important concept in linear algebra is that of linear combination. We study the connection between vectors, matrices, and linear combinations.

Special Types of Matrices

The entries of an $n\times m$ matrix $A$ are denoted by $a_{ij}$ where $1\leq i\leq n$ and $1\leq j \leq m.$ For example, the matrix $$ \begin{bmatrix}a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23}\end{bmatrix} $$ is a $2\times 3$ matrix with $n=2$ rows and $m=3$ columns. If $n=m$, then the matrix is called a square matrix.

Definition. Two matrices $A=[a_{ij}]$ and $B=[b_{ij}]$ of the same size are equal when $a_{ij}=b_{ij}$ for all $i,j.$

A square matrix $A=[a_{ij}]$ is called a diagonal matrix if $a_{ij}=0$ whenever $i\neq j.$ A square matrix $A=[a_{ij}]$ is called upper triangular (lower triangular) when $a_{ij}=0$ whenever $i>j$ ($i<j$). The zero matrix $0_{m\times n}$ has all entries zero, and the identity matrix $I_n$ has $a_{ii}=1$ for $i=1, 2, \ldots, n$ and all other entries zero. We remark that when $m$ and $n$ are understood from context, the subscripts are usually omitted. $$ 0_{m\times n}= \begin{bmatrix} 0 & \cdots & 0 \\ 0 & \cdots & 0 \\ \vdots & \cdots & \vdots \\ 0 & \cdots & 0 \end{bmatrix}_{m\times n} \qquad I_n= \begin{bmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0\\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{bmatrix}_{n\times n} $$

Consider the following matrices. $$ \begin{array}{lllll} A=\begin{bmatrix}1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix} & B=\begin{bmatrix} 2 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 0 \end{bmatrix} & C=\begin{bmatrix} 2 & 3 \\ 0 & 4 \end{bmatrix} & D=\begin{bmatrix}5 & 0 & 0 \\ 4 & 0 & 0 \\ 3 & 2 & 1\end{bmatrix} \end{array} $$ Notice matrices $B$, $C$, and $D$ are square and $A$ is not. Also notice the only diagonal matrix is $B$, matrices $B$ and $C$ are upper triangular, and matrices $B$ and $D$ are lower triangular (a diagonal matrix is both upper and lower triangular).
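If you like to experiment, these special matrices are easy to build numerically. The following is a minimal sketch, assuming Python with NumPy (an assumption on my part, not part of the original discussion).

```python
# Minimal NumPy sketch (assumed library) constructing the special matrices above.
import numpy as np

Z = np.zeros((4, 3))              # the zero matrix 0_{4x3}
I3 = np.eye(3)                    # the identity matrix I_3
B = np.diag([2, 3, 0])            # the diagonal matrix B from the example
C = np.array([[2, 3], [0, 4]])    # upper triangular
D = np.array([[5, 0, 0],
              [4, 0, 0],
              [3, 2, 1]])         # lower triangular

print(np.array_equal(np.triu(C), C))   # True: C is upper triangular
print(np.array_equal(np.tril(D), D))   # True: D is lower triangular
print(np.array_equal(np.triu(B), B) and np.array_equal(np.tril(B), B))  # True: B is both
```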

Matrix Operations

We will see that matrix addition has many of the same properties that scalar addition has; for example, commutativity, associativity, additive inverses, and the additive identity property.

Definition. Let $A=[a_{ij}]$ and $B=[b_{ij}]$ be matrices of the same size. The matrix sum of $A$ and $B$ is the matrix $C$ defined by $$ C=A+B=[a_{ij}]+[b_{ij}]=[a_{ij}+b_{ij}]. $$

Theorem. Let $A$, $B$, and $C$ denote arbitrary matrices such that the following operations are defined. Then the following hold.

(1) $A+B=B+A$,

(2) $A+(B+C)=(A+B)+C$,

(3) for each $A$, there exists $0$ such that $0+A=A$, and

(4) for each $A$, there exists a matrix $-A$ such that $A+(-A)=0.$

Proof. For the first part, let $A$ and $B$ denote matrices of the same size with entries $A_{ij}$ and $B_{ij}$, respectively. Then \begin{align*} (A+B)_{ij} & =A_{ij}+B_{ij} & \text{definition of matrix addition} \\ & =B_{ij}+A_{ij} & \text{scalar commutativity} \\ & =(B+A)_{ij} & \text{definition of matrix addition} \end{align*} Therefore, by the definition of matrix equality, $A+B=B+A$ follows.

For the second part, if matrix $C$ is also of the same size as $A$ and $B$ with entries $C_{ij}$ then, \begin{align*} (A+(B+C))_{ij} & =A_{ij}+\left(B+C\right)_{ij} & \text{definition of matrix addition} \\ & =A_{ij}+\left(B_{ij}+C_{ij}\right) & \text{definition of matrix addition} \\ & =\left(A_{ij}+B_{ij}\right)+C_{ij} & \text{scalar associativity} \\ & =\left(A+B\right)_{ij}+C_{ij} & \text{definition of matrix addition} \\ & =(\left(A+B\right)+C)_{ij} & \text{definition of matrix addition} \end{align*} Therefore, $A+(B+C)=(A+B)+C$ follows.

For the third part, let $A$ be an $m\times n$ matrix with entries $A_{ij}$ and let $0_{m\times n}$ denote the $m\times n$ matrix with all entries equal to the scalar additive identity denoted by $0.$ Denoting the entries of $0_{m\times n}$ by $0_{ij}$ we find \begin{align*} (0+A)_{ij}=0_{ij}+A_{ij} = A_{ij}. \end{align*} Therefore, by the definition of matrix equality, $0+A=A$ follows. The proof of the remaining part is left for the reader as an exercise.
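These properties are also easy to observe numerically. Here is a quick sanity check, assuming NumPy (not part of the original text).

```python
# NumPy sketch (assumed) checking the four properties of matrix addition
# on randomly generated 3x4 matrices.
import numpy as np

rng = np.random.default_rng(seed=1)
A, B, C = (rng.normal(size=(3, 4)) for _ in range(3))
Z = np.zeros((3, 4))

assert np.allclose(A + B, B + A)              # commutativity
assert np.allclose(A + (B + C), (A + B) + C)  # associativity
assert np.allclose(Z + A, A)                  # additive identity
assert np.allclose(A + (-A), Z)               # additive inverse
print("matrix addition properties verified")
```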

The matrix properties listed in Properties of Matrix Addition are called the commutative, associative, additive identity, and additive inverse properties for matrix addition, respectively.

Just as matrix addition has several properties in common with addition of scalars, scalar multiplication of matrices has several properties in common with multiplication of scalars.

Definition. If $A$ is any matrix and $k$ is any scalar, the scalar multiple $kA$ is the matrix obtained by multiplying each entry of $A$ by $k$, that is if $A=[a_{ij}]$, then $k A=[k a_{ij}].$

The following properties say that scalars and matrices distribute over each other and are associative with each other.

Theorem. Let $A$ and $B$ denote arbitrary matrices such that the following operations are defined. Let $k$ and $p$ denote arbitrary scalars. Then the following hold.

(1) $k(A+B)=k A+k B$

(2) $(k+p)A=k A+p A$

(3) $(kp)A=k(pA)$

(4) $A(kB)=(kA)B$

Proof. For the first part, let $A$ and $B$ be $n\times m$ matrices. Then, for $1\leq i \leq n$ and $1\leq j\leq m$ and an arbitrary scalar $k$ we find, \begin{align*} \left(k(A+B)\right)_{ij}& =k\left((A+B)_{ij}\right) & \text{definition of scalar multiplication} \\ & =k(A_{ij}+B_{ij})& \text{definition of matrix addition} \\ & =kA_{ij}+kB_{ij}& \text{distributive property of scalars} \\ & =(kA)_{ij}+(kB)_{ij}& \text{definition of scalar multiplication} \\ & =(kA+kB)_{ij}& \text{definition of matrix addition} \end{align*} Therefore, by the definition of matrix equality, $k(A+B)=k A+k B$ follows.

For the third part, let $A$ be an $n\times m$ matrix and let $k$ and $p$ be scalars. Then we find, \begin{align*} ((kp)A)_{ij}& =(kp)A_{ij}& \text{definition of scalar multiplication} \\ & =k(pA_{ij})& \text{associative property of scalars} \\ & =k\left((pA)_{ij}\right)& \text{definition of scalar multiplication} \\ & =(k(pA))_{ij}& \text{definition of scalar multiplication}\end{align*} Therefore, by the definition of matrix equality, $(kp)A=k(pA)$ follows. The proofs of the remaining parts are left for the reader as an exercise.

Example. If $A= \begin{bmatrix} 3 & -1 & 4 \\ 2 & 0 & 6 \end{bmatrix}$ and $B=\begin{bmatrix} 1 & 2 & -1\\ 0 & 3 & 2 \end{bmatrix}$, determine $5A$, $\frac{1}{2}B$, and $3A-2B.$ A quick computation reveals the following: $$ 5A= \begin{bmatrix} 15 & -5 & 20 \\ 10 & 0 & 30 \end{bmatrix} \quad \frac{1}{2}B= \begin{bmatrix} \frac{1}{2} & 1 & -\frac{1}{2} \\ 0 & \frac{3}{2} & 1 \end{bmatrix} \quad 3A-2B= \begin{bmatrix} 7 & -7 & 14 \\ 6 & -6 & 14 \end{bmatrix} $$ as needed.
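These computations are straightforward to reproduce numerically; here is a minimal sketch assuming NumPy (not part of the original text).

```python
# NumPy sketch (assumed) reproducing 5A, (1/2)B, and 3A - 2B from the example.
import numpy as np

A = np.array([[3, -1, 4], [2, 0, 6]])
B = np.array([[1, 2, -1], [0, 3, 2]])

print(5 * A)          # [[15 -5 20], [10  0 30]]
print(0.5 * B)        # [[0.5 1. -0.5], [0. 1.5 1. ]]
print(3 * A - 2 * B)  # [[ 7 -7 14], [ 6 -6 14]]
```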

Example. Let $A$, $B$, and $C$ be matrices of the same size. Simplify $$ 2(A+3C)-3(2C-B)-3[2(2A+B-4C)-4(A-2C)]. $$ Expanding and collecting terms gives $$ 2A+6C-6C+3B-3[4A+2B-8C-4A+8C]=2A+3B-6B=2A-3B. $$

Example. If $kA=0$, show that either $k=0$ or $A=0.$ Suppose $kA=0$ and $k\neq 0.$ Writing $A=[a_{ij}]$, we have $kA=[k a_{ij}]=0$, so $k a_{ij}=0$ for every $i$ and $j.$ Since $k\neq 0$, it follows that $a_{ij}=0$ for every $i$ and $j$, that is, $A=0.$ Therefore, either $k=0$ or $A=0.$

Example. Find $1\times 3$ matrices $X$ and $Y$ such that $$ \begin{cases} X+2Y= \begin{bmatrix} 1 & 3 & -2 \end{bmatrix} \\ X+Y= \begin{bmatrix} 2 & 0 & 1 \end{bmatrix}. \end{cases} $$ Subtracting the second equation from the first yields $Y= \begin{bmatrix} -1 & 3 & -3 \end{bmatrix} .$ Then $$ X=\begin{bmatrix} 2 & 0 & 1 \end{bmatrix} -Y= \begin{bmatrix} 3 & -3 & 4 \end{bmatrix}. $$

Definition. Let $A$ be an $m\times n$ matrix and $B$ an $n\times p$ matrix. Their matrix product $C=AB$ is the $m\times p$ matrix whose $(i,j)$th entry is the dot product of the $i$th row of $A$ with the $j$th column of $B$; that is, $$ c_{ij}=\sum_{k=1}^n a_{ik}b_{kj}. $$ Matrix multiplication is illustrated by the following equation.

\begin{align*} AB & = \begin{bmatrix} a_{ij} \end{bmatrix} \begin{bmatrix} b_{ij} \end{bmatrix} = \begin{bmatrix} \sum_{k=1}^n a_{1k}b_{k1} & \cdots & \sum_{k=1}^n a_{1k}b_{kp} \\ \vdots & \sum_{k=1}^n a_{ik}b_{kj} & \vdots \\ \sum_{k=1}^n a_{mk}b_{k1} & \cdots & \sum_{k=1}^n a_{mk}b_{kp} \end{bmatrix} \\ & = \begin{bmatrix} c_{ij} \end{bmatrix} =C \end{align*}

Example. Let $A$ and $B$ be defined as follows. \begin{equation*} \label{MatrixMult} A= \begin{bmatrix} 4 & -5 & 6 \\ 1 & 8 & -9 \end{bmatrix} \qquad \qquad B= \begin{bmatrix} 1 & 2 & -3 \\ -2 & -1 & 5 \\ -2 & 5 & -1 \end{bmatrix} \end{equation*} If possible, find $AB$ and $BA.$ Since $A$ is a $2\times 3$ matrix and $B$ is a $3\times 3$ matrix we see that $AB$ is defined as a $2\times 3$ matrix and $BA$ is not defined.

\begin{align*} AB & = \begin{bmatrix} \begin{bmatrix} 4 & -5 & 6 \end{bmatrix}\cdot\begin{bmatrix}1\\-2\\-2\end{bmatrix} & \begin{bmatrix} 4 & -5 & 6 \end{bmatrix}\cdot\begin{bmatrix}2\\-1\\5\end{bmatrix} & \begin{bmatrix} 4 & -5 & 6 \end{bmatrix}\cdot\begin{bmatrix}-3\\5\\-1\end{bmatrix} \hspace{5pt} \\ \begin{bmatrix} 1 & 8 & -9 \end{bmatrix}\cdot\begin{bmatrix}1\\-2\\-2\end{bmatrix} & \begin{bmatrix} 1 & 8 & -9 \end{bmatrix}\cdot\begin{bmatrix}2\\-1\\5\end{bmatrix} & \begin{bmatrix} 1 & 8 & -9 \end{bmatrix}\cdot\begin{bmatrix}-3\\5\\-1\end{bmatrix} \hspace{5pt} \end{bmatrix} \\ & = \begin{bmatrix} 2 & 43 & -43 \\ 3 &-51 & 46 \end{bmatrix} \end{align*}
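The product is easy to confirm numerically; a minimal sketch assuming NumPy (not part of the original text) follows.

```python
# NumPy sketch (assumed) computing AB for the matrices in this example.
import numpy as np

A = np.array([[4, -5, 6], [1, 8, -9]])
B = np.array([[1, 2, -3], [-2, -1, 5], [-2, 5, -1]])

print(A @ B)   # [[  2  43 -43], [  3 -51  46]]
# B @ A would raise a ValueError, since a 3x3 times 2x3 product is undefined.
```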

Notice that this example shows matrix multiplication is not commutative: here $AB$ is defined while $BA$ is not, and even when both products are defined they need not be equal. Even so, matrix multiplication has many useful properties, such as associativity, distributivity over addition, and identity properties.

Theorem. Let $A$, $B$, and $C$ denote arbitrary matrices such that the following operations are defined. Then the following hold.

(1) $A(BC)=(AB)C$

(2) $A(B+C)=AB+AC$

(3) $(B+C)A=BA+CA$

(4) $AI=A$

(5) $IA=A$

Proof. For the first part, suppose $B$ is an $n\times p$ matrix. Since $BC$ is defined, $C$ is a matrix with $p$ rows, and $BC$ has $n$ rows. Because $A(BC)$ is defined we may assume $A$ is an $m\times n$ matrix. Thus the product $AB$ exists and it is an $m\times p$ matrix, from which it follows that the product $(AB)C$ exists. To show that $A(BC)=(AB)C$ we must show that \begin{equation*} [A(BC)]_{ij}=[(AB)C]_{ij} \end{equation*} for arbitrary $i$ and $j.$ By definition of matrix multiplication,

\begin{align*} [A(BC)]_{ij} & = \sum_t A_{it}(BC)_{tj} = \sum_t A_{it} \sum_s B_{t s} C_{sj} \\ & = \sum_t \sum_s A_{it} B_{ts} C_{sj} =\sum_s \sum_t A_{it} B_{ts} C_{sj} \\ & =\sum_s \left( \sum_t A_{it} B_{ts} \right) C_{sj} =\sum_s (AB)_{is} C_{sj} =[(AB)C]_{ij}. \end{align*}

For the second part, let $A_{ij}$, $B_{ij}$, and $C_{ij}$ denote the entries of matrices $A$, $B$ and $C$ of sizes $m\times n$, $n\times p$, and $n\times p$, respectively. Then $B+C=[B_{ij}+C_{ij}]$, so the $(i,j)$th entry of $A(B+C)$ is \begin{align*} \sum_{k=1}^n A_{ik}\left(B_{kj}+C_{kj}\right) & =\sum_{k=1}^n \left(A_{ik} B_{kj}+A_{ik} C_{kj}\right) \\ & =\sum_{k=1}^n A_{ik} B_{kj}+\sum_{k=1}^n A_{ik} C_{kj}, \end{align*} which is the $(i,j)$th entry of $AB+AC$ because the sums on the right are the $(i,j)$th entries of $AB$ and $AC$, respectively. Hence $A(B+C)=AB+AC.$

For the third part, let $A=[a_{ij}]_{m\times n}.$ Then $$ AI= \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \ddots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}_{m\times n} \begin{bmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{bmatrix}_{n\times n}$$ and the $(i,j)$th entry of $AI$ is the $i$th row of $A$ times the $j$th column of $I$, that is

$$ \fbox{$\begin{matrix} a_{i1} & a_{i2} & \cdots & a_{ij} & \cdots & a_{in} \end{matrix}$} \quad \begin{array}{rl} \fbox{$ \begin{matrix} 0 \\ 0 \\ \vdots \\ 1 \\ \vdots \\ 0 \end{matrix} $} & \begin{matrix} \text{ } \\ \text{ } \\ \text{ } \\ \leftarrow \text{Position $j$} \\ \text{ } \\ \text{ } \end{matrix} \end{array} $$

So the $(i,j)$th entry of $AI$ is $a_{ij}$, the $(i,j)$th entry of $A.$ Thus, $AI=A.$ The proofs of the remaining parts are left for the reader as an exercise.

The matrix properties listed in Properties of Matrix Multiplication are called the associative, left distributive, right distributive, left identity, and right identity property for matrix multiplication, respectively.

Example. Find all matrices that commute with $ A=\begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}.$

Solution. Let $B=\begin{bmatrix} a & b \\ c & d\end{bmatrix}$ be a matrix that commutes with $A.$ Matrix multiplication yields $$ AB=\begin{bmatrix} a+2c & b+2d \\ c & d\end{bmatrix} \qquad \text{and} \qquad BA=\begin{bmatrix} a & 2a+b \\ c & 2c+d \end{bmatrix}. $$ Since $a+2c=a$ it follows that $c=0.$ Since $b+2d=2a+b$ it follows that $a=d.$ Thus all matrices that commute with $A$ have the form $ \begin{bmatrix} a & b \\ 0 & a \end{bmatrix}. $
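A quick numerical spot check, assuming NumPy (not part of the original solution), confirms that matrices of this form commute with $A.$

```python
# NumPy sketch (assumed): matrices of the form [[a, b], [0, a]] commute with
# A = [[1, 2], [0, 1]]; we spot-check a few random choices of a and b.
import numpy as np

A = np.array([[1., 2.], [0., 1.]])
rng = np.random.default_rng(0)
for _ in range(5):
    a, b = rng.normal(size=2)
    B = np.array([[a, b], [0., a]])
    assert np.allclose(A @ B, B @ A)
print("all sampled matrices of the form [[a, b], [0, a]] commute with A")
```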

Example. Write the matrix equation for the system $$ \begin{cases} 3x_1+x_2+7x_3+2x_4 =13 \\ 2x_1-4x_2+14x_3-x_4 =-10 \\ 5x_1+11x_2-7x_3+8x_4 =59 \\ 2x_1+5x_2-4x_3-3x_4=39. \end{cases} $$ Find the reduced row echelon form using Gauss-Jordan elimination. Solve the system. The matrix equation is $$ \begin{bmatrix} 3 & 1 & 7 & 2 \\ 2 & -4 & 14 & -1 \\ 5 & 11 & -7 & 8 \\ 2 & 5 & -4 & -3 \end{bmatrix} \begin{bmatrix}x_1\\ x_2\\ x_3\\ x_4\end{bmatrix} =\begin{bmatrix}13\\ -10\\ 59\\ 39\end{bmatrix}. $$ The reduced row echelon form of the augmented matrix and the solution set are as follows $$ \begin{bmatrix} 1 & 0 & 3 & 0 & 4 \\ 0 & 1 & -2 & 0 & 5 \\ 0 & 0 & 0 & 1 & -2 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \qquad \left\{\begin{bmatrix}4-3t\\ 5+2t\\ t\\ -2\end{bmatrix} \mid t\in \mathbb{R}\right\} $$
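The parametric solution can be spot-checked numerically; the sketch below assumes NumPy (not part of the original text).

```python
# NumPy sketch (assumed) verifying that x = (4 - 3t, 5 + 2t, t, -2) solves the
# system for several values of the parameter t.
import numpy as np

A = np.array([[3, 1, 7, 2],
              [2, -4, 14, -1],
              [5, 11, -7, 8],
              [2, 5, -4, -3]], dtype=float)
b = np.array([13, -10, 59, 39], dtype=float)

for t in (0.0, 1.0, -2.5):                    # arbitrary parameter values
    x = np.array([4 - 3 * t, 5 + 2 * t, t, -2.0])
    assert np.allclose(A @ x, b)
print("parametric solution verified")
```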

Linear Combinations of Vectors

A matrix with a single column is called a column vector or just vector and a matrix with a single row is called a row vector.

Definition. A vector ${b}$ is called a linear combination of the vectors
$v_1, v_2, \ldots, v_m$ if there exist scalars $a_1, a_2, \ldots, a_m$ such that $$ {b}=a_1 v_1 + a_2 v_2 + \cdots + a_m v_m . $$

Example. Given the following vectors, is ${u}$ a linear combination of the vectors ${v}_1$ and ${v}_2$? $$ {u}=\begin{bmatrix} 7 \\ 8 \\ 9 \end{bmatrix} \qquad {v}_1=\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} \qquad {v}_2=\begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix} $$ We want to find scalars $x_1$ and $x_2$ such that ${u}=x_1{v}_1+x_2{v}_2.$ We solve for $x_1$ and $x_2$ using an augmented matrix and row-operations as follows.

$$ \begin{bmatrix} 1 & 4 & 7 \\ 2 & 5 & 8 \\ 3 & 6 & 9 \end{bmatrix} \begin{array}{c} \stackrel{\longrightarrow}{-2R_1+R_2} \\ \stackrel{\longrightarrow}{-3R_1+R_3} \end{array} \begin{bmatrix} 1 & 4 & 7 \\ 0 & -3 & -6 \\ 0 & -6 & -12 \end{bmatrix}\begin{array}{c} \stackrel{\longrightarrow}{-\frac{1}{3}R_2} \\ \stackrel{\longrightarrow}{-\frac{1}{6}R_3} \end{array} \begin{bmatrix} 1 & 4 & 7 \\ 0 & 1 & 2 \\ 0 & 1 & 2 \end{bmatrix} $$ $$ \stackrel{\longrightarrow}{-R_2+R_3} \begin{bmatrix} 1 & 4 & 7 \\ 0 & 1& 2 \\ 0 & 0 & 0 \end{bmatrix} \stackrel{\longrightarrow}{-4R_2+R_1} \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{bmatrix} $$

Thus we find $x_1=-1$ and $x_2=2.$ Therefore, yes ${u}$ is a linear combination of ${v}_1$ and ${v}_2$, namely ${u}=2{v}_2-{v}_1.$
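The same coefficients can be recovered numerically. The sketch below, assuming NumPy (not part of the original text), solves for $x_1$ and $x_2$ by least squares and confirms the combination is exact.

```python
# NumPy sketch (assumed): find scalars x1, x2 with u = x1*v1 + x2*v2.
import numpy as np

v1 = np.array([1., 2., 3.])
v2 = np.array([4., 5., 6.])
u = np.array([7., 8., 9.])

V = np.column_stack([v1, v2])                  # columns are v1 and v2
coeffs, *_ = np.linalg.lstsq(V, u, rcond=None)
print(coeffs)                                  # approximately [-1.  2.]
assert np.allclose(V @ coeffs, u)              # the combination is exact
```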

Example. Given the following vectors, determine whether ${u}$ or ${w}$ is a linear combination of the vectors $v_1$ and $v_2.$ $$ {u} =\begin{bmatrix}1\\ 1\\ 4 \end{bmatrix} \qquad {w}=\begin{bmatrix}1\\ 5 \\ 1\end{bmatrix} \qquad v_1=\begin{bmatrix}1\\ 2\\ -1\end{bmatrix} \qquad v_2=\begin{bmatrix}3\\ 5\\ 2\end{bmatrix} $$ First, ${u}$ is a linear combination of $v_1$ and $v_2$ since $$ \begin{bmatrix} 1\\ 1 \\ 4\end{bmatrix} =(-2)\begin{bmatrix}1\\ 2\\ -1\end{bmatrix} +(1)\begin{bmatrix}3\\ 5\\ 2\end{bmatrix}. $$ To determine whether ${w}$ is a linear combination of ${v}_1$ and ${v}_2$ we consider the equation $$ {w}=x \begin{bmatrix} 1\\ 2\\ -1\end{bmatrix}+y\begin{bmatrix}3\\ 5\\ 2\end{bmatrix} $$ which leads to the system $$ \begin{cases} x+3y= 1 \\ 2x+5y =5 \\ -x+2y =1. \end{cases} $$ This linear system can be shown to have an empty solution set; and thus ${w}$ is not a linear combination of $v_1$ and $v_2.$

Example. For which values of the constant $c$ is $\begin{bmatrix}1\\ c\\ c^2\end{bmatrix}$ a linear combination of $\begin{bmatrix}1\\ a\\ a^2 \end{bmatrix}$ and $\begin{bmatrix}1\\ b\\ b^2\end{bmatrix}$, where $a$ and $b$ are arbitrary constants? We need to solve the following linear system $$ \begin{bmatrix} 1\\ c\\ c^2\end{bmatrix} =x\begin{bmatrix}1\\ a\\ a^2\end{bmatrix}+y\begin{bmatrix}1\\ b\\ b^2\end{bmatrix} \quad \text{ with augmented matrix} \quad \begin{bmatrix} \begin{array}{cc|c} 1 & 1 & 1 \\ a & b & c \\ a^2 & b^2 & c^2 \end{array} \end{bmatrix}. $$ Using row operations, the augmented matrix reduces to $$ \begin{bmatrix} \begin{array}{cc|c} 1 & 1 & 1 \\ 0 & b-a & c-a \\ 0 & 0 & (c-a)(c-b) \end{array} \end{bmatrix}. $$ This system is consistent if and only if $c=a$ or $c=b.$ Thus the vector is a linear combination if and only if $c=a$ or $c=b.$

Example. Express the vector $\begin{bmatrix} 7 \\ 11 \end{bmatrix}$ as the sum of a vector on the line $y=3x$ and a vector on the line $y=x/2.$ We wish to find $x_1$ and $x_2$ such that $$ \begin{bmatrix} 7 \\ 11 \end{bmatrix}= \begin{bmatrix} x_1 \\ 3x_1 \end{bmatrix}+ \begin{bmatrix} x_2 \\ \frac{1}{2}x_2 \end{bmatrix}. $$ We solve this linear system using row-operations as follows. $$ \begin{bmatrix}1 & 1 & 7 \\ 3 & \frac{1}{2} & 11 \end{bmatrix} \stackrel{\longrightarrow}{\scriptstyle -3R_1+R_2} \begin{bmatrix}1 & 1 & 7\\ 0 & \frac{-5}{2} & -10 \end{bmatrix} \qquad \qquad \stackrel{\longrightarrow}{\scriptstyle(-2/5)R_2} \begin{bmatrix}1 & 1 & 7 \\ 0 & 1 & 4 \end{bmatrix}\stackrel{\longrightarrow}{\scriptstyle-R_2+R_1} \begin{bmatrix}1 & 0 & 3 \\ 0 & 1 & 4 \end{bmatrix} $$ The solution is $x_1=3$ and $x_2=4.$ Therefore the desired sum is $$ \begin{bmatrix} 7 \\ 11 \end{bmatrix}= \begin{bmatrix} 3 \\ 9 \end{bmatrix}+ \begin{bmatrix} 4 \\ 1 \end{bmatrix} $$ since $\begin{bmatrix} 3 \\ 9 \end{bmatrix}$ is on the line $y=3x$ and $\begin{bmatrix} 4 \\ 1 \end{bmatrix}$ is on the line $y=x/2.$
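The $2\times 2$ system above can also be solved directly; here is a minimal sketch assuming NumPy (not part of the original text).

```python
# NumPy sketch (assumed): solve x1 + x2 = 7 and 3*x1 + (1/2)*x2 = 11.
import numpy as np

M = np.array([[1.0, 1.0],
              [3.0, 0.5]])
rhs = np.array([7.0, 11.0])
x1, x2 = np.linalg.solve(M, rhs)
print(x1, x2)   # 3.0 4.0, so [7, 11] = [3, 9] + [4, 1]
```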

Theorem. Let $B$ be an $n\times p$ matrix and let $A$ be a $p \times m$ matrix with columns $\{v_1, v_2, \ldots, v_m\}.$ Then the product $B A$ is $$ B A = B \begin{bmatrix} v_1 & \cdots & v_m \end{bmatrix} = \begin{bmatrix} B v_1 & \cdots & B v_m \end{bmatrix}. $$

Proof. Notice that $v_1$, $v_2,\ldots, v_m$ are all $p\times 1$ column vectors consisting of the columns of $A.$ So the products $B v_i$ for $1\leq i \leq m$ are the $n\times 1$ column vectors constituting $BA.$

Theorem. If the column vectors of an $n\times m$ matrix $A$ are ${v}_1,{v}_2,\ldots,{v}_m$ and ${x}$ is a vector with entries $x_1, x_2, \ldots, x_m$, then $$ A x =x_1 v_1 + x_2 v_2 + \cdots + x_m v_m . $$

Proof. The proof follows from the following equation:

\begin{align*} A x & =\begin{bmatrix} | & & | \\ v_1 & \cdots & v_m \\ | & & | \end{bmatrix}x = \begin{bmatrix} v_{11} & \cdots & v_{1m} \\ \vdots & \ddots & \vdots \\ v_{n1} & \cdots & v_{nm} \\ \end{bmatrix} \begin{bmatrix}x_1\\ x_2\\ \vdots\\ x_m\end{bmatrix} \\ & = \begin{bmatrix} v_{11} x_1 + v_{12} x_2 + \cdots + v_{1m}x_m \\ \vdots \\ v_{n1} x_1 + v_{n2} x_2 + \cdots + v_{nm} x_m \end{bmatrix}_{n\times 1} \\ & = x_1 v_1 + x_2 v_2 + \cdots + x_m v_m \end{align*}

which completes the proof.
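This theorem is exactly how the matrix-vector product behaves numerically; the sketch below (assuming NumPy, not part of the original text) checks it against the earlier example $u=-v_1+2v_2.$

```python
# NumPy sketch (assumed): A @ x equals the combination of the columns of A
# weighted by the entries of x.
import numpy as np

A = np.array([[1., 4.],
              [2., 5.],
              [3., 6.]])   # columns are v1 and v2 from the earlier example
x = np.array([-1., 2.])

combo = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(A @ x, combo)
print(A @ x)   # [7. 8. 9.]
```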

Corollary. Let $A$ be an $n\times m$ matrix, $x$ and $y$ be $m\times 1$ column vectors, and let $k$ be a scalar. Then the following hold. (a) $A(x + y)=A(x) + A(y)$ (b) $A(k x )=k A(x)$.

Proof. Since the column vectors $x$ and $y$ are matrices and the matrices $A(x + y)$, $A(x) + A(y)$, $A(k x )$, and $k A(x)$ are defined we can apply Properties of Matrix Multiplication to obtain $$ A(x + y)=A(x) + A(y) $$ and $A(k x )=k A(x).$

Exercises on Matrices and Vectors

Exercise. Let $A$ be a matrix of size $4\times 5$, let $B$ be a matrix of size $6\times 4$, let $C$ be a matrix of size $4\times 5$, and let $D$ be a matrix of size $4\times 2.$ Which of the following are defined, and for those that are, what is their size?

  • $BA$
  • $DA$
  • $B+A$
  • $C+A$
  • $C(AB)$
  • $(BA)C$
  • $C+DB$
  • $D(C+A)$
  • $A+CB$

Exercise. Let $ A= \begin{bmatrix} 2 & 1 & 3 \\ 4 & 0 & -1 \\ -1 & 1 & 0 \end{bmatrix} $ and $B= \begin{bmatrix} 1 & -1 & 1 \\ 0 & 2 & 0 \\ 4 & 3 & 2 \end{bmatrix}.$ Where possible, find the following matrices.

  • $A+B$
  • $B-A$
  • $3B$
  • $-2A$
  • $AB-BA$
  • $-2BA$
  • $2A-4B$
  • $B^2-A^2$

Exercise. Let $A=\begin{bmatrix}2 & 1 \\ 0 & -1\end{bmatrix}$, $B=\begin{bmatrix}3 & -1 & 2\\ 0 & 1 & 4\end{bmatrix}$, $C=\begin{bmatrix}3 & -1 \\ 2 & 0\end{bmatrix}$, and $D=\begin{bmatrix}1 & 3 \\ -1 & 0 \\ 1 & 4\end{bmatrix}.$ Where possible, find the following

  • $3A-2B$
  • $5C$
  • $B+D$
  • $2B-3D$
  • $A-D$
  • $A-2C$

Exercise. Find $A$ in terms of $B.$ (a) $A+B=3A+2B$ (b) $2A-B=5(A+2B)$.

Exercise. Simplify the following expressions where $A$, $B$, and $C$ are matrices.
(a) $2[9(A-B)+7(2B-A)]-2[3(2B+A)-2(A+3B)]$
(b) $5[3(A-B+2C)-2(3C-B)-A]+2[3(3A-B+C)+2(B-2A)-2C]$.

Exercise. Show that if $Q+A=A$ holds for every $m\times n$ matrix $A$, then $Q=0_{m\times n}.$

Exercise. Show that if $A$ is an $m\times n$ matrix with $A+A'=0_{m\times n}$, then $A'=-A.$

Exercise. Show that if $A$ denotes an $m\times n$ matrix, then $A=-A$ if and only if $A=0.$

Exercise. Given the following vectors, determine the values of $a$ and $b$ such that
${u}$ is a linear combination of ${v_1}$ and ${v_2}.$ $$ {u}=\begin{bmatrix}5\\ 7\\ a\\ b\end{bmatrix} \qquad {v_1}=\begin{bmatrix}1\\ 1\\1 \\ 1\end{bmatrix} \qquad {v_2}=\begin{bmatrix}4\\ 3\\ 2\\ 1\end{bmatrix} $$

Exercise. Find all solutions $x_1, x_2, x_3$ of the equation $$ b = x_1 v_1 + x_2 v_2 + x_3 v_3 $$ given the following vectors. $$ b = \begin{bmatrix}-8\\ -1\\ 2\\ 15\end{bmatrix} \qquad v_1 = \begin{bmatrix}1\\ 4\\ 7\\ 5\end{bmatrix} \qquad v_2 = \begin{bmatrix}2\\ 5\\ 8\\ 3\end{bmatrix} \qquad v_3 = \begin{bmatrix}4 \\ 6\\ 9\\ 1\end{bmatrix} $$

Exercise. Determine the value of $a$ such that ${u}$ is a linear combination of ${v_1}$ and ${v_2}$ given the following vectors. $$ {u}=\begin{bmatrix}1\\ a\\ a^2\end{bmatrix} \qquad {v_1}=\begin{bmatrix}1\\ 2\\ 4\end{bmatrix} \qquad {v_2}=\begin{bmatrix}1\\ 3\\ 9\end{bmatrix} $$

Exercise. Find the product of the given matrices.

  • $\begin{bmatrix} 1 & -1 & 2 \\ 2 & 0 & 4 \end{bmatrix} \begin{bmatrix} 2 & 3 & 1 \\ 1 & 9 & 7\\ -1 & 0 & 2 \end{bmatrix}$
  • $\begin{bmatrix} 1 & 3 & -3 \end{bmatrix} \begin{bmatrix} 3 & 0 \\ -2 & 1 \\ 0 & 6 \end{bmatrix}$

Exercise. Let $A$, $B$, $C$ be the matrices $$ A=\begin{bmatrix}2 & 3-i \\ 1 & i \end{bmatrix} \qquad B=\begin{bmatrix}i & 1-i \\ 0 & i\end{bmatrix} \qquad C=\begin{bmatrix} 2+i & 1 \\ 3 & i+1 \end{bmatrix}. $$ Find each of the following.

  • $A+B+C$
  • $AB+AC$
  • $A+BC$
  • $CB$
  • $A^2 C$
  • $C^2 A$

Exercise. Find all matrices that commute with the given matrix.

  • $\begin{bmatrix}0 & 1 \\ 0 & 0 \end{bmatrix} $
  • $\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} $
  • $ \begin{bmatrix} 2 & 3 \\ 1 & 0 \end{bmatrix} $

Exercise. For what values of the constants $a$, $b$, $c$, and $d$ is the vector $u$ a linear combination of the vectors $v_1$, $v_2$, $v_3$ given $$ u = \begin{bmatrix} a \\ b \\ c \\ d \end{bmatrix} \qquad v_1 =\begin{bmatrix}0 \\ 0\\ 3\\ 0\end{bmatrix} \qquad v_2 = \begin{bmatrix}1\\ 0\\ 4\\ 0\end{bmatrix} \qquad v_3 = \begin{bmatrix}2\\ 0\\ 5\\ 6\end{bmatrix}? $$

Exercise. If $ A= \begin{bmatrix} a & b \\ c & d \end{bmatrix} $ where $a\neq 0$, show that $A$ factors in the form $$ A= \begin{bmatrix} 1 & 0 \\ x & 1 \end{bmatrix} \begin{bmatrix} y & z \\ 0 & w \end{bmatrix}. $$

Exercise. Find all vectors in $\mathbb{R}^4$ that are perpendicular to the following three vectors. $$ u=\begin{bmatrix}1\\ 1\\ 1\\ 1\end{bmatrix} \qquad v=\begin{bmatrix}1\\ 2\\ 3\\ 4\end{bmatrix} \qquad w=\begin{bmatrix}1\\ 9\\ 9\\ 7\end{bmatrix} $$

Exercise. Determine a scalar $t$ such that $A X=t X$ where $ A= \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} $ and $ X= \begin{bmatrix} 1 \\ 1\end{bmatrix}. $

Exercise. Show that if $A$ and $B$ commute with $C$, then so does $A+B.$

Exercise. Show that if $A$ and $B$ are $n\times n$ matrices, then $A B=BA$ if and only if $$(A+B)^2=A^2+2A B+B^2. $$

Exercise. Let $A$ and $B$ be the $2\times 2$ matrices

$$ A= \begin{bmatrix} \begin{bmatrix} 1 & 3 \\ 2& 1 \end{bmatrix} & \begin{bmatrix} 0 & 3 \\ 1 & 3 \end{bmatrix} \\ & \\ \begin{bmatrix} 3 & 4 \\ 1 & 1 \end{bmatrix} & \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix} \end{bmatrix} \quad B= \begin{bmatrix} \begin{bmatrix} 3 & 3 \\ 1 & 1 \end{bmatrix} & \begin{bmatrix} 2 & 6 \\ 4 & 3 \end{bmatrix} \\ & \\ \begin{bmatrix} 1 & 3 \\ 1 & 3 \end{bmatrix} & \begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix} \end{bmatrix} $$

whose entries are themselves $2\times 2$ matrices. Where possible, find each of the following.

  • $A+B$
  • $-2B$
  • $3A$
  • $2A-3B$
  • $B^2$
  • $AB$
  • $BA$
  • $AB-BA$
  • $A^2-B^2$

Exercise. Let $A=\begin{bmatrix}\cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{bmatrix}.$ Find an expression for

  • $A^2$,
  • $A^3$, and
  • $A^n$ where $n$ is a positive integer.

Exercise. The Pauli spin matrices $$ P_1 = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad P_2 = \begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}, \quad P_3 = \begin{bmatrix} 1 & 0 \\ 0 &-1
\end{bmatrix} $$ are used when studying electron spin. Find

  • $P_1 P_2, $
  • $P_2 P_1$,
  • $P_2^2$,
  • $P_1 P_3$,
  • $P_3 P_1$, and
  • $P_1+i P_2.$

Exercise. Let $A$ denote an arbitrary matrix and let $k$ denote an arbitrary scalar. Prove the following properties hold.

  • $A-A=0$
  • $0A=0$
  • $k I A=k A$

Exercise. Find $Ae_1$, $Ae_2$, and $Ae_3$ given the following.

$$ e_1 = \begin{bmatrix}1\\ 0\\ 0\end{bmatrix} \quad e_2 = \begin{bmatrix}0\\ 1\\ 0\end{bmatrix} \quad e_3=\begin{bmatrix}0 \\ 0 \\ 1\end{bmatrix} \qquad \text{and} \qquad A= \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix} $$

Exercise. Find a $3\times 3$ matrix $A$ that satisfies all of the following.

$$ A \begin{bmatrix}0\\ 1\\ 0\end{bmatrix} = \begin{bmatrix}4\\ 5\\ 6\end{bmatrix} \qquad \text{and} \qquad A \begin{bmatrix}0\\ 0\\ 1\end{bmatrix} = \begin{bmatrix}7\\ 8 \\9\end{bmatrix} $$

Exercise. Find all vectors $x$ such that $Ax=b$ given the following.

$$ A= \begin{bmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \quad \text{ and } \quad b=\begin{bmatrix} 2 \\ 1 \\ 0\end{bmatrix}. $$

Exercise. Write the system $$ \begin{cases} 2x_1-3x_2+5x_3 =7 \\ 9x_1+4x_2-6x_3 =8 \end{cases} $$ in matrix form and write $\begin{bmatrix} 7 \\ 8 \end{bmatrix}$ as a linear combination of the column vectors of the coefficient matrix.

Exercise. Show that the sum of any two diagonal matrices is diagonal.

Exercise. Show that the sum of any two upper triangular matrices is upper triangular.

Exercise. Let $A$ be an $m\times n$ matrix and let $$ B=\begin{bmatrix}b_1\\ b_2\\ \vdots\\ b_n\end{bmatrix} \qquad \text{and} \qquad C=\begin{bmatrix} c_1 & c_2 & \cdots & c_m\end{bmatrix} $$ be a column vector and a row vector, respectively. Prove that $$ AB=\sum_{j=1}^n b_j A_j \qquad \text{and} \qquad CA=\sum_{j=1}^m c_j A_j, $$ where in the first sum $A_j$ denotes the $j$th column of $A$ and in the second sum $A_j$ denotes the $j$th row of $A.$

Exercise. Let $A$ and $B$ denote $n\times n$ matrices. The Jordan product, $A\star B$ is defined by $$ A\star B=\frac{1}{2}(AB+BA). $$ Determine whether this product is commutative, associative, and/or distributive.

