Subspaces and Linear Independence

In this article, I give an elementary introduction to subspaces. After that, I motivate the concepts of linear independence, spanning sets, and bases. Then I prove that a basis of a subspace gives a unique representation of each vector. Towards the end, I explore the dimension of a subspace. You will find many examples and exercises.

Definition. A subset $U$ of a vector space $\mathbb{V}$ is called a subspace of $\mathbb{V}$ if it has the following three properties:

(1) $U$ contains the zero vector in $\mathbb{V}$,

(2) $U$ is closed under addition: if ${u}$ and ${v}$ are in $U$, then so is ${u}+{v}$, and

(3) $U$ is closed under scalar multiplication: if ${v}$ is in $U$ and $a$ is any scalar, then $a{v}$ is in $U.$

Example. Let $U$ be the subset of $\mathbb{R}^5$ defined by $$ U=\{(x_1,x_2,x_3,x_4,x_5)\in\mathbb{R}^5 \, \mid \,x_1=3x_2 \text{ and } x_3=7x_4\}. $$ Show $U$ is a subspace of $\mathbb{R}^5.$

The zero vector ${0}=(0,0,0,0,0)$ is in $U$ since $0=3(0)$ and $0=7(0).$ Let ${u}=(u_1, u_2, u_3, u_4, u_5)$ and ${v}=(v_1, v_2, v_3, v_4, v_5)$ be vectors in $U$ and let $a$ be a scalar. Then ${u}+{v}$ is in $U$ since $$ u_1=3 u_2 \text{ and } v_1=3 v_2 \text{ imply } u_1+v_1=3 (u_2+v_2)$$ and $$ u_3=7 u_4 \text{ and } v_3=7 v_4 \text{ imply } u_3+v_3=7 (u_4+v_4).$$ Also, $a {u}$ is in $U$ since $u_1=3u_2$ implies $a u_1=3 (a u_2)$ and $u_3=7u_4$ implies $a u_3=7(a u_4).$ Hence $U$ is a subspace of $\mathbb{R}^5.$
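
For readers who like to experiment, the three subspace conditions can be sanity-checked numerically. The sketch below assumes NumPy is available; the helper `in_U` and the sampling scheme are ours, chosen only to mirror the defining equations of $U$.

```python
import numpy as np

def in_U(x, tol=1e-12):
    """Check the defining equations x1 = 3*x2 and x3 = 7*x4 for a vector in R^5."""
    return abs(x[0] - 3 * x[1]) < tol and abs(x[2] - 7 * x[3]) < tol

rng = np.random.default_rng(0)

def random_member():
    # Build a member of U by choosing x2, x4, x5 freely and solving for x1, x3.
    x2, x4, x5 = rng.standard_normal(3)
    return np.array([3 * x2, x2, 7 * x4, x4, x5])

u, v = random_member(), random_member()
a = rng.standard_normal()

print(in_U(np.zeros(5)))   # zero vector: True
print(in_U(u + v))         # closed under addition: True
print(in_U(a * u))         # closed under scalar multiplication: True
```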

Example. Give an example of a nonempty subset $U$ of $\mathbb{R}^2$ such that $U$ is closed under addition and under taking additive inverses, but $U$ is not a subspace of $\mathbb{R}^2.$ The subset $\mathbb{Z}^2$ of $\mathbb{R}^2$ is closed under addition and under taking additive inverses; however, $\mathbb{Z}^2$ is not a subspace of $\mathbb{R}^2$ because it is not closed under scalar multiplication: $\sqrt{2}\in \mathbb{R}$ and $(1,1)\in \mathbb{Z}^2$, yet $\sqrt{2}(1,1)=(\sqrt{2} , \sqrt{2}) \not \in \mathbb{Z}^2.$

Example. Give an example of a nonempty subset $U$ of $\mathbb{R}^2$ such that $U$ is closed under scalar multiplication, but $U$ is not a subspace of $\mathbb{R}^2.$ The set $M=\{(x_1,x_2)\in \mathbb{R}^2 \mid x_1 x_2=0\}$ is closed under scalar multiplication because, if $\lambda\in \mathbb{R}$ and $(x_1,x_2)\in M$, then $(\lambda x_1)(\lambda x_2)=\lambda^2 x_1 x_2=0$, and so $(\lambda x_1, \lambda x_2)\in M.$ However, $M$ is not a subspace because $(0,1)+(1,0)=(1,1)\not \in M$ even though $(0,1),(1,0)\in M.$

Example. Show that the set of all solutions of an $m\times n$ homogeneous linear system of equations is a subspace of $\mathbb{R}^n$ (called the null space of the coefficient matrix).

Let $A{x}={0}$ be an $m\times n$ homogeneous system of linear equations and let $U$ be the set of solutions to this system. Of course $A{0}={0}$ and so the zero vector is in $U.$ Let ${u}$ and ${v}$ be in $U$ and let $a$ be a scalar. Then $$A({u}+{v})=A{u}+A{v} ={0}+{0}={0}$$ and $$ A(a {u})=a(A{u})=a{0}={0} $$ show ${u}+{v}$ and $a{u}$ are in $U.$ Hence, $U$ is a subspace of $\mathbb{R}^n.$
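
As a quick illustration, the null space of a concrete coefficient matrix can be computed exactly with SymPy. This is only a sketch, assuming SymPy is installed; the matrix $A$ below is an arbitrary example of our choosing, not one taken from the text.

```python
from sympy import Matrix

# An arbitrary 2x4 coefficient matrix A for the homogeneous system A x = 0.
A = Matrix([[1, 2, 0, -1],
            [0, 1, 1,  3]])

# nullspace() returns a basis of the solution subspace (the null space of A).
basis = A.nullspace()
for b in basis:
    print(b.T)        # each basis vector, printed as a row
    print((A * b).T)  # A * b is the zero vector, confirming b is a solution
```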

Definition. Let ${v}_1, {v}_2, \ldots, {v}_m$ be vectors in the vector space $\mathbb{V}$ over the field of scalars $k.$ The set of all linear combinations $$ \mathop{span}(v_1, v_2, \ldots, v_m ) =\left\{ c_1 v_1 + c_2 v_2 + \cdots + c_m v_m \mid c_1, c_2, \ldots, c_m\in k \right\}$$ is called the span of the vectors $v_1, v_2, \ldots, v_m .$

Example. Show that the span of the vectors $v_1, v_2, \ldots, v_m $ in $\mathbb{V}$ is a subspace of $\mathbb{V}.$ Let $U=\mathop{span}(v_1, v_2, \ldots, v_m ).$ Notice ${0}\in U$ since ${0}=0 v_1 + 0 v_2 + \cdots + 0 v_m $ where $0\in k.$ Let ${u}$ and ${v}$ be vectors in $U$ and let $a$ be a scalar. By definition, there exist scalars $c_1, c_2, \ldots, c_m$ and scalars $d_1, d_2, \ldots, d_m$ such that $$ {u} =c_1 v_1 + c_2 v_2 + \cdots + c_m v_m \quad \text{ and } \quad {v} =d_1 v_1 + d_2 v_2 + \cdots + d_m v_m. $$ Then $$ {u}+{v}=\sum_{i=1}^m c_i {v}_i + \sum_{i=1}^m d_i {v}_i =\sum_{i=1}^m (c_i+d_i) {v}_i $$ and $$ a{u}=a\left(\sum_{i=1}^m c_i {v}_i\right)= \sum_{i=1}^m (a c_i) {v}_i $$ show ${u}+{v}$ and $a{u}$ are in $U$; and thus $U$ is a subspace of $\mathbb{V}.$
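
To test whether a particular vector lies in a span, one can solve the corresponding linear system; the following sketch assumes SymPy, and the vectors here are arbitrary illustrations rather than examples from the text.

```python
from sympy import Matrix, linsolve, symbols

v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
b  = Matrix([3, -1, 5])          # is b in span(v1, v2)?

A = Matrix.hstack(v1, v2)        # columns are the spanning vectors
c1, c2 = symbols('c1 c2')

# A nonempty solution set means b is a linear combination of v1 and v2.
solution = linsolve((A, b), c1, c2)
print(solution)                  # {(3, -1)}: b = 3*v1 - 1*v2
```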

We say that a nonzero ${v}_i$ is redundant in the list ${v}_1 , \ldots, {v}_i, \ldots, {v}_m$ if $v_i$ can be written as a linear combination of the other nonzero vectors in the list. An equation of the form $a_1 {v}_1 + \cdots +a_m {v}_m = {0}$ is called a linear relation among the vectors ${v}_1 , \ldots, {v}_m$; it is called a nontrivial relation if at least one of the $a_i$’s is nonzero.

Definition. The vectors $v_1, v_2, \ldots, v_m $ in $\mathbb{V}$ are called linearly independent if the only choice of scalars satisfying $ a_1 v_1 + a_2 v_2 + \cdots + a_m v_m = {0} $ is $a_1 = a_2=\cdots =a_m = 0.$ Otherwise the vectors $v_1, v_2, \ldots, v_m $ are called linearly dependent.

Lemma. A set $S=\{v_1, v_2, \ldots, v_m\}$ of vectors in $\mathbb{V}$ is linearly dependent if and only if at least one of the vectors in the set can be written as a linear combination of the others.

Proof. Assume the vectors in the set $S$ are linearly dependent. There exist scalars $c_1, c_2, \ldots, c_m$ (not all zero) such that $c_1 v_1 + c_2 v_2 + \cdots + c_m v_m ={0}.$ Let $i$ be the least index such that $c_i$ is nonzero, so that $c_1=c_2=\cdots =c_{i-1}=0$ and $c_i {v}_i=-c_{i+1}{v}_{i+1}-\cdots -c_m {v}_m.$ Since $c_i\neq 0$ and $c_i\in k$, the inverse $c_i^{-1}$ exists, and thus $${v}_i =\left(\frac{-c_{i+1}}{c_i}\right){v}_{i+1}+ \cdots + \left(\frac{-c_m}{c_i}\right){v}_m, $$ which shows ${v}_i$ is a linear combination of the others, via $$ {v}_i= 0{v}_1+\cdots+0{v}_{i-1}+\left(\frac{-c_{i+1}}{c_i}\right){v}_{i+1}+ \cdots + \left(\frac{-c_m}{c_i}\right){v}_m. $$ Now assume one of the vectors in the set $S$ can be written as a linear combination of the others, say $$ {v}_k =c_1 {v}_1+\cdots +c_{k-1}{v}_{k-1} +c_{k+1} {v}_{k+1} +\cdots + c_m {v}_m $$ where $c_1, \ldots, c_{k-1}, c_{k+1}, \ldots, c_m$ are scalars. Then $$
{0}=c_1 {v}_1+\cdots + c_{k-1} {v}_{k-1} +(-1) {v}_k +c_{k+1} {v}_{k+1} +\cdots + c_m {v}_m $$ is a nontrivial relation, and so ${v}_1, \ldots, {v}_m$ are linearly dependent.

For a list of vectors $v_1, v_2, \ldots, v_m $ in $\mathbb{V}$, the following statements are equivalent, each following from the appropriate definitions:

  • vectors $v_1, v_2, \ldots, v_m $ are linearly independent,
  • none of the vectors $v_1, v_2, \ldots, v_m $ are redundant,
  • none of the vectors $v_1, v_2, \ldots, v_m $ can be written as a linear combination of the other vectors in the list,
  • there is only the trivial relation among the vectors $v_1, v_2, \ldots, v_m $,
  • the only solution to the equation $a_1 v_1 + a_2 v_2 + \cdots + a_m v_m =0$ is $a_1 = a_2=\cdots =a_m= 0$, and
  • $\mathop{rank}(A)=m$

where $A$ is the $n\times m$ matrix whose columns are the vectors $v_1, v_2, \ldots, v_m .$
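
The last criterion is the easiest one to automate: stack the vectors as columns and compare the rank with the number of vectors. Below is a minimal sketch assuming NumPy; the three sample vectors are ours, with $v_3=v_1+v_2$ built in so the list comes out dependent.

```python
import numpy as np

# Columns of A are the vectors v1, ..., vm; they are linearly independent
# exactly when rank(A) equals the number of columns m.
v1 = np.array([1.0, 0.0, 2.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 3.0])
v3 = np.array([1.0, 1.0, 3.0, 4.0])   # v3 = v1 + v2, so the list is dependent

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A), A.shape[1])   # 2 vs. 3: rank < m, hence dependent
```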

Example. Determine whether the following vectors ${u}$, ${v}$, and ${w}$ are linearly independent. $$ {u}=\begin{bmatrix}1 \\ 1 \\ 1 \\ 1\end{bmatrix} \qquad {v}=\begin{bmatrix}1 \\ 2 \\ 3 \\ 4\end{bmatrix} \qquad {w}=\begin{bmatrix}1 \\ 4 \\ 7 \\ 10\end{bmatrix} $$ Without interchanging rows, we use elementary row operations to find $$ \mathop{rref} \begin{bmatrix} {u} & {v} & {w} \end{bmatrix} =\begin{bmatrix}1 & 0 & -2 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}. $$ From this we infer the nontrivial relation $0=(-2){u}+(3){v}+(-1){w}.$ Therefore the given vectors are linearly dependent.
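
The same computation can be reproduced with SymPy's exact `rref`, which also reports the pivot columns; reading off the non-pivot third column gives the relation found above. This is a sketch assuming SymPy is installed.

```python
from sympy import Matrix

u = Matrix([1, 1, 1, 1])
v = Matrix([1, 2, 3, 4])
w = Matrix([1, 4, 7, 10])

A = Matrix.hstack(u, v, w)
R, pivots = A.rref()
print(R)        # matches the rref displayed above; the third column is not a pivot
print(pivots)   # (0, 1)

# Read off the nontrivial relation w = -2*u + 3*v, i.e. (-2)*u + 3*v + (-1)*w = 0.
print((-2) * u + 3 * v - w)   # the zero vector
```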

Theorem. Let $$ {v}_1=\begin{bmatrix}a_{11} \\ a_{21} \\ \vdots \\ a_{n1}\end{bmatrix} \quad {v}_2=\begin{bmatrix}a_{12} \\ a_{22} \\ \vdots \\ a_{n2}\end{bmatrix} \quad \cdots \quad {v}_m =\begin{bmatrix}a_{1m} \\ a_{2m} \\ \vdots \\ a_{nm}\end{bmatrix} $$ be $m$ vectors in $\mathbb{V}.$ These vectors are linearly dependent if and only if there exists a solution to the system of linear equations \begin{equation*} \label{lincomsys} \begin{cases} a_{11}x_1+a_{12}x_2+\cdots+a_{1m}x_m=0 \\ a_{21}x_1+a_{22}x_2+\cdots+a_{2m}x_m=0 \\ \qquad \qquad \vdots \\ a_{n1}x_1+a_{n2}x_2+\cdots+a_{nm}x_m=0 \\ \end{cases} \end{equation*} different from $x_1=x_2=\cdots=x_m=0.$

Proof. Assume $v_1, v_2, \ldots, v_m $ are linearly dependent. There exist scalars $c_1, c_2, \ldots, c_m$, not all zero, such that \begin{equation} \label{lincomeq} c_1 v_1 + c_2 v_2 + \cdots + c_m v_m ={0}. \end{equation} Thus we have a system $$ \begin{cases} a_{11}c_1+a_{12}c_2+\cdots+a_{1m}c_m=0 \\ a_{21}c_1+a_{22}c_2+\cdots+a_{2m}c_m=0 \\ \qquad \qquad \vdots \\ a_{n1}c_1+a_{n2}c_2+\cdots+a_{nm}c_m=0 \\ \end{cases} $$ with solution $x_1=c_1$, $x_2=c_2, \ldots, x_m=c_m.$ Since not all $c_i$’s are zero, this is a solution different from $x_1=x_2=\cdots=x_m=0.$ Conversely, assume the system has a solution ${x}$ with ${x} \neq {0}$, say $x_i\neq 0.$ We can write \begin{equation*} {0}=A{x}= x_1 v_1 + x_2 v_2 + \cdots + x_m v_m. \end{equation*} Isolating the term $x_i {v}_i$ yields \begin{equation*} x_i {v}_i=-x_1 {v}_1-\cdots -x_{i-1}{v}_{i-1}-x_{i+1}{v}_{i+1} -\cdots -x_m {v}_m, \end{equation*} and since $x_i \neq 0$, $(x_i)^{-1}$ exists. Therefore, \begin{equation*} {v}_i=\left(-\frac{x_1}{x_i}\right){v}_1+\cdots +\left(-\frac{x_{i-1}}{x_i}\right){v}_{i-1}+\left(-\frac{x_{i+1}}{x_i}\right){v}_{i+1}+\cdots + \left(-\frac{x_{m}}{x_i}\right){v}_{m} \end{equation*} exhibits ${v}_i$ as a linear combination of the others, and so the vectors $v_1, v_2, \ldots, v_m $ are linearly dependent.

Theorem. The $n\times m$ linear system of equations $A{x}={b}$ has a solution if and only if the vector ${b}$ is contained in the subspace of $\mathbb{R}^n$ generated by the column vectors of $A.$

Proof. Let ${x}$ be a solution to $A{x}={b}$ with $A=\begin{bmatrix}{v}_1 & {v}_2 & \cdots & {v}_m \end{bmatrix}.$ Then ${b}=A{x}=x_1 v_1 + x_2 v_2 + \cdots + x_m v_m $, and thus ${b}\in \mathop{span}(v_1, v_2, \ldots, v_m )$ as needed. Conversely, assume ${b}$ is in the subspace generated by the column vectors of $A$; that is, assume ${b}\in \mathop{span}(v_1, v_2, \ldots, v_m ).$ There exist scalars $c_1, c_2, \ldots, c_m$ such that ${b}=c_1 v_1 + c_2 v_2 + \cdots + c_m v_m =A{c}$, where the components of ${c}$ are the $c_i$’s. Thus the system $A{x}={b}$ has a solution, namely ${c}.$
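
In practice this theorem gives a rank test for consistency: $A{x}={b}$ has a solution exactly when appending ${b}$ as an extra column does not increase the rank. A small sketch assuming NumPy, with an arbitrary $3\times 2$ matrix chosen for illustration.

```python
import numpy as np

# A x = b is consistent exactly when b lies in the column span of A,
# i.e. when appending b as a column does not increase the rank.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 1.0]])
b_in  = np.array([3.0, -1.0, 5.0])   # equals 3*(first column) - 1*(second column)
b_out = np.array([0.0,  0.0, 1.0])   # not in the column span

for b in (b_in, b_out):
    consistent = (np.linalg.matrix_rank(np.column_stack([A, b]))
                  == np.linalg.matrix_rank(A))
    print(consistent)                # True, then False
```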

Example. Let $U$ and $V$ be finite subsets of a vector space $\mathbb{V}$ with $U\subseteq V.$

  • If $U$ is linearly dependent, then so is $V.$
  • If $V$ is linearly independent, then so is $U.$

Let $U=\{u_1, u_2, \ldots, u_s\}$ and $V=\{v_1, v_2, \ldots, v_t\}.$

  • If $U$ is linearly dependent, then there exists a vector, say ${u}_k$, such that ${u}_k$ is a linear combination of the other ${u}_i$’s. Since $U\subseteq V$, all ${u}_i$’s are in $V.$ Thus we have a vector ${u}_k$ in $V$ that is a linear combination of other vectors in $V.$ Therefore, $V$ is linearly dependent.
  • Let $c_1, c_2, \ldots, c_s$ be scalars such that \begin{equation} \label{lincombcus} c_1 u_1 + c_2 u_2 + \cdots + c_s u_s ={0}. \end{equation} Since $U\subseteq V$, we know $u_i\in V$ for $1\leq i \leq s.$ Extending this to a relation among all the vectors in $V$ by assigning the coefficient $0$ to each vector of $V$ not in $U$, the linear independence of $V$ forces $c_1=c_2=\cdots =c_s=0.$ Thus $U$ is linearly independent as well.

Corollary. A vector ${v}$ in $\mathbb{V}$, written as a column vector, can be expressed as a linear combination of ${v}_1, {v}_2, \ldots, {v}_m$ if and only if the system $A {x}={v}$ has a solution, where $A=\begin{bmatrix}{v}_1 & {v}_2 & \cdots & {v}_m\end{bmatrix}$; the expression is unique if and only if the solution is unique. When there is a solution, the components $x_1, x_2, \ldots, x_m$ of ${x}$ give the coefficients of the linear combination.

Theorem. The vectors $v_1, v_2, \ldots, v_n $ in $\mathbb{V}$ form a linearly independent set of vectors if and only if $\begin{bmatrix}{v}_1& {v}_2 & \cdots & {v}_n \end{bmatrix}$ is row equivalent to $I_n.$

Theorem. Let $\mathbb{V}$ be a vector space and assume that the vectors $v_1, v_2, \ldots, v_n $ are linearly independent and $\mathop{span}(s_1, s_2, \ldots, s_m )=\mathbb{V}.$ Then $n\leq m.$

Proof. We are given $$ \mathop{span}(s_1, s_2, \ldots, s_m )=\mathbb{V}
\quad \text{and} \quad v_1, v_2, \ldots, v_n \text{ are linearly independent.} $$ Since ${v}_1$ is a linear combination of the vectors ${s}_1$, ${s}_2$, \ldots, ${s}_m$, we may exchange ${v}_1$ for one of the ${s}_i$’s (after relabeling, say ${s}_1$) and obtain $$ \mathop{span}({v}_1,{s}_2,\ldots,{s}_m)=\mathbb{V}. $$ Since ${v}_2$ is a linear combination of ${v}_1$, ${s}_2, \ldots, {s}_m$, and ${v}_2$ is not a linear combination of ${v}_1$ alone, we may exchange ${v}_2$ for another of the ${s}_i$’s and obtain $$ \mathop{span}({v}_1,{v}_2,{s}_3,\ldots,{s}_m)=\mathbb{V}. $$ Continuing this exchange process, if $n>m$, then after $m$ steps we would have $\mathop{span}(v_1, v_2, \ldots, v_m)=\mathbb{V}$, so ${v}_{m+1}$ would be a linear combination of $v_1, v_2, \ldots, v_m$, contradicting the linear independence of $v_1, v_2, \ldots, v_n.$ Whence $n\leq m.$

Theorem. A set $S=\{v_1, v_2, \ldots, v_m \}$ of vectors in $\mathbb{V}$ is linearly independent if and only if every vector ${u}$ that can be written as a linear combination ${u} = u_1 v_1 + u_2 v_2 + \cdots + u_m v_m $ has a unique such representation.

Proof. Assume the vectors in $S$ are linearly independent and assume ${u}$ is an arbitrary vector with \begin{equation*} {u}=a_1 v_1 + a_2 v_2 + \cdots + a_m v_m \qquad \text{and} \qquad {u}=b_1 v_1 + b_2 v_2 + \cdots + b_m v_m \end{equation*} as two representations of ${u}$ as linear combinations of the vectors in $S.$ Then $$ {0}={u}-{u} =(a_1-b_1){v}_1+(a_2-b_2){v}_2+\cdots +(a_m-b_m){v}_m. $$ Since $S$ is linearly independent, $a_1-b_1=a_2-b_2=\cdots =a_m-b_m=0$ and thus $a_1=b_1$, $a_2=b_2$, \ldots, $a_m=b_m.$ Therefore, the representation of ${u}$ as a linear combination of the vectors in $S$ is unique. Conversely, assume that for any vector ${u}$ which can be written as a linear combination of the vectors in $S$, the representation is unique. If $c_1, c_2, \ldots, c_m$ are scalars such that $c_1 v_1 + c_2 v_2 + \cdots + c_m v_m ={0}$, then $c_1=c_2=\cdots =c_m=0$ must hold since $0{v}_1+0{v}_2+\cdots +0{v}_m={0}$ and this representation is unique. Therefore, the vectors in $S$ are linearly independent.

Definition. The vectors $v_1, v_2, \ldots, v_m $ in a vector space $\mathbb{V}$ are called a basis of $\mathbb{V}$ if they span $\mathbb{V}$ and are linearly independent.

Example. Find a basis of $k^n$ for each positive integer $n.$ For $n=2$, the vectors $\begin{bmatrix}1 \\ 0\end{bmatrix}$, $\begin{bmatrix}0 \\ 1\end{bmatrix}$ form a basis for $k^2.$ For $n=3$, the vectors $\begin{bmatrix}1 \\ 0 \\ 0\end{bmatrix}$, $\begin{bmatrix}0 \\ 1 \\ 0\end{bmatrix}$, $\begin{bmatrix}0 \\ 0 \\ 1\end{bmatrix}$ form a basis for $k^3.$ In general, for a positive integer $n$, the following $n$ vectors of $k^n$ form a basis (called the standard basis) of $k^n.$ \begin{equation} \label{stba} {e}_1=\begin{bmatrix}1 \\ 0 \\ \vdots \\ 0\end{bmatrix} \qquad {e}_2=\begin{bmatrix}0 \\ 1 \\ \vdots \\ 0\end{bmatrix} \qquad \cdots \qquad {e}_n=\begin{bmatrix}0 \\ 0 \\ \vdots \\ 1\end{bmatrix} \end{equation} The vectors in the standard basis are linearly independent. Given any vector ${v}$ in $k^n$ with components $v_i$, we can write $$ {v}=v_1 e_1 + v_2 e_2 + \cdots + v_n e_n , $$ and thus $k^n=\mathop{span}(e_1, e_2, \ldots, e_n )$, which shows that the standard basis is in fact a basis.

Example. Show the following vectors ${v}_1$, ${v}_2$, ${v}_3$, and ${v}_4$ form a basis for $\mathbb{R}^4.$ $$ v_1=\begin{bmatrix} 1 \\ 1\\ 1 \\ 1 \end{bmatrix} \qquad v_2=\begin{bmatrix} 1 \\ -1\\ 1 \\ -1 \end{bmatrix} \qquad v_3=\begin{bmatrix} 1 \\ 2 \\ 4 \\ 8 \end{bmatrix}\qquad v_4=\begin{bmatrix} 1 \\ -2 \\ 4 \\ -8 \end{bmatrix} $$ We compute $\mathop{rref}(A)=I_4$, where $A$ is the matrix with column vectors $v_1, v_2, v_3, v_4.$ Hence $v_1, v_2, v_3, v_4$ are linearly independent, and since four linearly independent vectors in $\mathbb{R}^4$ automatically span $\mathbb{R}^4$, they form a basis of $\mathbb{R}^4.$
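
This computation is easy to confirm exactly; below is a sketch assuming SymPy is installed.

```python
from sympy import Matrix, eye

v1 = Matrix([1, 1, 1, 1])
v2 = Matrix([1, -1, 1, -1])
v3 = Matrix([1, 2, 4, 8])
v4 = Matrix([1, -2, 4, -8])

A = Matrix.hstack(v1, v2, v3, v4)
R, _ = A.rref()
print(R == eye(4))   # True: the columns are linearly independent and form a basis of R^4
```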

Example. Let $U$ be the subspace of $\mathbb{R}^5$ defined by $$
U=\{(x_1,x_2,x_3,x_4,x_5)\in\mathbb{R}^5 \mid x_1=3x_2 \text{ and } x_3=7x_4\}. $$ Find a basis of $U.$ The following vectors belong to $U$ and are linearly independent in $\mathbb{R}^5.$ $$ v_1=\begin{bmatrix}3 \\ 1 \\ 0 \\ 0 \\ 0\end{bmatrix} \qquad v_2=\begin{bmatrix} 0 \\ 0 \\ 7 \\ 1 \\ 0\end{bmatrix} \qquad v_3=\begin{bmatrix}0 \\ 0 \\ 0 \\ 0 \\ 1\end{bmatrix} $$ If $u\in U$, then the representation $$ u =\begin{bmatrix}u_1 \\ u_2 \\ u_3 \\ u_4 \\ u_5\end{bmatrix} =\begin{bmatrix}3u_2 \\ u_2 \\ 7u_4 \\ u_4 \\ u_5\end{bmatrix} =u_2\begin{bmatrix}3 \\ 1 \\ 0 \\ 0 \\ 0\end{bmatrix} + u_4\begin{bmatrix}0 \\ 0 \\ 7 \\ 1 \\ 0\end{bmatrix} + u_5\begin{bmatrix}0 \\ 0 \\ 0 \\ 0 \\ 1\end{bmatrix} $$ shows that they also span $U$, and thus form a basis of $U$.
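
A symbolic check of both claims, linear independence and spanning, might look like the following sketch (assuming SymPy; the symbols `u2, u4, u5` play the role of the free components above).

```python
from sympy import Matrix, symbols, simplify

u2, u4, u5 = symbols('u2 u4 u5')

v1 = Matrix([3, 1, 0, 0, 0])
v2 = Matrix([0, 0, 7, 1, 0])
v3 = Matrix([0, 0, 0, 0, 1])

# Independence: the 5x3 matrix with these columns has rank 3.
B = Matrix.hstack(v1, v2, v3)
print(B.rank())   # 3

# Spanning U: the generic element of U decomposes with coefficients u2, u4, u5.
u = Matrix([3*u2, u2, 7*u4, u4, u5])
print(simplify(u - (u2*v1 + u4*v2 + u5*v3)))   # the zero vector
```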

Theorem. Let $S=\{v_1, v_2, \ldots, v_n \}$ be a set of vectors in a vector space $\mathbb{V}$ and let $W=\mathop{span}(S).$ Then some subset of $S$ is a basis for $W.$

Proof. Assume $W=\mathop{span}(S).$ If $S$ is a linearly independent set of vectors, then $S$ is a basis of $W$ and we are done. So we can assume $S$ is a linearly dependent set of vectors. There exists $i$ with $1\leq i \leq n$ such that ${v}_i$ is a linear combination of the other vectors in $S.$ It is left as an exercise to show that $$ W=\mathop{span}(S)=\mathop{span}(S_1) $$ where $S_1=S\setminus \{{v}_i\}.$ If $S_1$ is a linearly independent set of vectors, then $S_1$ is a basis of $W.$ Otherwise, $S_1$ is a linearly dependent set and we can delete a vector from $S_1$ that is a linear combination of the other vectors in $S_1.$ We obtain another subset $S_2$ of $S$ with $$ W=\mathop{span}(S)=\mathop{span}(S_1)=\mathop{span}(S_2). $$ Since $S$ is finite, continuing in this way, we eventually find a linearly independent subset of $S$ and thus a basis of $W.$
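
The deletion procedure in this proof is exactly what the pivot columns of the reduced row echelon form deliver: the pivot columns of $\begin{bmatrix}v_1 & \cdots & v_n\end{bmatrix}$ form a basis of $\mathop{span}(S).$ Below is a sketch assuming SymPy, with a small redundant spanning set chosen for illustration.

```python
from sympy import Matrix

# A spanning set with built-in redundancy: v3 = v1 + v2 and v4 = 2*v1.
v1 = Matrix([1, 0, 1])
v2 = Matrix([0, 1, 1])
v3 = Matrix([1, 1, 2])
v4 = Matrix([2, 0, 2])

S = [v1, v2, v3, v4]
A = Matrix.hstack(*S)
_, pivots = A.rref()

# The pivot columns of A form a basis of span(S), the column space of A.
basis = [S[i] for i in pivots]
print(pivots)                  # (0, 1)
print([b.T for b in basis])    # [v1, v2] printed as rows
```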

Corollary. All bases of a subspace $U$ of a vector space $\mathbb{V}$ consist of the same number of vectors.

Proof. Let $S=\{v_1, v_2, \ldots, v_n\}$ and $T=\{w_1, w_2, \ldots, w_m\}$ be bases of a subspace $U.$ Then $\mathop{span}(S)=U$ and $T$ is a linearly independent set of vectors. Hence, $m\leq n.$ Similarly, since $\mathop{span}(T)=U$ and $S$ is a linearly independent set of vectors, $n\leq m.$ Therefore, $m=n$ as desired.

Corollary. The vectors ${v}_1, {v}_2, \ldots, {v}_n$ form a basis of $\mathbb{V}$ if and only if the reduced row echelon form of the $n\times n$ matrix $\begin{bmatrix}{v}_1& {v}_2 & \cdots & {v}_n \end{bmatrix}$ is $I_n.$

Proof. Suppose the vectors $v_1, v_2, \ldots, v_n $ form a basis of $\mathbb{V}$ and consider the $n\times n$ linear system $$ \begin{cases} v_{11} x_1+v_{12} x_2+\cdots +v_{1n} x_n=0 \\ v_{21} x_1+v_{22} x_2+\cdots +v_{2n} x_n=0 \\ \qquad \qquad \vdots \\ v_{n1} x_1+v_{n2} x_2+\cdots +v_{nn} x_n=0 \end{cases} $$ where the $v_{ij}$’s are the components of the ${v}_j$’s. Since $\{v_1, v_2, \ldots, v_n\}$ is a basis, the vectors are linearly independent, so this linear system has only the trivial solution $x_1=x_2=\cdots=x_n=0.$ Therefore $\mathop{rref}(A)=I_n$, where $A=\begin{bmatrix}{v}_1& {v}_2 & \cdots & {v}_n \end{bmatrix}.$ Conversely, if $\mathop{rref}(A)=I_n$, then the homogeneous system above has only the trivial solution, so the vectors are linearly independent; moreover, $A{x}={b}$ has a solution for every ${b}$, so the vectors span $\mathbb{V}$ and hence form a basis.

Definition. The number of vectors in a basis of a subspace $U$ of $\mathbb{V}$ is called the dimension of $U$, and is denoted by $\mathop{dim} U.$

Example. Find a basis of the subspace of $\mathbb{R}^4$ that consists of all vectors perpendicular to both of the following vectors ${v}_1$ and $v_2.$ $$ v_1=\begin{bmatrix}1 \\ 0 \\ -1 \\ 1\end{bmatrix} \qquad v_2=\begin{bmatrix}0 \\ 1 \\ 2 \\ 3\end{bmatrix} $$ We need to find all vectors $x$ in $\mathbb{R}^4$ such that $x \cdot v_1=0$ and $x \cdot v_2=0.$ We solve both $$ \begin{bmatrix}x_1 \\ x_2 \\ x_3 \\ x_4\end{bmatrix} \cdot \begin{bmatrix}1 \\ 0 \\ -1 \\ 1\end{bmatrix}=0 \qquad \text{and} \qquad \begin{bmatrix}x_1 \\ x_2 \\ x_3 \\ x_4\end{bmatrix} \cdot \begin{bmatrix}0 \\ 1 \\ 2 \\ 3\end{bmatrix}=0 $$ which leads to the system and matrix $$ \begin{cases} x_1-x_3+x_4 =0 \\ x_2+2x_3+3x_4 =0 \end{cases} \qquad \text{and} \qquad A= \begin{bmatrix} 1 & 0 & -1 & 1 \\ 0 & 1 & 2 & 3 \end{bmatrix}. $$ All solutions are given by $$ \begin{bmatrix}x_1 \\ x_2 \\ x_3 \\ x_4\end{bmatrix} =t \begin{bmatrix}-1 \\ -3 \\ 0 \\ 1\end{bmatrix} + u\begin{bmatrix}1 \\ -2 \\ 1 \\ 0\end{bmatrix} \quad \text{ where $t, u\in\mathbb{R}$}. $$ It follows that the vectors $\begin{bmatrix}-1 \\ -3 \\ 0 \\ 1\end{bmatrix}$, $\begin{bmatrix}1 \\ -2 \\ 1 \\ 0\end{bmatrix}$ form a basis of the desired subspace.
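
SymPy's `nullspace` reproduces this basis (up to the order of the two vectors); a sketch assuming SymPy is installed.

```python
from sympy import Matrix

A = Matrix([[1, 0, -1, 1],
            [0, 1,  2, 3]])

# A basis of the solution space of A x = 0, i.e. of the vectors
# perpendicular to both rows of A.
for b in A.nullspace():
    print(b.T)   # (1, -2, 1, 0) and (-1, -3, 0, 1), matching the basis above
```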

Theorem. Let $U$ be a subspace of $k^m$ with $\dim U=n.$ Then

(1) any list of linearly independent vectors in $U$ contains at most $n$ elements,

(2) any list of vectors that spans $U$ contains at least $n$ elements,

(3) if $n$ vectors in $U$ are linearly independent, then they form a basis of $U$, and

(4) if $n$ vectors span $U$, then they form a basis of $U.$

Example. Determine the values of $a$ for which the following vectors ${u}_1$, ${u}_2$, ${u}_3$, and ${u}_4$ form a basis of $\mathbb{R}^4.$ $$ {u}_1=\begin{bmatrix}1 \\ 0 \\ 0 \\ 4\end{bmatrix} \qquad {u}_2=\begin{bmatrix}0 \\ 1 \\ 0 \\ 6\end{bmatrix} \qquad {u}_3=\begin{bmatrix}0 \\ 0 \\ 1 \\ 8\end{bmatrix} \qquad {u}_4=\begin{bmatrix}4 \\ 5 \\ 6 \\ a\end{bmatrix} $$ Let $A=\begin{bmatrix}{u}_1 & {u}_2 & {u}_3& {u}_4\end{bmatrix}.$ Subtracting $4$ times the first row, $6$ times the second row, and $8$ times the third row from the fourth row gives the row-echelon form $$ \begin{bmatrix} 1 & 0 & 0 & 4 \\ 0 & 1 & 0 & 5 \\ 0 & 0 & 1 & 6\\ 0 & 0 & 0 & a-94 \end{bmatrix}. $$ Thus, $\mathop{rref}(A)=I_4$ if and only if $a\neq 94.$ Therefore, $B=\{{u}_1, {u}_2, {u}_3, {u}_4\}$ is a basis if and only if $a\neq 94.$
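
Treating $a$ as a symbol makes the computation transparent: the determinant of $A$ is $a-94$, so the columns form a basis exactly when $a\neq 94.$ A sketch assuming SymPy.

```python
from sympy import Matrix, symbols, solve

a = symbols('a')

u1 = Matrix([1, 0, 0, 4])
u2 = Matrix([0, 1, 0, 6])
u3 = Matrix([0, 0, 1, 8])
u4 = Matrix([4, 5, 6, a])

A = Matrix.hstack(u1, u2, u3, u4)
d = A.det()
print(d)            # a - 94
print(solve(d, a))  # [94]: the only value for which the vectors fail to form a basis
```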

Theorem. The dimension of the row space of a matrix $A$ is equal to the dimension of the column space of $A.$

Proof. The proof is left for the reader.


Exercises on Subspaces

Exercise. Determine whether the following collection of vectors in $\mathbb{R}^3$ are linearly independent or linearly dependent.

  • $(0,1,1), (1,2,1), (0,4,6), (1,0,-1)$
  • $(0,1,0), (1,2,1), (0,-4,6), (-1,1,-1)$

Exercise. Determine whether the following collection of vectors in $\mathbb{R}^4$ are linearly independent or linearly dependent.

  • $(0,1,1,1), (1,2,1,1), (0,4,6,2), (1,0,-1, 2)$
  • $(0,1,0,1), (1,2,1,3), (0,-4,6,-2), (-1,1,-1, 2)$

Exercise. Show that the given vectors do not form a basis for the vector space $\mathbb{V}.$

  • $(21,-7), (-6, 1)$; $V=\mathbb{R}^2$
  • $(21,-7,14), (-6, 1,-4), (1,0,0)$; $V=\mathbb{R}^3$
  • $(48,24,108,-72), (-24, -12,-54,36), (1,0,0,0), (1,1,0,0)$; $V=\mathbb{R}^4$

Exercise. Reduce the vectors to a basis of the vector space $\mathbb{V}.$

  • $(1,0), (1,2), (2,4)$, $V=\mathbb{R}^2$
  • $(1,2,3), (-1, -10, 15), (1, 2, -3), (2,0,6), (1, -2, 3)$, $V=\mathbb{R}^3$

Exercise. Which of the following collections of vectors in $\mathbb{R}^3$ are linearly dependent? For those that are, express one vector as a linear combination of the rest.

  • $(1,1,0), (0,2,3), (1,2,3)$
  • $(1,1,0), (3,4,2), (0,2,3)$

Exercise. Let $S=\{v_1, v_2, \ldots, v_k\}$ be a set of vectors in a vector space $\mathbb{V}.$ Prove that $S$ is linearly dependent if and only if one of the vectors in $S$ is a linear combination of the other vectors in $S.$

Exercise. Suppose that $S=\{v_1, v_2, v_3\}$ is a linearly independent set of vectors in a vector space $\mathbb{V}.$ Prove that $T=\{u_1, u_2, u_3\}$ is also linearly independent where $u_1=v_1$, $u_2=v_1+v_2$, and $u_3=v_1+v_2+v_3.$

Exercise. Which of the following sets of vectors form a basis for the vector space $\mathbb{V}$?

  • $(1,3), (1,-1)$; $V=\mathbb{R}^2$
  • $(1,3),(-2,6)$; $V=\mathbb{R}^2$
  • $(3,2,2), (-1,2,1), (0,1,0)$; $V=\mathbb{R}^3$
  • $(3,2,2), (-1,2,0), (1,1,0)$; $V=\mathbb{R}^3$
  • $(2,2,2,2), (3,3,3,2), (1,0,0,0), (0,1,0,0)$; $V=\mathbb{R}^4$
  • $(1,1,2,0), (2,2,4,0), (1,2,3,1), (2,1,3,-1), (1,2,3,-1) $; $V=\mathbb{R}^4$

Exercise. Find a basis for the subspace of the vector space $\mathbb{V}.$

  • All vectors of the form $(a,b,c)$, where $b=a+c$; $V=\mathbb{R}^3.$
  • All vectors of the form $(a,b,c)$, where $b=a-c$; $V=\mathbb{R}^3.$
  • All vectors of the form $(b-a, a+c, b+c, c)$; $V=\mathbb{R}^4.$

Exercise. Let ${v}_1=\begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}$, ${v}_2=\begin{bmatrix}1 \\ 0 \\ 0\end{bmatrix}$ and $S=\mathop{span}({v}_1,{v}_2).$

  • Is $S$ a subspace of $\mathbb{R}^3$?
  • Find a vector ${u}$ in $S$ other than ${v}_1$, ${v}_2.$
  • Find scalars which verify that $3{u}$ is in $S.$
  • Find scalars which verify that ${0}$ is in $S.$

Exercise. Let ${u}_1=\begin{bmatrix}0 \\ 2 \\ 2\end{bmatrix}$, ${u}_2=\begin{bmatrix}2 \\ 0 \\ 0\end{bmatrix}$ and $T=\mathop{span}({u}_1,{u}_2).$ Show $S=T$ by showing $S\subseteq T$ and $T\subseteq S$ where $S$ is defined above.

Exercise. Prove that the non-empty intersection of two subspaces of $\mathbb{R}^3$ is a subspace of $\mathbb{R}^3.$

Exercise. Let $S$ and $T$ be subspaces of $\mathbb{R}^3$ defined by $$ S=\mathop{span}\left(\begin{bmatrix}1 \\ 0 \\ 2\end{bmatrix}, \begin{bmatrix}0 \\ 2 \\ 1\end{bmatrix}\right) \qquad \text{and} \qquad
T=\mathop{span}\left(\begin{bmatrix}2 \\ -2 \\ 3\end{bmatrix}, \begin{bmatrix}3 \\ -4 \\ 4\end{bmatrix}\right). $$ Show they are the same subspace of $\mathbb{R}^3.$

Exercise. Let $\{{v}_1,{v}_2, {v}_3\}$ be a linearly independent set of vectors. Show that if ${v}_4$ is not a linear combination of ${v}_1, {v}_2, {v}_3$, then $\{{v}_1,{v}_2, {v}_3,{v}_4\}$ is a linearly independent set of vectors.

Exercise. If $\{{v}_1,{v}_2, {v}_3\}$ is a linearly independent set of vectors in $\mathbb{V}$, show that $\{{v}_1,{v}_1+{v}_2, {v}_1+{v}_2+{v}_3\}$ is also a linearly independent set of vectors in $\mathbb{V}.$

Exercise. If $\{{v}_1,{v}_2, {v}_3\}$ is a linearly independent set of vectors in $\mathbb{V}$, show that $\{{v}_1 + {v}_2, {v}_2+{v}_3, {v}_3+{v}_1\}$ is also a linearly independent set of vectors in $\mathbb{V}.$

Exercise. Let $\{{v}_1,{v}_2, {v}_3\}$ be a linearly dependent set. Show that at least one of the ${v}_i$ is a linear combination of the others.

Exercise. Prove or provide a counterexample to the following statement. If a set of vectors $T$ spans the vector space $\mathbb{V}$, then $T$ is linearly independent.

Exercise. Which of the following are not a basis for $\mathbb{R}^3$?

$$ {v_1}=\begin{bmatrix}1 \\ 0 \\ 0\end{bmatrix}, {v_2} = \begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}, {v_3}=\begin{bmatrix}1 \\ -1 \\ -1\end{bmatrix} $$

$$ {u_1}=\begin{bmatrix}0 \\ 0 \\ 1\end{bmatrix}, {u_2}=\begin{bmatrix}1 \\ 0 \\ 1\end{bmatrix}, {u_3}=\begin{bmatrix}2 \\ 3 \\ 4\end{bmatrix} $$

Exercise. Let $S$ be the space spanned by the vectors $$ {v}_1=\begin{bmatrix}1 \\ 0 \\ 1 \\ 1\end{bmatrix} \quad {v}_2=\begin{bmatrix}-1 \\ -3 \\ 1 \\ 0\end{bmatrix} \quad {v}_3=\begin{bmatrix}2 \\ 3 \\ 0 \\ 1\end{bmatrix} \quad {v}_4=\begin{bmatrix}2 \\ 0 \\ 2 \\ 2\end{bmatrix} $$ Find the dimension of $S$ and a subset of $\{{v}_1, {v}_2, {v}_3, {v}_4\}$ which could serve as a basis for $S.$

Exercise. Let $\{{v}_1, {v}_2, \ldots, {v}_n\}$ be a basis for $\mathbb{V}$, and suppose that ${u} =a_1 {v_1}+a_2 {v_2}+\cdots + a_n {v_n}$ with $a_1\neq 0.$ Prove that $\{{u}, {v}_2, \ldots, {v}_n\}$ is also a basis for $\mathbb{V}.$

Exercise. Let $S=\mathop{span}({v}_1,{v}_2,{v}_3)$ and $T=\mathop{span}({u}_1,{u}_2,{u}_3)$ where ${v}_i$ and ${u}_i$ are defined as follows.

$$ {v}_1=\begin{bmatrix}1 \\ -1 \\ 2 \\ 0\end{bmatrix} \quad {v}_2=\begin{bmatrix}2 \\ 1 \\ 1 \\ 1\end{bmatrix} \quad {v}_3=\begin{bmatrix}3 \\ -1 \\ 2 \\ -1\end{bmatrix}$$

$$ {u}_1=\begin{bmatrix}3 \\ 0 \\ 3 \\ 1\end{bmatrix} \quad {u}_2=\begin{bmatrix}1 \\ 2 \\ -1 \\ 1\end{bmatrix} \quad {u}_3=\begin{bmatrix}4 \\ -1 \\ 5 \\ 1\end{bmatrix} $$

Is one of these two subspaces strictly contained in the other or are they equal?

Exercise. Let $S=\mathop{span}({v}_1,{v}_2,{v}_3)$ where ${v}_i$ are defined as follows. $$ {v}_1=\begin{bmatrix}1 \\ 2 \\ 3 \\ 1\end{bmatrix} \qquad {v}_2=\begin{bmatrix}2 \\ -1 \\ 1 \\ -3\end{bmatrix} \qquad {v}_3=\begin{bmatrix}1 \\ 3 \\ 4 \\ 2\end{bmatrix} \qquad {u}=\begin{bmatrix}1 \\ 2 \\ 3 \\ 1\end{bmatrix} $$ Is the vector ${u}$ in $S$?

Exercise. If possible, find a value of $a$ so that the vectors $$ \begin{bmatrix}1 \\ 2 \\ a\end{bmatrix} \qquad \begin{bmatrix}0 \\ 1 \\ a-1\end{bmatrix} \qquad \begin{bmatrix}3 \\ 4 \\ 5\end{bmatrix} \qquad $$ are linearly independent.

Exercise. Let $S=\mathop{span}({v}_1,{v}_2,{v}_3)$ where ${v}_i$ are defined as follows. $$ {v}_1=\begin{bmatrix}1 \\ -1 \\ 2 \\ 3\end{bmatrix}\qquad {v}_2=\begin{bmatrix}1 \\ 0 \\ 1 \\ 0\end{bmatrix} \qquad {v}_3=\begin{bmatrix}3 \\ -2 \\ 5 \\ 7\end{bmatrix} \qquad \text{and}\qquad {u}=\begin{bmatrix}1 \\ 1 \\ 0 \\ -1\end{bmatrix} $$ Find a basis of $S$ which includes the vector ${u}.$

Exercise. Find a vector ${u}$ in $\mathbb{R}^4$ such that ${u}$ and the vectors $$ {v}_1=\begin{bmatrix}1 \\ -1 \\ -1 \\ 1\end{bmatrix} \qquad {v}_2=\begin{bmatrix}1 \\ 0 \\ 1 \\ 1\end{bmatrix} \qquad {v}_3=\begin{bmatrix}1 \\ 2 \\ 1 \\ 1\end{bmatrix} $$ form a basis of $\mathbb{R}^4.$

Exercise. Show that every subspace of $\mathbb{R}^n$ has no more than $n$ linearly independent vectors.

Exercise. Find two bases of $\mathbb{R}^4$ that have only the vectors ${e}_3$ and ${e}_4$ in common.

Exercise. Prove that if a list of vectors is linearly independent so is any sublist.

Exercise. Suppose ${v}_1,{v}_2, {v}_3$ and ${v}_1, {v}_2, {v}_4$ are two sets of linearly dependent vectors, and suppose that ${v}_1$ and ${v}_2$ are linearly independent. Prove that any set of three vectors chosen from ${v}_1, {v}_2, {v}_3, {v}_4$ is linearly dependent.

Exercise. If ${u}$ and ${v}$ are linearly independent vectors in $\mathbb{V}$, prove that the vectors $a{u}+b{v}$ and $c{u}+d{v}$ are also linearly independent if and only if $ad-bc\neq 0.$

Exercise. Let $U$ be the collection of vectors that satisfy the equations $x+y+z=0$ and $x+2y-z=0.$ Show $U$ is a subspace of $\mathbb{R}^3$, find a basis for $U$, and find $\dim(U).$

Exercise. Let $U$ be the collection of vectors that satisfy the equations $x+y+z=0$, $x+2y-z=0$, and $y-2z=0.$ Show $U$ is a subspace of $\mathbb{R}^3$, find a basis for $U$, and find $\dim(U).$

Exercise. Show that the only subspaces of $\mathbb{R}$ are $\{{0}\}$ and $\mathbb{R}.$

Exercise. Show that the only subspaces of $\mathbb{R}^2$ are $\{{0}\}$, $\mathbb{R}^2$, and any set consisting of all scalar multiples of a nonzero vector. Describe these subspaces geometrically.

Exercise. Determine the various types of subspaces of $\mathbb{R}^3$ and describe them geometrically.

Exercise. For ${b}\neq{0}$, show that the set of solutions of the $n\times m$ linear system $A {x}={b}$ is not a subspace of $\mathbb{R}^m.$

Exercise. Suppose that ${v}_1, {v}_2, \ldots, {v}_n$ are linearly independent in $\mathbb{R}^n.$ Show that if $A$ is an $n\times n$ matrix with $\mathop{rref}(A)=I_n$, then $A{v}_1, A{v}_2, \ldots, A{v}_n$ are also linearly independent in $\mathbb{R}^n.$

Exercise. Let $S=\{v_1, v_2, \ldots, v_s \}$ and $T=\{u_1, u_2, \ldots, u_t \}$ be two sets of vectors in $\mathbb{V}$ where each ${u}_i$, $(i=1,2,\ldots,t)$ is a linear combination of the vectors in $S.$ Show that ${w}=a_1 u_1 + a_2 u_2 + \cdots + a_t u_t $ is a linear combination of the vectors in $S.$

Exercise. Let $S=\{v_1, v_2, \ldots, v_m \}$ be a set of non-zero vectors in a vector space $\mathbb{V}$ such that every vector in $\mathbb{V}$ can be written uniquely as a linear combination of the vectors in $S.$ Prove that $S$ is a basis for $\mathbb{V}.$

Exercise. Find a basis for the solution space of the homogeneous system $(\lambda I_n-A){x}={0}$ for the given $\lambda$ and $A.$

$$\lambda=1, A= \begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & -3 \\ 0 & 1 & 3 \end{bmatrix} $$

$$\lambda=2, A= \begin{bmatrix} -2 & 0 & 0 \\ 0 & -2 & -3 \\ 0 & 4 & 5 \end{bmatrix} $$
