I begin with 2-by-2 and 3-by-3 linear systems of equations, then discuss consistent and inconsistent systems of linear equations and work through several examples. The general system of linear equations is defined, and I demonstrate how to parametrize the solution set of a linear system. I also explain augmented matrices, row operations, and when two linear systems of equations are called equivalent.
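As a quick illustration of the 2-by-2 case mentioned above (not code from the article itself), here is a minimal Python sketch that solves a 2-by-2 system by Cramer's rule; the function name and the sample numbers are my own choices. A zero determinant signals that the system is either inconsistent or has infinitely many solutions, which ties into the consistent/inconsistent distinction.

```python
def solve_2x2(a, b, c, d, e, f):
    """Solve the system  a*x + b*y = e,  c*x + d*y = f  by Cramer's rule.

    Returns (x, y) when the system has a unique solution, or None when
    the determinant a*d - b*c is zero (the system is then inconsistent
    or has infinitely many solutions, and the set must be parametrized).
    """
    det = a * d - b * c
    if det == 0:
        return None
    x = (e * d - b * f) / det
    y = (a * f - e * c) / det
    return x, y

print(solve_2x2(2, 3, 1, -1, 8, -1))  # unique solution: (1.0, 2.0)
print(solve_2x2(1, 2, 2, 4, 3, 6))    # dependent equations: None
```

The same idea scales to 3-by-3 systems, though in practice row reduction on the augmented matrix is the cleaner route.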
There is the definition of the determinant, and there are ways of computing it. The definition involves patterns and their signatures, and that approach is useful in proofs; to compute a determinant in practice, however, cofactor expansion is more convenient. In this article, I detail both approaches.
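To make the practical side concrete, here is a small, illustrative Python sketch (mine, not the article's) of cofactor expansion along the first row, applied recursively down to the 1-by-1 base case.

```python
def det(m):
    """Determinant by cofactor expansion along the first row.

    m is a square matrix given as a list of lists; the (0, j) cofactor
    carries the sign (-1)**j and multiplies the determinant of the minor
    obtained by deleting row 0 and column j.
    """
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                      # -2
print(det([[2, 0, 1], [1, 3, -1], [0, 5, 2]]))    # 27
```

The definition via patterns and signatures sums over all n! patterns, so it is rarely used for computation; cofactor expansion lets zeros in a row or column kill whole terms.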
In this article, I begin with the fact that orthonormal vectors are linearly independent and thus form a basis for the subspace they generate. After that, I explore orthogonal projection and the properties of the orthogonal complement. Towards the end, I detail the Pythagorean Theorem, the Cauchy-Schwarz Inequality, the Law of Cosines, and the Triangle Inequality.
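A minimal Python sketch of two of these ideas (my own illustration, with made-up vectors): projecting a vector onto a line and checking that the residual is orthogonal, plus a numerical instance of the Cauchy-Schwarz Inequality.

```python
def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def proj(u, v):
    """Orthogonal projection of v onto the line spanned by u (u nonzero)."""
    c = dot(u, v) / dot(u, u)
    return [c * a for a in u]

u, v = [1.0, 0.0, 1.0], [2.0, 3.0, 4.0]
p = proj(u, v)
r = [a - b for a, b in zip(v, p)]   # component of v in the orthogonal complement of span(u)
print(p, dot(u, r))                  # residual is orthogonal to u: dot product is 0.0

# Cauchy-Schwarz: <u, v>^2 <= <u, u> * <v, v>
print(dot(u, v) ** 2 <= dot(u, u) * dot(v, v))  # True
```

Decomposing v as p + r, with p in the span and r in its orthogonal complement, is also exactly the setup for the Pythagorean Theorem: ||v||^2 = ||p||^2 + ||r||^2.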
In this article, I give an elementary introduction to subspaces. After that, I motivate the concepts of linear independence, spanning set, and basis. Then, I prove that a basis of a subspace represents each of its vectors uniquely. Towards the end, I explore the dimension of a subspace. You will find many examples and exercises.
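As an illustration of the uniqueness claim (a sketch of mine, not the article's code): in R^2, finding a vector's coordinates in a basis amounts to solving a linear system, and a nonzero determinant guarantees exactly one answer. The function name and example vectors are hypothetical.

```python
def coords_in_basis(b1, b2, v):
    """Coordinates (c1, c2) of v in the basis {b1, b2} of R^2.

    Solves c1*b1 + c2*b2 = v by Cramer's rule. Returns None when
    b1 and b2 are linearly dependent, i.e. when they fail to form
    a basis and no unique representation exists.
    """
    det = b1[0] * b2[1] - b2[0] * b1[1]
    if det == 0:
        return None
    c1 = (v[0] * b2[1] - b2[0] * v[1]) / det
    c2 = (b1[0] * v[1] - v[0] * b1[1]) / det
    return c1, c2

print(coords_in_basis([1, 1], [1, -1], [4, 2]))  # unique coordinates (3.0, 1.0)
print(coords_in_basis([1, 2], [2, 4], [1, 1]))   # dependent vectors: None
```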
I discuss the kernel of a linear transformation and its basic properties, and then the image of a linear transformation and its basic properties. After that, I investigate the Rank-Nullity Theorem, which combines the dimension of the image (the rank) and the dimension of the kernel (the nullity) into a single beautiful equation.
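The Rank-Nullity Theorem says that for a linear map from R^n, rank + nullity = n. Here is a minimal Python sketch of mine (the matrix is a made-up example) that computes the rank by Gaussian elimination and reads off the nullity as n minus the rank.

```python
def rank(m, eps=1e-9):
    """Rank of m (a list of lists of floats) via Gaussian elimination on a copy."""
    m = [row[:] for row in m]
    rows, cols = len(m), len(m[0])
    r = 0  # number of pivots found so far
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(m[i][c]) > eps), None)
        if pivot is None:
            continue  # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(rows):
            if i != r and abs(m[i][c]) > eps:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# A maps R^4 -> R^3; its third row is the sum of the first two, so rank is 2.
A = [[1.0, 2.0, 0.0, 1.0],
     [0.0, 1.0, 1.0, 0.0],
     [1.0, 3.0, 1.0, 1.0]]
print(rank(A), len(A[0]) - rank(A))  # rank 2, nullity 2, and 2 + 2 = 4
```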