# ***Linear Algebra Blog Post 1: Linear Independence***

Linear independence is one of the most important properties that can be studied about a set of vectors. To begin, we will formally define what we mean by a linearly independent set of vectors. Let $S$ be the set defined by

$$S = \{\vec v_{1}, \vec v_{2}, \dots, \vec v_{p} \}$$

where

$$\vec v_1 =\begin{bmatrix} a \\ b \\ c \end{bmatrix}, \ \vec v_{2} = \begin{bmatrix} d \\ e \\ f \end{bmatrix}, \ \dots, \ \vec v_{p} = \begin{bmatrix} x \\ y \\ z \end{bmatrix}$$

and $a, b, \dots, z \in \mathbb{R}$. $S$ is said to be a ***linearly independent*** set if no vector $\vec v_{j} \in S$ is a linear combination of the remaining vectors in $S$. For instance, consider the following vectors:

$$ \vec i=\begin{bmatrix} 1 \\ 0 \\ 0 \\ \end{bmatrix}, \ \vec j = \begin{bmatrix} 0 \\ 1 \\ 0 \\ \end{bmatrix}, \ \vec k= \begin{bmatrix} 0 \\ 0 \\ 1 \\ \end{bmatrix}.$$

The vectors above form a linearly independent set because it is *impossible* to generate any one of them using a linear combination of the remaining vectors. Put another way, no matter which vector you pick, it will not be contained within the ***span*** of the other two.

***Geometric View of an Independent Set of Vectors***

Geometrically speaking, since our vectors are in $\mathbb{R}^3$, the statement above is equivalent to saying that no one of the vectors is contained within the plane that holds the other two. We see this in the image below.

![](https://i.imgur.com/kn3hEQ3.png)

**Linear Independence and the Spanning of a Subspace**

Perhaps the most useful application of this concept is the relationship between linear independence and subspaces of $\mathbb{R}^n$, the set of vectors with $n$ entries. By definition, a subspace $H$ of $\mathbb{R}^n$ is a set of vectors in $\mathbb{R}^n$ that contains the zero vector and is closed under vector addition and scalar multiplication.
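These three conditions can be spot-checked numerically. The sketch below (assuming NumPy is available) takes a hypothetical subspace $H = \text{Span}\{\vec u, \vec v\}$ in $\mathbb{R}^3$; the particular vectors `u`, `v` and the helper `in_H` are illustrative choices, not part of the post. A vector $\vec x$ lies in $H$ exactly when adjoining $\vec x$ to the spanning matrix does not increase its rank.

```python
import numpy as np

# A hypothetical subspace H = Span{u, v} in R^3 (illustrative vectors,
# not from the post above).
u = np.array([1.0, 0.0, 1.0])
v = np.array([0.0, 1.0, 1.0])
A = np.column_stack([u, v])

def in_H(x):
    # x is in Span{u, v} exactly when appending x as a new column
    # does not raise the rank of the spanning matrix.
    return np.linalg.matrix_rank(np.column_stack([A, x])) == np.linalg.matrix_rank(A)

# Spot-check the three subspace conditions on sample vectors:
assert in_H(np.zeros(3))             # 1. H contains the zero vector
assert in_H((2*u + 3*v) + (u - v))   # 2. closed under vector addition
assert in_H(5.0 * (2*u + 3*v))       # 3. closed under scalar multiplication
```

Of course, a finite spot-check is not a proof; it only illustrates what the three conditions assert.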
However, by this same criterion, there are infinitely many vectors that can be grouped together to satisfy these three conditions. Consequently, we are naturally interested in the *smallest* set of vectors that can make up a subspace $H$. This is precisely where linear independence comes in: the fewest vectors needed to produce a subspace $H$ (to span $H$, in other words) must form a linearly independent set.

**Exam Question**

Suppose that $H$ is a subspace of $\mathbb{R}^n$. What is the least number of vectors that $H$ must contain if $H$ spans $\mathbb{R}^n$? What can you conclude about this set of vectors?
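To build intuition for why independence matters when spanning, here is a small numerical sketch (again assuming NumPy; the vectors `u`, `v`, `w` are illustrative). A set of column vectors spans $\mathbb{R}^3$ exactly when the matrix they form has rank 3, and a dependent set always falls short of that.

```python
import numpy as np

u = np.array([1, 0, 1])
v = np.array([0, 1, 1])
w = u + v   # dependent: w is a linear combination of u and v

dependent = np.column_stack([u, v, w])
independent = np.column_stack([[1, 0, 0], [0, 1, 0], [0, 0, 1]])  # i, j, k

# A set of column vectors spans R^3 exactly when the rank equals 3.
print(np.linalg.matrix_rank(dependent))    # 2: these three vectors span only a plane
print(np.linalg.matrix_rank(independent))  # 3: i, j, k span all of R^3
```

The dependent set has three vectors but only two "directions," so it spans a plane rather than all of $\mathbb{R}^3$; the independent set $\{\vec i, \vec j, \vec k\}$ spans everything.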