{%hackmd 5xqeIJ7VRCGBfLtfMi0_IQ %}
# Is this set linearly independent? Case of yes
## Problem
For each of the following sets $S$ of abstract vectors, determine whether $S$ is linearly independent and provide your reasons.
1. $S = \left\{
\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},
\begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix},
\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}
\right\}$.
2. $S = \left\{
\begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix},
\begin{bmatrix} 1 & 2 \\ 4 & 0 \end{bmatrix},
\begin{bmatrix} 1 & 3 \\ 9 & 0 \end{bmatrix},
\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}
\right\}$.
3. $S = \{1, x-1, (x-1)^2\}$.
## Thought
We will use one of the equivalent definitions of linear independence. Recall that a set of vectors $S = \{\bu_1, \ldots, \bu_d\}$ is linearly independent if $c_1\bu_1 + \cdots + c_d\bu_d = \bzero$ holds for some $c_1,\ldots, c_d\in\mathbb{R}$ only when $c_1 = \cdots = c_d = 0$. Again, **it is critical to translate the notation into something that you can read**! One way to do so is the following: if $\bzero$ is written as a linear combination of the elements of $S$ with some coefficients $c_1, \ldots, c_d\in\mathbb{R}$, then the only possibility is $c_1 = \cdots = c_d = 0$. Whenever $S$ has this property, it is linearly independent.
As a result, to show that a set $S$ is linearly independent, we first assume $c_1\bu_1 + \cdots + c_d\bu_d = \bzero$ for some coefficients $c_1,\ldots, c_d\in\mathbb{R}$. Our goal is then to show (by whatever method) that $c_1 = \cdots = c_d = 0$.
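In concrete examples, the assumption $c_1\bu_1 + \cdots + c_d\bu_d = \bzero$ unwinds into a homogeneous system of linear equations in $c_1, \ldots, c_d$, so a computer algebra system can double-check the work. Below is a minimal sketch in Python with SymPy (the choice of tool is ours, not part of the problem), using the vectors from Problem 1:

```python
import sympy as sp

c1, c2, c3 = sp.symbols('c1 c2 c3')

# The vectors of Problem 1.
u1 = sp.Matrix([1, 0, 0])
u2 = sp.Matrix([1, 1, 0])
u3 = sp.Matrix([1, 1, 1])

# Assume c1*u1 + c2*u2 + c3*u3 = 0; each entry of the sum
# gives one linear equation in c1, c2, c3.
equations = list(c1*u1 + c2*u2 + c3*u3)

print(sp.solve(equations, [c1, c2, c3]))
# {c1: 0, c2: 0, c3: 0} -- the only possibility, so S is linearly independent
```

If the solver instead returned a family of solutions with nonzero coefficients, the set would be linearly dependent.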
## Sample answer
1. Let $\bu_1, \bu_2, \bu_3$ be the vectors in $S$. Suppose $c_1\bu_1 + c_2\bu_2 + c_3\bu_3 = \bzero$ for some coefficients $c_1, c_2, c_3\in\mathbb{R}$. This equation is equivalent to
$$
\begin{aligned}
c_1 + c_2 + c_3 &= 0, \\
c_2 + c_3 &= 0, \\
c_3 &= 0. \\
\end{aligned}
$$
The third equation gives $c_3 = 0$. By the second equation we then have $c_2 = 0$, and by the first equation $c_1 = 0$. Therefore, $S$ is linearly independent.
2. In the space of $2\times 2$ matrices, the zero is $O = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$. Let $A_1, A_2, A_3, A_4$ be the matrices in $S$. Suppose $c_1A_1 + \cdots + c_4A_4 = O$ for some coefficients $c_1, \ldots, c_4\in\mathbb{R}$. This equation is equivalent to
$$
\begin{aligned}
c_1 + c_2 + c_3 \mathbin{\phantom{+}} \phantom{c_4} &= 0, \\
c_1 + 2c_2 + 3c_3 \mathbin{\phantom{+}} \phantom{c_4} &= 0, \\
c_1 + 4c_2 + 9c_3 \mathbin{\phantom{+}} \phantom{c_4} &= 0, \\
c_4 &= 0. \\
\end{aligned}
$$
The last equation gives us $c_4 = 0$. Subtracting the first equation from the second and third gives $c_2 + 2c_3 = 0$ and $3c_2 + 8c_3 = 0$, which force $c_2 = c_3 = 0$, and then the first equation gives $c_1 = 0$. Therefore, $S$ is linearly independent.
3. In the space of all polynomials, the zero is the zero polynomial $0$. Let $p_1, p_2, p_3$ be the polynomials in $S$. Suppose $c_1p_1 + c_2p_2 + c_3p_3 = 0$ for some coefficients $c_1, c_2, c_3\in\mathbb{R}$. By expanding each polynomial and comparing the coefficients of $1$, $x$, and $x^2$ with those of the zero polynomial (which are all zero), we have
$$
\begin{aligned}
c_1 - c_2 + c_3 &= 0, \\
c_2 - 2c_3 &= 0, \\
c_3 &= 0. \\
\end{aligned}
$$
By solving the equations we have $c_1 = c_2 = c_3 = 0$, leading to the conclusion that $S$ is linearly independent.
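All three answers can be double-checked in the same spirit: pick coordinates for each element of $S$ (the entries of a vector, the entries of a flattened matrix, or the coefficients of a polynomial), stack the coordinate vectors as the columns of a matrix $A$, and verify that the homogeneous system has only the trivial solution, i.e., that the null space of $A$ is trivial. Here is a minimal sketch in Python with SymPy (again, the tool is our own choice):

```python
import sympy as sp

def is_independent(coord_vectors):
    # Stack the coordinate vectors as the columns of A.  The set is
    # linearly independent exactly when A c = 0 forces c = 0, i.e.,
    # when the null space of A is trivial.
    A = sp.Matrix.hstack(*coord_vectors)
    return A.nullspace() == []

# 1. Vectors in R^3 are already coordinate vectors.
s1 = [sp.Matrix([1, 0, 0]), sp.Matrix([1, 1, 0]), sp.Matrix([1, 1, 1])]

# 2. Flatten each 2x2 matrix into a coordinate vector in R^4.
s2 = [M.reshape(4, 1) for M in (sp.Matrix([[1, 1], [1, 0]]),
                                sp.Matrix([[1, 2], [4, 0]]),
                                sp.Matrix([[1, 3], [9, 0]]),
                                sp.Matrix([[0, 0], [0, 1]]))]

# 3. Represent each polynomial by its coordinates in the basis 1, x, x^2.
x = sp.symbols('x')

def coords(p):
    # Coefficients of p with respect to the basis 1, x, x^2.
    return sp.Matrix([sp.Poly(p, x).coeff_monomial(x**k) for k in range(3)])

s3 = [coords(p) for p in (sp.Integer(1), x - 1, (x - 1)**2)]

for name, s in (('1.', s1), ('2.', s2), ('3.', s3)):
    print(name, is_independent(s))  # all three print True
```

Flattening a matrix or listing polynomial coefficients is simply taking coordinates with respect to a basis, which is why the same null space test applies in all three spaces.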
*This note can be found at Course website > Learning resources.*