{%hackmd 5xqeIJ7VRCGBfLtfMi0_IQ %}

# Is this set linearly independent? Case of no

## Problem

Let $S$ be a set of abstract vectors as follows. Determine whether each of them is linearly independent or not and provide your reasons.

1. $S = \left\{ \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}, \begin{bmatrix} -7 \\ 3 \\ 4 \end{bmatrix} \right\}$.
2. $S = \left\{ \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}, \begin{bmatrix} 1 & 1 \\ -1 & -1 \end{bmatrix}, \begin{bmatrix} 2 & 2 \\ 3 & 3 \end{bmatrix}, \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \right\}$.
3. $S = \{x, x+1, x-1\}$.

## Thought

We will use one of the equivalent definitions of linear independence. Recall that a set of vectors $S = \{\bu_1, \ldots, \bu_d\}$ is linearly independent if $c_1\bu_1 + \cdots + c_d\bu_d = \bzero$ for some $c_1,\ldots, c_d\in\mathbb{R}$ only when $c_1 = \cdots = c_d = 0$.

Again, **it is critical to translate the notation into something that you can read**! One way to do so is the following: if $\bzero$ is a linear combination of the elements of $S$ with some coefficients $c_1, \ldots, c_d\in\mathbb{R}$, then the only possibility is $c_1 = \cdots = c_d = 0$. Whenever $S$ has such a property, it is linearly independent.

As a result, to show that a set $S$ is NOT linearly independent, we need to find coefficients $c_1, \ldots, c_d$, not all zero, such that $c_1\bu_1 + \cdots + c_d\bu_d = \bzero$. A set of such coefficients is called a _certificate_ that _witnesses_ the linear dependence of $S$. Sometimes we can find one by staring at the vectors, which is nice. If this does not work, we solve the homogeneous system of equations to get a certificate.
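If staring at the vectors does not reveal a certificate, a computer algebra system can solve the homogeneous system for us. Below is a minimal sketch for part 1, assuming SymPy is available; parts 2 and 3 can be handled the same way once the matrices and polynomials are written in coordinates. Any nonzero null-space vector is a certificate.

```python
from sympy import Matrix

# Stack the vectors of part 1 as the columns of A, so that
# A * c = 0 is exactly c1*u1 + c2*u2 + c3*u3 = 0.
A = Matrix([[-1, -1, -7],
            [ 1,  0,  3],
            [ 0,  1,  4]])

# Every nonzero vector in the null space of A is a certificate.
print(A.nullspace())  # expected: [Matrix([[-3], [-4], [1]])], a multiple of (3, 4, -1)

# An empty list here would mean the set is linearly independent.
```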
## Sample answer

1. Let $\bu_1, \bu_2, \bu_3$ be the vectors in $S$. Suppose, by luck, we have found that $\bu_3$ is a linear combination of $\{\bu_1, \bu_2\}$, namely $\bu_3 = 3\bu_1 + 4\bu_2$. Then we have found the certificate $c_1 = 3$, $c_2 = 4$, and $c_3 = -1$ such that $c_1\bu_1 + c_2\bu_2 + c_3\bu_3 = \bzero$. Otherwise, we may assume that $c_1\bu_1 + c_2\bu_2 + c_3\bu_3 = \bzero$. This equation is equivalent to
    $$
    \begin{aligned}
    -c_1 - c_2 - 7c_3 &= 0, \\
    c_1 \mathbin{\phantom{+}} \phantom{c_2} + 3c_3 &= 0, \\
    c_2 + 4c_3 &= 0. \\
    \end{aligned}
    $$
    Solving the equations gives $c_1 = -3c_3$ and $c_2 = -4c_3$, so taking $c_3 = -1$ again tells us $c_1 = 3$, $c_2 = 4$, and $c_3 = -1$ is a certificate. Therefore, $S$ is linearly dependent.
2. In the space of $2\times 2$ matrices, the zero is $O = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$. Let $A_1, A_2, A_3, A_4$ be the matrices in $S$. We observe that $A_3 = 2.5A_1 - 0.5A_2$. Therefore, $c_1 = 2.5$, $c_2 = -0.5$, $c_3 = -1$, and $c_4 = 0$ is a certificate such that $c_1A_1 + \cdots + c_4A_4 = O$. If this observation is not so obvious, we may assume $c_1A_1 + \cdots + c_4A_4 = O$ for some coefficients $c_1, \ldots, c_4\in\mathbb{R}$. This equation is equivalent to
    $$
    \begin{aligned}
    c_1 + c_2 + 2c_3 + c_4 &= 0, \\
    c_1 + c_2 + 2c_3 \mathbin{\phantom{+}} \phantom{c_4} &= 0, \\
    c_1 - c_2 + 3c_3 \mathbin{\phantom{+}} \phantom{c_4} &= 0, \\
    c_1 - c_2 + 3c_3 \mathbin{\phantom{+}} \phantom{c_4} &= 0. \\
    \end{aligned}
    $$
    Solving the equations gives us at least one solution $c_1 = 2.5$, $c_2 = -0.5$, $c_3 = -1$, and $c_4 = 0$. Therefore, $S$ is linearly dependent.
3. In the space of all polynomials, the zero is the zero polynomial $0$. Let $p_1, p_2, p_3$ be the polynomials in $S$. Suppose $c_1p_1 + c_2p_2 + c_3p_3 = 0$ for some coefficients $c_1, c_2, c_3\in\mathbb{R}$. Comparing the constant terms and the coefficients of $x$, this is equivalent to
    $$
    \begin{aligned}
    c_2 - c_3 &= 0, \\
    c_1 + c_2 + c_3 &= 0. \\
    \end{aligned}
    $$
    Solving the equations forces $c_2 = c_3$ and $c_1 = -2c_3$, so $c_1 = -2$, $c_2 = 1$, $c_3 = 1$ is a certificate. Therefore, $S$ is linearly dependent.

*This note can be found at Course website > Learning resources.*