---
title: Week-3 Writeup
tags: [onboarding]
---

# Week-3 Writeup

## Situation 1

### 1.1 Term Similarity

**Question:** If we define "term similarity" based on the cosine of the angle drawn between term-vectors, then term 1 is more similar to term 2 than it is to term 3.

**Answer:** (<span style="color: red; font-weight: bold;">FALSE</span>) Using the cosine similarity function on the matrix in WolframAlpha, the cosine similarity between Term 1 and Term 2 is 0.546, while the cosine similarity between Term 1 and Term 3 is 0.804.

**Explanation:**

Matrix $A$:

$$\begin{bmatrix} 2 & 3 & 0 & 1 \\ 4 & 0 & 2 & 2 \\ 8 & 3 & 2 & 4 \\ 2 & 2 & 5 & 1 \end{bmatrix}$$

- "Term" similarity == cosine similarity

**Cosine Similarity Formula:**

$$\cos(\theta) = \frac{u \cdot v}{|u| \cdot |v|}$$

**Calculating cosine similarity between Term 1 and Term 2:**

- **Dot product:** $(2)(4)+(3)(0)+(0)(2)+(1)(2)=8+0+0+2=10$
- **Magnitude of Term 1:** $$|u| = \sqrt{2^2 + 3^2 + 0^2 + 1^2} = \sqrt{14}$$
- **Magnitude of Term 2:** $$|v| = \sqrt{4^2 + 0^2 + 2^2 + 2^2} = \sqrt{24}$$
- **Cosine similarity:** $$\cos(\theta) = \frac{10}{\sqrt{14} \cdot \sqrt{24}} = \frac{10}{\sqrt{336}} \approx 0.546$$

**Calculating cosine similarity between Term 1 and Term 3:**

- **Dot product:** $(2)(8)+(3)(3)+(0)(2)+(1)(4) = 16+9+0+4 = 29$
- **Magnitude of Term 1:** $$|u| = \sqrt{2^2 + 3^2 + 0^2 + 1^2} = \sqrt{14}$$
- **Magnitude of Term 3:** $$|w| = \sqrt{8^2 + 3^2 + 2^2 + 4^2} = \sqrt{93}$$
- **Cosine similarity:** $$\cos(\theta) = \frac{29}{\sqrt{14} \cdot \sqrt{93}} = \frac{29}{\sqrt{1302}} \approx 0.804$$

**Conclusion:** Since $0.804 > 0.546$, Term 1 is MORE similar to Term 3 than to Term 2, making the statement false.

---

### 1.2 Special Collection

**Question:** There exists a "special collection" of documents which can only be created by linear combinations of all four columns of the matrix.

**Answer:** (<span style="color: red; font-weight: bold;">FALSE</span>) If any column can be written as a linear combination of the other columns, then you don't need all four columns to span the column space. You could use fewer columns; the extra one is redundant.

**Explanation:** The matrix has redundant columns. The fourth column equals half of the first column ($\vec{v}_4 = \frac{1}{2}\vec{v}_1$), so it adds no new information to the column space. When columns are linearly dependent, the column space can be spanned by fewer columns, so three or fewer are enough to generate the entire space.

Since any vector in the column space can be created without using all four columns, there cannot exist a "special collection" that requires all four columns. Any linear combination involving all four can be rewritten using only the independent columns, making the redundant column unnecessary and the claim false.

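
Both claims so far are easy to verify numerically. Below is a minimal sketch (assuming `numpy` is available; the helper name `cosine_similarity` is just illustrative) that recomputes the similarities from 1.1 and checks the column redundancy from 1.2:

```python
import numpy as np

# Matrix A: rows are the term-vectors compared in 1.1,
# columns are the document vectors discussed in 1.2.
A = np.array([
    [2, 3, 0, 1],
    [4, 0, 2, 2],
    [8, 3, 2, 4],
    [2, 2, 5, 1],
], dtype=float)

def cosine_similarity(u, v):
    """cos(theta) = (u . v) / (|u| |v|)."""
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# 1.1: Term 1 is closer to Term 3 than to Term 2.
print(cosine_similarity(A[0], A[1]))  # ~0.546 (Term 1 vs Term 2)
print(cosine_similarity(A[0], A[2]))  # ~0.804 (Term 1 vs Term 3)

# 1.2: column 4 is half of column 1, so the column space
# is spanned by only three columns.
print(np.allclose(A[:, 3], 0.5 * A[:, 0]))  # True
print(np.linalg.matrix_rank(A))             # 3
```
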
---

### 1.3 Linear Independence

**Question:** The columns of this matrix are all linearly independent.

**Answer:** (<span style="color: red; font-weight: bold;">FALSE</span>) Not all columns of the matrix are linearly independent, because the fourth column is equal to half of the first column.

**Explanation:**

Matrix $A$:

$$\begin{bmatrix} 2 & 3 & 0 & 1 \\ 4 & 0 & 2 & 2 \\ 8 & 3 & 2 & 4 \\ 2 & 2 & 5 & 1 \end{bmatrix}$$

$$1 \cdot \mathbf{c}_1 + 0 \cdot \mathbf{c}_2 + 0 \cdot \mathbf{c}_3 + (-2) \cdot \mathbf{c}_4 = \mathbf{0}$$

$$\begin{bmatrix} 2 \\ 4 \\ 8 \\ 2 \end{bmatrix} - 2 \begin{bmatrix} 1 \\ 2 \\ 4 \\ 1 \end{bmatrix} = \begin{bmatrix} 2-2 \\ 4-4 \\ 8-8 \\ 2-2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \end{bmatrix}$$

Since this combination gives the zero vector while not all coefficients are zero, **the columns are linearly dependent**.

---

### 1.4 Invertible

**Question:** This matrix also represents an invertible linear transformation. That is, it does NOT "squash" a 4-dimensional vector into a 3-dimensional space.

**Answer:** (<span style="color: red; font-weight: bold;">FALSE</span>) Because of the linear dependency, the fourth column of the matrix adds nothing new, so the transformation does squash 4-dimensional vectors down into a 3-dimensional subspace.

**Explanation:** Space gets squashed because of the dependency. $A$ is still a $4 \times 4$ matrix, since each column has 4 entries, but because the fourth column is equal to half of the first, there are essentially only 3 independent column vectors.

Mathematically: The matrix maps from $\mathbb{R}^4 \to \mathbb{R}^4$, but since $\mathbf{c}_4 = \frac{1}{2}\mathbf{c}_1$, we have:

$$\text{rank}(A) = 3 < 4$$

The transformation takes 4D vectors as input, but the output (column space) only spans a 3D subspace of $\mathbb{R}^4$, losing one dimension in the process. A transformation that loses a dimension cannot be undone, so the matrix is not invertible.

## Situation 2

### 2.1 Determinant

**Question:** The determinant of $M_3$ is negative because we subtracted a constant value (3) from all the values in matrix $M$. In other words, when you subtract a constant value from all elements in a matrix, the sign of the determinant will change.

**Answer:** (<span style="color: red; font-weight: bold;">FALSE</span>) This is not true because subtracting a constant from all matrix elements can change both the magnitude and the sign of the determinant in unpredictable ways. The determinant of $M_3$ is negative because $\det(M_2)$ was already negative (after subtracting 3 from $M$), and scaling $M_2$ by $\frac{1}{2}$ preserved that negative sign while changing the magnitude.

**Explanation:** Counterexample:

Let $M = \begin{bmatrix} 3 & 1 \\ 1 & 3 \end{bmatrix}$

$$\det(M) = (3)(3) - (1)(1) = 9 - 1 = 8$$

Now subtract 1 from all elements: $M_2 = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$

$$\det(M_2) = (2)(2) - (0)(0) = 4$$

Both determinants are positive, so the sign didn't change at all. This shows that subtracting a constant from all matrix elements doesn't always flip the sign of the determinant; it depends on the specific values in the matrix.

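
The counterexample, and the fact that scaling by a positive constant never flips the sign, can both be checked numerically. A minimal sketch (assuming `numpy` is available):

```python
import numpy as np

# 2.1 counterexample: subtracting a constant need not flip the sign.
M = np.array([[3.0, 1.0],
              [1.0, 3.0]])
M2 = M - 1.0  # subtract 1 from every entry

print(np.linalg.det(M))   # ~8.0 (positive)
print(np.linalg.det(M2))  # ~4.0 (still positive: sign unchanged)

# Scaling by a positive constant rescales the determinant but keeps
# its sign: det(c * A) = c^n * det(A) for an n x n matrix.
M3 = M2 / 2
n = M2.shape[0]
print(np.isclose(np.linalg.det(M3), (0.5 ** n) * np.linalg.det(M2)))  # True
```
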
---

### 2.2 Identity

**Question:** There exists a matrix $I$ so that $MI = M$ and $IM = M$ where $I$ is the identity matrix.

**Answer:** (<span style="color: lightgreen; font-weight: bold;">TRUE</span>) The identity matrix (1s on the diagonal, 0s everywhere else) satisfies $MI = M$ and $IM = M$ for any square matrix $M$. Moreover, since the columns of $M$ are independent and its determinant is not 0, $M$ is also invertible.

**Explanation:** $\det(M) \neq 0 \iff M \text{ is invertible} \iff M^{-1} \text{ exists}$

---

### 2.3 Linear Independence

**Question:** The columns of $M_3$ are all linearly independent.

**Answer:** (<span style="color: lightgreen; font-weight: bold;">TRUE</span>) $M_3$ has linearly independent columns because its determinant is not equal to 0.

**Explanation:**

We transform $M$ to create $M_2$ and then $M_3$, and check the determinant at each step.

Create $M_2$ (subtract 3 from each value):

$$M_2 = \begin{bmatrix} -1 & 2 & -2 & -1 & 1 \\ 2 & -1 & -2 & -1 & 0 \\ -2 & -1 & -2 & 0 & -2 \\ 2 & 1 & -2 & 2 & 1 \\ -2 & 1 & 0 & -1 & -2 \end{bmatrix}$$

$$\det(M_2) = -174 \neq 0$$

Create $M_3$ (divide $M_2$ by 2):

$$M_3 = \begin{bmatrix} -0.5 & 1 & -1 & -0.5 & 0.5 \\ 1 & -0.5 & -1 & -0.5 & 0 \\ -1 & -0.5 & -1 & 0 & -1 \\ 1 & 0.5 & -1 & 1 & 0.5 \\ -1 & 0.5 & 0 & -0.5 & -1 \end{bmatrix}$$

$$\det(M_3) = -5.4375 \neq 0$$

(Floating-point computation reports these as $-173.99999999999997$ and $-5.437499999999999$, which are just round-off of $-174$ and $-5.4375$; note that $-174 / 2^5 = -5.4375$, consistent with $\det(\frac{1}{2}M_2) = (\frac{1}{2})^5 \det(M_2)$ for a $5 \times 5$ matrix.)

Since $\det(M_3) = -5.4375 \neq 0$, we can conclude:

$$\det(M_3) \neq 0 \iff M_3 \text{ is invertible} \iff \text{columns of } M_3 \text{ are linearly independent}$$

Therefore, the columns of $M_3$ are **linearly independent**.

---

### 2.4 Matrix Operations

**Question:** $M_3$ can be created from $M$ using basic matrix operations (matrix subtraction, matrix/scalar multiplication, etc.).

**Answer:** (<span style="color: lightgreen; font-weight: bold;">TRUE</span>) $M_3$ was created by scaling $M_2$ by one half, and $M_2$ was created by subtracting 3 from every value of $M$.

**Explanation:**

Original matrix:

$$M = \begin{bmatrix} 2 & 5 & 1 & 2 & 4 \\ 5 & 2 & 1 & 2 & 3 \\ 1 & 2 & 1 & 3 & 1 \\ 5 & 4 & 1 & 5 & 4 \\ 1 & 4 & 3 & 2 & 1 \end{bmatrix}$$

Create $M_2$ (subtract 3 from each value):

$$M_2 = \begin{bmatrix} -1 & 2 & -2 & -1 & 1 \\ 2 & -1 & -2 & -1 & 0 \\ -2 & -1 & -2 & 0 & -2 \\ 2 & 1 & -2 & 2 & 1 \\ -2 & 1 & 0 & -1 & -2 \end{bmatrix}$$

Create $M_3$ (divide $M_2$ by 2):

$$M_3 = \begin{bmatrix} -0.5 & 1 & -1 & -0.5 & 0.5 \\ 1 & -0.5 & -1 & -0.5 & 0 \\ -1 & -0.5 & -1 & 0 & -1 \\ 1 & 0.5 & -1 & 1 & 0.5 \\ -1 & 0.5 & 0 & -0.5 & -1 \end{bmatrix}$$

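
These steps are easy to reproduce programmatically. A minimal sketch (assuming `numpy` is available) that rebuilds $M_2$ and $M_3$ from $M$ using the basic operations above and re-confirms the determinants from 2.3:

```python
import numpy as np

M = np.array([
    [2, 5, 1, 2, 4],
    [5, 2, 1, 2, 3],
    [1, 2, 1, 3, 1],
    [5, 4, 1, 5, 4],
    [1, 4, 3, 2, 1],
], dtype=float)

M2 = M - 3   # matrix subtraction: take 3 off each entry
M3 = M2 / 2  # scalar multiplication by 1/2

print(np.linalg.det(M2))  # ~ -174.0   (reported as -173.99999999999997)
print(np.linalg.det(M3))  # ~ -5.4375  (reported as -5.437499999999999)

# Consistency check: for a 5x5 matrix, det((1/2) * M2) = (1/2)^5 * det(M2).
print(np.isclose(np.linalg.det(M3), (0.5 ** 5) * np.linalg.det(M2)))  # True

# Nonzero determinant -> full rank -> columns of M3 are independent (2.3).
print(np.linalg.matrix_rank(M3))  # 5
```
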