# Week 3 Writeup

## Question 1

### Explanation of Term Similarity

Given the matrix \( A \):

$$A = \begin{bmatrix} 2 & 3 & 0 & 1 & 1 \\ 4 & 0 & 2 & 1 & 2 \\ 8 & 3 & 2 & 1 & 4 \\ 2 & 2 & 5 & 2 & 1 \end{bmatrix}$$

### Term Vectors:

- **Term 1:** \([2, 3, 0, 1, 1]\)
- **Term 2:** \([4, 0, 2, 1, 2]\)
- **Term 3:** \([8, 3, 2, 1, 4]\)

### Explanation:

The statement claims that "term 1 is more similar to term 2 than it is to term 3" based on the angle between the vectors. To assess this, we use cosine similarity, which measures the cosine of the angle between two vectors. A higher cosine similarity indicates a smaller angle and thus greater similarity.

Calculating the cosine similarities:

- **Term 1 and Term 2:** \( \frac{11}{\sqrt{15}\,\sqrt{25}} \approx 0.568 \)
- **Term 1 and Term 3:** \( \frac{30}{\sqrt{15}\,\sqrt{94}} \approx 0.799 \)

The similarity between Term 1 and Term 3 is higher, so Term 1 is actually more similar to Term 3 than it is to Term 2. Therefore, the statement is false.

## Question 2

Given the matrix \( A \):

$$ A = \begin{bmatrix} 2 & 3 & 0 & 1 & 1 \\ 4 & 0 & 2 & 1 & 2 \\ 8 & 3 & 2 & 1 & 4 \\ 2 & 2 & 5 & 2 & 1 \end{bmatrix} $$

### Explanation:

- **Matrix Columns:** Each column in matrix \( A \) represents a document in terms of term frequencies.
- **Linear Combinations:** The span of the columns of \( A \) is the set of all vectors that can be formed as linear combinations of these columns.

The statement implies that there exists a collection that can only be formed using all five columns. However, any vector in the span can be formed from fewer than all available vectors whenever the columns are linearly dependent. Since matrix \( A \) has four rows and five columns, its columns cannot all be linearly independent, which means some columns can be expressed as linear combinations of others. Therefore, any vector in the span can be created without necessarily using all five columns.
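The cosine-similarity comparison from Question 1 can be checked numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# Term vectors (rows of A)
t1 = np.array([2, 3, 0, 1, 1])
t2 = np.array([4, 0, 2, 1, 2])
t3 = np.array([8, 3, 2, 1, 4])

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

sim_12 = cosine_similarity(t1, t2)  # ≈ 0.568
sim_13 = cosine_similarity(t1, t3)  # ≈ 0.799

print(sim_12 < sim_13)  # True: term 1 is closer to term 3
```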
### Conclusion:

The statement is false because the ability to create a specific collection does not require using all five columns if they are not linearly independent.

## Question 3

Given the matrix \( A \):

$$ A = \begin{bmatrix} 2 & 3 & 0 & 1 & 1 \\ 4 & 0 & 2 & 1 & 2 \\ 8 & 3 & 2 & 1 & 4 \\ 2 & 2 & 5 & 2 & 1 \end{bmatrix} $$

### Understanding Linear Independence:

Linear independence of a set of vectors (in this case, the columns of the matrix) means that no vector in the set can be written as a linear combination of the others. For the \( n \) columns of an \( m \times n \) matrix to be linearly independent, \( n \) must be less than or equal to \( m \).

### Explanation:

- **Matrix Dimensions:** The matrix \( A \) has dimensions \( 4 \times 5 \), meaning it has four rows and five columns.
- **Linear Independence Criterion:** For all columns to be linearly independent, the number of columns (5) must not exceed the number of rows (4). Since there are more columns than rows, it is impossible for all columns to be linearly independent.

If a matrix has more columns than rows, the columns cannot all be linearly independent: there are not enough dimensions to accommodate each vector independently.

### Conclusion:

The statement that "the columns of this matrix are all linearly independent" is false because the matrix has more columns than rows, which inherently means that some columns must be linearly dependent.

## Question 4

Given the matrix \( A \):

$$A = \begin{bmatrix} 2 & 3 & 0 & 1 & 1 \\ 4 & 0 & 2 & 1 & 2 \\ 8 & 3 & 2 & 1 & 4 \\ 2 & 2 & 5 & 2 & 1 \end{bmatrix}$$

### Understanding Singularity:

A matrix is considered singular if it does not have an inverse. For square matrices, this is determined by calculating the determinant: if the determinant is zero, the matrix is singular. However, for non-square matrices, such as this \( 4 \times 5 \) matrix, the concept of a determinant does not apply.
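The dependence argument from Questions 2 and 3 can be confirmed by computing the rank of \( A \); a quick check, assuming NumPy is available:

```python
import numpy as np

A = np.array([
    [2, 3, 0, 1, 1],
    [4, 0, 2, 1, 2],
    [8, 3, 2, 1, 4],
    [2, 2, 5, 2, 1],
])

# The rank is at most min(4, 5) = 4, so the five columns
# cannot all be linearly independent.
rank = np.linalg.matrix_rank(A)
print(rank)               # 4
print(rank < A.shape[1])  # True: the columns are linearly dependent
```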
### Explanation:

- **Matrix Dimensions:** Matrix \( A \) is not square; it has more columns (5) than rows (4).
- **Determinant Calculation:** Determinants are only defined for square matrices. Therefore, it is not possible to calculate a determinant for matrix \( A \).

Since the determinant cannot be calculated for a non-square matrix, determining singularity through this method is not applicable.

### Conclusion:

The statement that "this is a singular matrix, and this can be determined by calculating the determinant of the matrix" is false because the matrix is not square, and thus a determinant cannot be calculated.

## Question 5

### Situation Overview:

We have a matrix $M$ representing survey responses, where each element is on a scale from 1 to 5:

$$ M = \begin{bmatrix} 2 & 5 & 1 & 2 & 4 \\ 5 & 2 & 1 & 2 & 3 \\ 1 & 2 & 1 & 3 & 1 \\ 5 & 4 & 1 & 5 & 4 \\ 1 & 4 & 3 & 2 & 1 \end{bmatrix} $$

This matrix is transformed into \( M_2 \) by subtracting 3 from each element, resulting in:

$$ M_2 = \begin{bmatrix} -1 & 2 & -2 & -1 & 1 \\ 2 & -1 & -2 & -1 & 0 \\ -2 & -1 & -2 & 0 & -2 \\ 2 & 1 & -2 & 2 & 1 \\ -2 & 1 & 0 & -1 & -2 \end{bmatrix} $$

Further transformation of \( M_2 \) into \( M_3 \) involves dividing each element by 2, resulting in:

$$ M_3 = \begin{bmatrix} -0.5 & 1.0 & -1.0 & -0.5 & 0.5 \\ 1.0 & -0.5 & -1.0 & -0.5 & 0.0 \\ -1.0 & -0.5 & -1.0 & 0.0 & -1.0 \\ 1.0 & 0.5 & -1.0 & 1.0 & 0.5 \\ -1.0 & 0.5 & 0.0 & -0.5 & -1.0 \end{bmatrix} $$

### Determinant Calculation:

- **Determinant of Matrix \( M \):** \( \det(M) = 36.0 \).
- **Determinant of Matrix \( M_3 \):** \( \det(M_3) = -5.4375 \) (exactly \(-174/32\)).

### Question Statement:

"The determinant of $M_3$ is negative *because* we subtracted a constant value (3) from all the values in matrix $M$. In other words, when you subtract a constant value from all elements in a matrix, the sign of the determinant will change."
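The determinant values quoted for Question 5 can be reproduced; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

M = np.array([
    [2, 5, 1, 2, 4],
    [5, 2, 1, 2, 3],
    [1, 2, 1, 3, 1],
    [5, 4, 1, 5, 4],
    [1, 4, 3, 2, 1],
], dtype=float)

M2 = M - 3   # subtract 3 from every element
M3 = M2 / 2  # divide every element by 2

print(np.linalg.det(M))   # ≈ 36.0
print(np.linalg.det(M2))  # ≈ -174.0
print(np.linalg.det(M3))  # ≈ -5.4375 (= -174 / 2**5)
```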
### Explanation:

- **Matrix Transformation:** Subtracting a constant from every element changes the matrix entry-wise; there is no general rule relating the new determinant to the old one, and in particular no rule that the sign must flip. Here \( \det(M_2) = -174 \), but that sign is a consequence of the particular values in \( M \), not of the subtraction as such.
- **Determinant Properties:** Scaling every element of an \( n \times n \) matrix by a factor \( c \) multiplies the determinant by \( c^n \). Dividing \( M_2 \) by \(2\) therefore multiplies the determinant by \( (1/2)^5 = 1/32 \), giving \( \det(M_3) = -174/32 = -5.4375 \). Because \( 1/32 \) is positive, this step changes the magnitude but cannot change the sign.

### Conclusion:

The statement is false because subtracting a constant value from all elements of a matrix does not, in general, change the sign of its determinant. In this example the subtraction happened to produce a matrix \( M_2 \) with a negative determinant, and the division by two merely scaled its magnitude; neither step implements a sign-flipping rule.

## Question 6

We have a matrix $M$ representing survey responses, where each element is on a scale from 1 to 5:

$$ M = \begin{bmatrix} 2 & 5 & 1 & 2 & 4 \\ 5 & 2 & 1 & 2 & 3 \\ 1 & 2 & 1 & 3 & 1 \\ 5 & 4 & 1 & 5 & 4 \\ 1 & 4 & 3 & 2 & 1 \end{bmatrix} $$

### Question Statement:

The statement is "Matrix $M$ is invertible."

### Explanation:

- **Invertibility Criterion:** A square matrix is invertible if and only if its determinant is non-zero. The determinant provides a scalar value that indicates whether the matrix has an inverse.
- **Calculation of Determinant:** The determinant of matrix $M$ is calculated as: $$\text{det}(M) = 36.0$$
- **Interpretation:** Since the determinant of \( M \) is \(36.0\), which is non-zero, matrix $M$ is indeed invertible.

### Conclusion:

The statement that "matrix $M$ is invertible" is true because the determinant of $M$ is non-zero (\(36.0\)), confirming that the matrix has an inverse.
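The invertibility claim can also be confirmed constructively by computing the inverse; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

M = np.array([
    [2, 5, 1, 2, 4],
    [5, 2, 1, 2, 3],
    [1, 2, 1, 3, 1],
    [5, 4, 1, 5, 4],
    [1, 4, 3, 2, 1],
], dtype=float)

# inv() raises LinAlgError for a singular matrix; here it succeeds
# because det(M) = 36 is non-zero.
M_inv = np.linalg.inv(M)

# Multiplying M by its inverse recovers the 5x5 identity matrix.
print(np.allclose(M @ M_inv, np.eye(5)))  # True
```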
## Question 7

We have transformed the original matrix $M$ into matrix $M_3$ through a series of operations. The final matrix $M_3$ is:

$$ M_3 = \begin{bmatrix} -0.5 & 1.0 & -1.0 & -0.5 & 0.5 \\ 1.0 & -0.5 & -1.0 & -0.5 & 0.0 \\ -1.0 & -0.5 & -1.0 & 0.0 & -1.0 \\ 1.0 & 0.5 & -1.0 & 1.0 & 0.5 \\ -1.0 & 0.5 & 0.0 & -0.5 & -1.0 \end{bmatrix} $$

### Question Statement:

The statement is "The columns of $M_3$ are all linearly independent."

### Explanation:

- **Linear Independence Criterion:** A set of vectors (columns, in this case) is linearly independent if no vector can be written as a linear combination of the others.
- **Determinant and Linear Independence:** For a square matrix, a non-zero determinant implies that the columns are linearly independent.
- **Calculation of Determinant:** The determinant of matrix $M_3$ is: $$\text{det}(M_3) = -5.4375$$
- **Interpretation:** Since the determinant of $M_3$ is non-zero \((-5.4375)\), the columns of $M_3$ are indeed linearly independent.

### Conclusion:

The statement that "the columns of $M_3$ are all linearly independent" is true because the determinant of $M_3$ is non-zero, confirming that no column can be expressed as a linear combination of the others.

## Question 8

### Situation Overview:

We start with matrix $M$:

$$ M = \begin{bmatrix} 2 & 5 & 1 & 2 & 4 \\ 5 & 2 & 1 & 2 & 3 \\ 1 & 2 & 1 & 3 & 1 \\ 5 & 4 & 1 & 5 & 4 \\ 1 & 4 & 3 & 2 & 1 \end{bmatrix} $$

The goal is to transform $M$ into matrix $M_3$:

$$ M_3 = \begin{bmatrix} -0.5 & 1.0 & -1.0 & -0.5 & 0.5 \\ 1.0 & -0.5 & -1.0 & -0.5 & 0.0 \\ -1.0 & -0.5 & -1.0 & 0.0 & -1.0 \\ 1.0 & 0.5 & -1.0 & 1.0 & 0.5 \\ -1.0 & 0.5 & 0.0 & -0.5 & -1.0 \end{bmatrix} $$

### Explanation:

**Transformation Steps:**

1.
   **Subtract a Constant:** Subtracting the constant value \(3\) from each element of \( M \) results in matrix \( M_2 \):

   $$ M_2 = M - 3J = \begin{bmatrix} -1 & 2 & -2 & -1 & 1 \\ 2 & -1 & -2 & -1 & 0 \\ -2 & -1 & -2 & 0 & -2 \\ 2 & 1 & -2 & 2 & 1 \\ -2 & 1 & 0 & -1 & -2 \end{bmatrix} $$

   where \( J \) is the \( 5 \times 5 \) all-ones matrix. (Note that subtracting the scaled identity \( 3I \) would change only the diagonal entries, which is not what happens here.)

2. **Scale the Matrix:** Divide each element of \( M_2 \) by \(2\) to obtain \( M_3 \):

   $$ M_3 = \frac{1}{2} M_2 = \begin{bmatrix} -0.5 & 1.0 & -1.0 & -0.5 & 0.5 \\ 1.0 & -0.5 & -1.0 & -0.5 & 0.0 \\ -1.0 & -0.5 & -1.0 & 0.0 & -1.0 \\ 1.0 & 0.5 & -1.0 & 1.0 & 0.5 \\ -1.0 & 0.5 & 0.0 & -0.5 & -1.0 \end{bmatrix} $$

### Conclusion:

The statement that "Matrix $M_3$ can be created from $M$ using simple matrix operations" is true because we can transform $M$ into $M_3$ by subtracting the constant \(3\) from every entry and then scaling the result by \( \tfrac{1}{2} \).
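The two transformation steps, together with the linear-independence claim from Question 7, can be verified in a few lines; a sketch using NumPy broadcasting, assuming NumPy is available:

```python
import numpy as np

M = np.array([
    [2, 5, 1, 2, 4],
    [5, 2, 1, 2, 3],
    [1, 2, 1, 3, 1],
    [5, 4, 1, 5, 4],
    [1, 4, 3, 2, 1],
], dtype=float)

# Both steps broadcast a scalar across every element of the matrix.
M3 = (M - 3) / 2

print(M3[0])  # first row: -0.5, 1.0, -1.0, -0.5, 0.5

# Full rank (5) confirms the columns of M3 are linearly independent.
print(np.linalg.matrix_rank(M3))  # 5
```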