---
title: svd_team_3
---
Tim Abade, Dawn Myers, Matt Oakes
***score 29.5/30 great work***
**Section 1**
Example 1: Fill in the gaps
$$\begin{bmatrix} 1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\\end{bmatrix}=\begin{bmatrix}1 \\ 1 \\ 1 \\ 1 \\ 1 \\ 1 \end{bmatrix} \begin{bmatrix}1 & 1 & 1 & 1 & 1 & 1\end{bmatrix} $$
Example 2: Fill in the gaps
$\begin{bmatrix} a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\\end{bmatrix}=\begin{bmatrix}1\\1 \\1 \\1 \\1 \\1 \\ \end{bmatrix}\begin{bmatrix}a & a & c & c & e & e \end{bmatrix}$
Example 3(a): Fill in the gaps
$\begin{bmatrix} 1&0\\1&1\\\end{bmatrix}$ cannot be written as a single product of a column and a row: an outer product $\mathbf{u}\mathbf{v}^T$ always has rank 1, but the dimension of the column space of this matrix is 2.
***why dim Col =2 implies "Cannot be done?"***
Example 3(b): Fill in the gaps
$\begin{bmatrix} 1&0\\1&1\\\end{bmatrix}=\begin{bmatrix}1\\ 1\\ \end{bmatrix}\begin{bmatrix}1 & 0\end{bmatrix}+\begin{bmatrix}0\\ 1\\ \end{bmatrix}\begin{bmatrix}0 & 1\end{bmatrix}$
Draw some conclusions.
The columns of this matrix form a basis of $\mathbb{R}^{2}$: they span $\mathbb{R}^{2}$ and are linearly independent, so the matrix has rank 2 and cannot be reduced to a single rank 1 product.
It is also not orthogonally diagonalizable (it is not symmetric), so we cannot use the spectral decomposition to separate this matrix into two matrices of rank 1. We have to create symmetric matrices from $A$.
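These rank observations can be sanity-checked numerically. A minimal sketch using NumPy's `matrix_rank` (a tool assumed here, not part of the assignment):

```python
import numpy as np

# Example 1: the 6x6 matrix of ones is an outer product, so its rank is 1.
ones_matrix = np.ones((6, 6))
print(np.linalg.matrix_rank(ones_matrix))  # 1

# Example 3: this matrix has rank 2, so it cannot be written as a
# single outer product u v^T, which always has rank 1.
A = np.array([[1.0, 0.0], [1.0, 1.0]])
print(np.linalg.matrix_rank(A))  # 2
```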
**Section 2**
1. Find the rank one 2 by 2 matrix $\tilde A$ closest to $A=\begin{bmatrix} 30&28\\28&30\\\end{bmatrix}$ in the Frobenius norm. To solve the arising system of equations use
https://www.wolframalpha.com/input/?i=solve+system+of+equations
$$\tilde{A} = \begin{bmatrix} x & y \\ cx & cy \end{bmatrix}$$
$$F = ||\tilde{A} - A||^2_{F} = (x -30)^2 + (cx - 28)^2 + (y - 28)^2 + (cy - 30)^2 $$
Take the partial derivatives, set them equal to zero, and solve the resulting simultaneous system of equations.
$$\frac{\partial F}{\partial x} = 2(x - 30) + 2c(cx -28)=0$$
$$\frac{\partial F}{\partial y} = 2(y - 28) + 2c(cy - 30)=0$$
$$\frac{\partial F}{\partial c} = 2x(cx-28) + 2y(cy -30)=0 $$
Solving the system of equations, the values of $x, y$ and $c$ giving the $\tilde{A}$ closest to $A$ are:
$$ x = 29, y = 29, c = 1$$
Substituting the values into $\tilde{A}$,
$$\tilde{A} = \begin{bmatrix} 29 & 29 \\ 29 & 29 \end{bmatrix}$$
2. Then compute $E=A-\tilde A$, and write $A$ as a sum of $\tilde A$ and $E$.
$A=\begin{bmatrix} 30&28\\28&30\\\end{bmatrix}$
$$E = A - \tilde{A} = \begin{bmatrix} 30&28\\28&30\\\end{bmatrix} - \begin{bmatrix} 29 & 29 \\ 29 & 29 \end{bmatrix} = \begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix} $$
$$A = \begin{bmatrix} 29 & 29 \\ 29 & 29 \end{bmatrix} + \begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix} $$
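As a cross-check, the same $\tilde A$ and $E$ come out of a truncated SVD. A minimal NumPy sketch (keeping only the largest singular value):

```python
import numpy as np

A = np.array([[30.0, 28.0], [28.0, 30.0]])

# Keep only the largest singular value: by the Eckart-Young theorem this
# gives the closest rank-1 matrix to A in the Frobenius norm.
U, s, Vt = np.linalg.svd(A)
A_tilde = s[0] * np.outer(U[:, 0], Vt[0, :])
E = A - A_tilde

assert np.allclose(A_tilde, [[29, 29], [29, 29]])
assert np.allclose(E, [[1, -1], [-1, 1]])
```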
3. How are $\tilde A$ and $E$ related to the eigenvalues and eigenvectors of $A$? Try to compute $\tilde A$ using eigenvalues and eigenvectors. This is a famous kind of decomposition, called how?
Can this decomposition be obtained for any matrix, e.g. the matrix of Example 3?
Eigenvalues of $A$
$$\lambda_1=58$$ $$\lambda_2=2$$
Eigenvectors of $A$ (orthonormal)
$$P = \begin{bmatrix} 1/\sqrt{2} & -1/\sqrt{2} \\ 1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}$$
$$\tilde{A} = 58\begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2}\end{bmatrix} \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2}\end{bmatrix} = \begin{bmatrix} 29 & 29 \\ 29 & 29 \end{bmatrix} $$
$$E=2\begin{bmatrix} -1/\sqrt{2} \\ 1/\sqrt{2}\end{bmatrix} \begin{bmatrix} -1/\sqrt{2} & 1/\sqrt{2}\end{bmatrix} = \begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix} $$
The spectral decomposition of $A$ is $$\lambda_{1}p_{1}p_{1}^T + \lambda_{2}p_{2}p_{2}^T,$$ which is exactly the sum $\tilde{A} + E$.
Only orthogonally diagonalizable matrices can be factorized using the spectral decomposition. (Theorem 2, Lay: "An $n \times n$ matrix $A$ is orthogonally diagonalizable if and only if $A$ is a symmetric matrix.")
The spectral decomposition writes $A$ as a sum of rank 1 matrices $\lambda_{i}p_{i}p_{i}^T$, one per eigenvalue-eigenvector pair; here these two rank 1 terms are exactly $\tilde{A}$ and $E$.
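The eigenvalue route can be checked with NumPy's symmetric eigensolver `eigh` (a sketch, not part of the assigned Wolfram Alpha workflow):

```python
import numpy as np

A = np.array([[30.0, 28.0], [28.0, 30.0]])

# Spectral decomposition of the symmetric matrix A:
# A = lambda_1 p1 p1^T + lambda_2 p2 p2^T.
eigvals, P = np.linalg.eigh(A)        # ascending order: 2, then 58
p_small, p_large = P[:, 0], P[:, 1]

A_tilde = eigvals[1] * np.outer(p_large, p_large)  # 58 * p1 p1^T
E = eigvals[0] * np.outer(p_small, p_small)        #  2 * p2 p2^T

assert np.allclose(A_tilde, [[29, 29], [29, 29]])
assert np.allclose(E, [[1, -1], [-1, 1]])
assert np.allclose(A_tilde + E, A)
```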
**Section 3**
Use Wolfram Alpha for all computations.
1. Find rank one 2 by 2 matrix $\tilde A$ closest to $A$. Use Frobenius norm.
$A=\begin{bmatrix} 1&0\\1&1\\\end{bmatrix}$
$$\tilde{A} = \begin{bmatrix} x & y \\ cx & cy \end{bmatrix}$$
$$F = ||\tilde{A} - A||^2_{F} = (x -1)^2 + (cx - 1)^2 + y^2 + (cy - 1)^2 $$
Take the partial derivatives, set them equal to zero, and solve the resulting simultaneous system of equations.
$$\frac{\partial F}{\partial x} = 2(x - 1) + 2c(cx -1)=0$$
$$\frac{\partial F}{\partial y} = 2y + 2c(cy -1)=0$$
$$\frac{\partial F}{\partial c} = 2x(cx - 1) + 2y(cy -1)=0$$
$$x = \frac{1}{10}(5 + \sqrt{5}), y = \frac{1}{\sqrt{5}}, c = \frac{1}{2}(1 + \sqrt{5}) $$
Substituting the values into $\tilde{A}$,
$$\tilde{A} = \begin{bmatrix} \frac{\sqrt{5}}{10} + \frac{1}{2} & \frac{\sqrt{5}}{5} \\ \frac{1}{2} + \frac{3\sqrt{5}}{10} & \frac{\sqrt{5}}{10} + \frac{1}{2} \end{bmatrix}$$
$$\tilde{A} \approx \begin{bmatrix}
0.7236068 & 0.4472136\\
1.1708204 & 0.7236068
\end{bmatrix}$$
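The claimed minimizer can be verified by plugging it back into the three partial derivatives above; all should vanish. A short NumPy check (an addition to the write-up, not an assignment step):

```python
import numpy as np

# The solution of the system for A = [[1, 0], [1, 1]].
x = (5 + np.sqrt(5)) / 10
y = 1 / np.sqrt(5)
c = (1 + np.sqrt(5)) / 2

# Partial derivatives of F at (x, y, c); each should be ~0.
dF_dx = 2 * (x - 1) + 2 * c * (c * x - 1)
dF_dy = 2 * y + 2 * c * (c * y - 1)
dF_dc = 2 * x * (c * x - 1) + 2 * y * (c * y - 1)

assert max(abs(dF_dx), abs(dF_dy), abs(dF_dc)) < 1e-10
```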
Can you compute $\tilde A$ and $E$ using eigenvectors and eigenvalues of $A$?
Eigenvalues of A
$$\lambda_{1} = \lambda_{2} = 1$$
Eigenvector of $A$ (the eigenspace is only one-dimensional)
$$\begin{bmatrix} 0 \\ 1 \end{bmatrix}$$
No, we cannot compute $\tilde{A}$ and $E$ from the eigenvalues and eigenvectors of $A$: the matrix is not symmetric and has only one linearly independent eigenvector, so it has no spectral decomposition.
2. Compute the eigenvalues and the unit eigenvectors of $AA^T$ and $A^TA$. What do you notice about the eigenvalues of $A^TA$ and $AA^T$?
$$AA^{T}=\begin{bmatrix} 1&1\\1&2\\\end{bmatrix}, A^{T}A=\begin{bmatrix} 2&1\\1&1\\\end{bmatrix}$$
Eigenvalues for $AA^T$
$$\lambda_{1}=\frac{3}{2} + \frac{\sqrt{5}}{2}$$
$$\lambda_{2}=\frac{3}{2} - \frac{\sqrt{5}}{2}$$
Eigenvalues for $A^TA$
$$\lambda_{1}= \frac{3}{2} + \frac{\sqrt{5}}{2} $$
$$\lambda_{2}=\frac{3}{2} - \frac{\sqrt{5}}{2}$$
The eigenvalues are the same for $A^TA$ and $AA^T$.
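A quick NumPy confirmation (using `eigvalsh`, which applies since both products are symmetric):

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0]])

# A A^T and A^T A are symmetric; both share the eigenvalues (3 +/- sqrt(5)) / 2.
ev_AAT = np.linalg.eigvalsh(A @ A.T)   # ascending order
ev_ATA = np.linalg.eigvalsh(A.T @ A)
expected = np.sort([(3 - np.sqrt(5)) / 2, (3 + np.sqrt(5)) / 2])

assert np.allclose(ev_AAT, ev_ATA)
assert np.allclose(ev_AAT, expected)
```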
3. Let $\lambda_1$ be the largest eigenvalue, and let ${\bf u_1}$ be the unit eigenvector of $AA^T$ corresponding to $\lambda_1$, let ${\bf v_1}$ be the unit eigenvector of $A^TA$ corresponding to $\lambda_1$. Let $\sigma_1$ be the square root of $\lambda_1$. Compute $$A_1=\sigma_1{\bf u_1}{\bf v_1}^T$$
Eigenvectors for $AA^T$
$$\begin{bmatrix} 0.52573111 & -0.85065081 \\ 0.85065081 & 0.52573111 \end{bmatrix}$$
Eigenvectors for $A^TA$ (displayed numerically)
$$\begin{bmatrix} 0.85065081 & -0.52573111 \\ 0.52573111 & 0.85065081 \end{bmatrix}$$
$$\sigma_1 = \sqrt{\frac{3}{2}+ \frac{\sqrt{5}}{2}} = 1.61803399$$
$$u_{1} = \begin{bmatrix} 0.52573111 \\ 0.85065081 \end{bmatrix}$$
$$v_1^T = \begin{bmatrix} 0.85065081 & 0.52573111 \end{bmatrix}$$
$$A_1=\sigma_1{\bf u_1}{\bf v_1}^T$$
$$A_1 \approx \begin{bmatrix}
0.7236068 & 0.4472136\\
1.1708204 & 0.7236068
\end{bmatrix}$$
4. Compare $\tilde A$ and $A_1$
In our case $\tilde A = A_1$.
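The construction of $A_1$ from the eigenpairs of $A^TA$ can be reproduced in NumPy; one subtlety worth noting is that eigensolvers fix eigenvector signs only up to $\pm 1$, so taking $u_1 = Av_1/\sigma_1$ keeps the signs of $u_1$ and $v_1$ consistent (a sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0]])

# v1: unit eigenvector of A^T A for the largest eigenvalue.
lam, V = np.linalg.eigh(A.T @ A)   # ascending eigenvalues
sigma1 = np.sqrt(lam[-1])
v1 = V[:, -1]
u1 = A @ v1 / sigma1               # sign-consistent with v1

A1 = sigma1 * np.outer(u1, v1)

# A1 coincides with the leading rank-1 term of the SVD of A.
U, s, Vt = np.linalg.svd(A)
assert np.allclose(A1, s[0] * np.outer(U[:, 0], Vt[0, :]))
assert np.allclose(sigma1, (1 + np.sqrt(5)) / 2)
```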
5. Prove that for any non-zero vectors ${\bf u}$ and ${\bf v}$, the matrix ${\bf u}{\bf v}^T$ has rank 1.
Let ${\bf u} = \begin{bmatrix}u_1 \\ u_2 \\ \vdots \\ u_m \end{bmatrix}$ and ${\bf v}^T = \begin{bmatrix}v_1 & v_2 & \cdots & v_n \end{bmatrix}$, then
$${\bf u}{\bf v}^T = \begin{bmatrix}u_1v_1 & u_1v_2 & \cdots & u_1v_n \\ u_2v_1 & u_2v_2 & \cdots & u_2v_n \\ \vdots & \vdots & \ddots & \vdots \\ u_mv_1 & u_mv_2 & \cdots & u_mv_n \end{bmatrix}$$
Row $i$ is $u_i{\bf v}^T$, a scalar multiple of the non-zero row vector ${\bf v}^T$. Since ${\bf u}\neq{\bf 0}$, at least one row is non-zero, and all rows are multiples of ${\bf v}^T$, so row reduction leaves exactly one pivot.
Hence the row space is one-dimensional and the matrix has rank 1.
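A numerical illustration of the claim for rectangular shapes (vectors chosen arbitrarily for the example):

```python
import numpy as np

# Outer product of two non-zero vectors of different sizes:
# every row of u v^T is u_i * v^T, so the rank is 1.
rng = np.random.default_rng(0)
u = rng.standard_normal(5)   # non-zero vector in R^5
v = rng.standard_normal(3)   # non-zero vector in R^3

M = np.outer(u, v)
assert M.shape == (5, 3)
assert np.linalg.matrix_rank(M) == 1
```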
6. Suppose you have an $m\times n$ matrix $M$. Suggest a linear algebra procedure for obtaining the matrix $M_1$, which is the closest rank 1 matrix to $M$ in Frobenius norm.
The procedure for obtaining the matrix $M_{1}$ involves the singular value decomposition of $M$.
$M = U\Sigma V^{T}$, where $\Sigma$ is the diagonal matrix of singular values of $M$ (the square roots of the eigenvalues of $M^{T}M$) in decreasing order, with the nonzero values in the upper left corner and 0's elsewhere. The columns of $V$ are the corresponding unit eigenvectors of $M^{T}M$, and the columns of $U$ are the normalized vectors $Mv_{i}/\sigma_{i}$.
If we denote
$$U_{r} = \begin{bmatrix}
u_{1} & \cdots & u_{r} \end{bmatrix}, \quad \Sigma_{r} = \mathrm{diag}(\sigma_{1},\dots,\sigma_{r}), \quad V_{r} = \begin{bmatrix} v_{1} & \cdots & v_{r} \end{bmatrix}$$
Then we have
$$M = U_{r}\Sigma_{r} V_{r}^{T} = \sum_{i=1}^{r} \sigma_{i}u_{i}v_{i}^{T}.$$
The nearest rank 1 matrix is then
$$M_{1} =\sigma_1{\bf u_1}{\bf v_1}^T.$$
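The whole procedure fits in a few lines of NumPy; the helper name `closest_rank_one` below is ours, not from the assignment:

```python
import numpy as np

def closest_rank_one(M):
    """Closest rank-1 matrix to M in the Frobenius norm,
    built from the leading singular triple of the SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return s[0] * np.outer(U[:, 0], Vt[0, :])

# Reproduces both worked examples above.
assert np.allclose(closest_rank_one(np.array([[30.0, 28.0], [28.0, 30.0]])),
                   [[29, 29], [29, 29]])
assert np.allclose(closest_rank_one(np.array([[1.0, 0.0], [1.0, 1.0]])),
                   [[0.7236068, 0.4472136],
                    [1.1708204, 0.7236068]], atol=1e-6)
```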