---
title: svd_team_2, Chris Schmidt, Bryten Ives
---
***great work. 30/30***
**Section 1**
> [no a's on the left in Example 1]
Example 1: Fill in the gaps
$\begin{bmatrix} 1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\1&1&1&1&1&1\\\end{bmatrix}=\begin{bmatrix}1\\1\\1\\1\\1\\1\end{bmatrix}[~\ 1~~\ 1~~\ 1~~\ 1~~\ 1 ~~\ 1 ~ ]$
Example 2: Fill in the gaps
$\begin{bmatrix} a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\a&a&c&c&e&e\\\end{bmatrix}=\begin{bmatrix}1\\1\\1\\1\\1\\1\end{bmatrix}[~\ a~~\ a~~\ c ~~\ c~~\ e~~\ e ~ ]$
Example 3(a): Fill in the gaps
$\begin{bmatrix} 1&0\\1&1\\\end{bmatrix}=\begin{bmatrix}\ ? \\1 \end{bmatrix}[~\ 1~~\ ?~~ ]$ No solution is possible.
This cannot be solved because the rank of $A$ is $2$. Since the outer product $u_1v_1^T$ is a rank-one matrix, a single outer product is insufficient to create a rank-two matrix; at least one more rank-one matrix must be added.
Example 3(b): Fill in the gaps
$\begin{bmatrix} 1&0\\1&1\\\end{bmatrix}=\begin{bmatrix}\ 1\\1 \end{bmatrix}[~\ 1~~\ 1~~ ]+\begin{bmatrix}\ 1 \\ 0 \end{bmatrix}[~\ 0~~\ -1~~]$
Draw some conclusions.
The rank of the matrix being decomposed is at most the sum of the ranks of the pieces in the decomposition.
A matrix of rank two therefore requires a sum of at least two rank-one matrices.
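These conclusions can be checked numerically. A minimal sketch using NumPy (an assumption; the notebook environment would also work), verifying Example 3(b):

```python
import numpy as np

# Example 3(b): write the rank-2 matrix [[1,0],[1,1]] as a sum of two
# rank-1 outer products u1 v1^T + u2 v2^T.
piece1 = np.outer([1, 1], [1, 1])    # [[1,1],[1,1]], rank 1
piece2 = np.outer([1, 0], [0, -1])   # [[0,-1],[0,0]], rank 1
A = piece1 + piece2

print(A)                             # [[1 0] [1 1]]
print(np.linalg.matrix_rank(A))      # 2 -- two rank-1 pieces were needed
```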
**Section 2**
1. Find rank one 2 by 2 matrix $\tilde A$ closest to $A$. Use Frobenius norm. To solve the arising system of equations use
https://www.wolframalpha.com/input/?i=solve+system+of+equations
The Frobenius norm $||A - \tilde{A}||_F$ is the square root of the sum of the squared differences between the entries of $A$ and $\tilde{A}$. Here we solve for $\tilde{A}$ by writing it as a rank-one matrix of unknowns, such as $\begin{bmatrix} a&a\\cb&cb\\\end{bmatrix}$, where $c$ is a scalar multiplying $b$.
First, we take the squared norm of the difference $A - \tilde{A}$ and set each partial derivative equal to zero to find the minimal distance. Solving for the entries $a, b$ of $\tilde{A}$ (which is again a rank-one matrix), we find $a=29$, $b=29$.
Then compute $E$= $A-\tilde{A}$, and write $A$ as a sum of $\tilde{A}$ and $E$.
$$E = A - \tilde{A} = \begin{bmatrix} 30&28\\28&30\\\end{bmatrix} - \begin{bmatrix} 29&29\\29&29\\\end{bmatrix}$$
$$E = \begin{bmatrix} 1&-1\\-1&1\\\end{bmatrix}$$
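The calculus answer can be cross-checked against the SVD; a sketch using NumPy (an assumption), where the rank-one truncation of the SVD gives the closest rank-one matrix:

```python
import numpy as np

# Closest rank-1 matrix to A in Frobenius norm, via the truncated SVD.
A = np.array([[30.0, 28.0], [28.0, 30.0]])
U, s, Vt = np.linalg.svd(A)
A_tilde = s[0] * np.outer(U[:, 0], Vt[0, :])
E = A - A_tilde

print(A_tilde)  # [[29. 29.] [29. 29.]]
print(E)        # [[ 1. -1.] [-1.  1.]]
```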
2. How are $\tilde A$ and $E$ related to the eigenvalues and eigenvectors of $A?$ Try to compute $\tilde A$ using eigenvalues and eigenvectors. This is a famous kind of decomposition, called what? The spectral decomposition.
Can this decomposition be obtained for any matrix? It can be obtained for any symmetric (and hence square) matrix.
Eigenvalues of $A$ are $\lambda=58, \lambda=2$
Corresponding normalized eigenvectors are
$$v_1 = \begin{bmatrix} 1/\sqrt{2}\\1/\sqrt{2}\\\end{bmatrix}, \quad v_2 = \begin{bmatrix} -1/\sqrt{2}\\1/\sqrt{2}\\\end{bmatrix}$$ respectively.
The Frobenius norm of the matrix $A$ is $\sqrt{30^2 + 28^2 + 28^2 + 30^2} = \sqrt{3368} = \sqrt{58^2 + 2^2} \approx 58.03$, the square root of the sum of the squared eigenvalues, and hence close to (but not equal to) the largest eigenvalue. In fact $\tilde{A} = \lambda_1 v_1 v_1^T$ and $E = \lambda_2 v_2 v_2^T$.
$$A= \lambda_1 v_1 v_1^T+\lambda_2 v_2 v_2^T$$
$$\text{ then } A= 58 \cdot \begin{bmatrix}1/\sqrt{2}\\ 1/\sqrt{2} \end{bmatrix}\begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2}\end{bmatrix} + 2\cdot\begin{bmatrix} -1/\sqrt{2}\\ 1/\sqrt{2}\end{bmatrix} \begin{bmatrix} -1/\sqrt{2}& 1/\sqrt{2}\end{bmatrix}$$
$$=58\cdot\begin{bmatrix} 1/2&1/2\\1/2&1/2\end{bmatrix} + 2\cdot\begin{bmatrix} 1/2&-1/2\\-1/2&1/2\end{bmatrix} = \begin{bmatrix} 30&28\\28&30\\\end{bmatrix}$$
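A quick numerical check of this spectral decomposition, sketched with NumPy (an assumption):

```python
import numpy as np

# Spectral decomposition of the symmetric matrix A:
# A = lambda_1 v1 v1^T + lambda_2 v2 v2^T.
A = np.array([[30.0, 28.0], [28.0, 30.0]])
lam, V = np.linalg.eigh(A)  # eigh: symmetric solver, eigenvalues ascending

recon = lam[0] * np.outer(V[:, 0], V[:, 0]) + lam[1] * np.outer(V[:, 1], V[:, 1])
print(lam)    # [ 2. 58.]
print(recon)  # reproduces A
```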
**Section 3**
**Here is a link to our Colab notebook with some of the computations used for this assignment: https://colab.research.google.com/drive/1qT4XihL0U-_Z00loFBceVQMQ_O5R5JTe?usp=sharing**
Use Wolfram Alpha for all computations
1. Find rank one 2 by 2 matrix $\tilde A$ closest to $A$. Use Frobenius norm. What is $||A - \tilde{A}||_F$?
$$A=\begin{bmatrix} 1&0\\1&1\\\end{bmatrix}$$
$||A||_F = \sqrt{1^2 + 0^2 + 1^2 + 1^2} = \sqrt{3} \approx 1.7320508$
$||A - \tilde{A}||_F= ||\begin{bmatrix} 1&0\\1&1\\\end{bmatrix} - \begin{bmatrix} 0.7236068&0.4472136\\1.17082039&0.7236068\\\end{bmatrix}||_F = ||\begin{bmatrix} 0.2763932&-0.4472136\\-.17082039&0.2763932\\\end{bmatrix}||_F$
$=\sqrt{0.2763932^2 + (-0.4472136)^2 + (-.17082039)^2 + 0.2763932^2}=0.6180339887498948$
The Eckart–Young theorem states that if $A = U\Sigma V^T$ is the SVD of $A$ and $A_k = U_k\Sigma_k V_k^T$ is its rank-$k$ truncation, then for any rank-$k$ matrix $B$,
$$||A - U_k\Sigma_kV_k^T||_F = \sqrt{\sum_{i=k+1}^n \sigma_i^2} \leq ||A-B||_F.$$
Solve $\tilde{A}=u_1 \cdot \sigma_1 \cdot v_1^T$
$$= \begin{bmatrix}\frac{1+\sqrt{5}}{2\sqrt{.25(1+\sqrt{5})^2 +(1+.5(1+\sqrt{5}))^2}} & \frac{1-\sqrt{5}}{2\sqrt{.25(1-\sqrt{5})^2 +(1+.5(1-\sqrt{5}))^2}}\\ \frac{1+.5(1+\sqrt{5})}{\sqrt{.25(1+\sqrt{5})^2+(1+.5(1+\sqrt{5}))^2}} & \frac{1+.5(1-\sqrt{5})}{\sqrt{.25(1-\sqrt{5})^2 +(1+.5(1-\sqrt{5}))^2}}\end{bmatrix} \begin{bmatrix}\sqrt{\frac{3+\sqrt{5}}{2}} & 0\\0 & 0 \end{bmatrix} \begin{bmatrix}\frac{1+\sqrt{5}}{2\sqrt{1+.25(1+\sqrt{5})^2}} & \frac{1-\sqrt{5}}{2\sqrt{1+.25(1-\sqrt{5})^2}}\\\frac{1}{\sqrt{1+.25(1+\sqrt{5})^2}} & \frac{1}{\sqrt{1+.25(1-\sqrt{5})^2}} \end{bmatrix}$$
This is a mess to solve while retaining the solution in exact form, so using Python we generated:
$$\tilde{A} = \begin{bmatrix} 0.7236068&0.4472136\\1.17082039&0.7236068\\\end{bmatrix}$$
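A sketch of that Python computation using NumPy (an assumption): the rank-one truncated SVD produces both $\tilde{A}$ and the minimal error, which equals the discarded singular value $\sigma_2$.

```python
import numpy as np

# Rank-1 truncated SVD of A = [[1,0],[1,1]].
A = np.array([[1.0, 0.0], [1.0, 1.0]])
U, s, Vt = np.linalg.svd(A)
A_tilde = s[0] * np.outer(U[:, 0], Vt[0, :])

print(A_tilde)                       # [[0.7236068 0.4472136] [1.1708204 0.7236068]]
print(np.linalg.norm(A - A_tilde))   # 0.6180339... = sigma_2 (Eckart--Young)
```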
2. Can you compute $\tilde A$ and $E$ using eigenvectors and eigenvalues of $A$?
We can recover $A$ as $A = \tilde{A} + E$, where $\tilde{A}$ is built from the singular value and singular vectors corresponding to the largest singular value, and $E$ is the error.
We have that $E = A-\tilde{A}= \begin{bmatrix} 1&0\\1&1\\\end{bmatrix}- \begin{bmatrix} 0.7236068&0.4472136\\1.17082039&0.7236068\\\end{bmatrix} = \begin{bmatrix} 0.2763932&-0.4472136\\-.17082039&0.2763932\\\end{bmatrix}$
Eigenvalues of $A$ are $\lambda=1$ with multiplicity $2$.
The eigenvector of $A$ corresponding to $\lambda=1$ is $\begin{bmatrix} 0\\1\\ \end{bmatrix}$.
We cannot compute $\tilde{A}$ and $E$ from these values: since $\lambda = 1$ has algebraic multiplicity $2$ but only a one-dimensional eigenspace, $A$ is defective (not diagonalizable), so the spectral decomposition does not apply. Attempting it anyway fails:
$\lambda_1 v_1 v_1^T+\lambda_2 v_2 v_2^T = 1 \cdot \begin{bmatrix}0&0 \\0&1\end{bmatrix} +1 \cdot \begin{bmatrix}0&0 \\0&1\end{bmatrix} = \begin{bmatrix}0&0 \\0&2\end{bmatrix} \neq A.$ This is why the SVD, rather than an eigendecomposition, is needed here.
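The obstruction can be seen numerically; a sketch with NumPy (an assumption) showing that $A$ has only a one-dimensional eigenspace:

```python
import numpy as np

# A = [[1,0],[1,1]]: eigenvalue 1 with algebraic multiplicity 2, but
# A - I = [[0,0],[1,0]] has rank 1, so the nullspace (the eigenspace)
# is 1-dimensional and A is not diagonalizable.
A = np.array([[1.0, 0.0], [1.0, 1.0]])
lam = np.linalg.eigvals(A)

print(lam)                                   # [1. 1.]
print(np.linalg.matrix_rank(A - np.eye(2)))  # 1 => only one eigenvector
```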
2. Compute the eigenvalues and the unit eigenvectors of $AA^T$ and $A^TA$. What do you notice about the eigenvalues of $A^TA$ and $AA^T$?
$AA^T$ $= \begin{bmatrix} 1&1\\1&2\\\end{bmatrix}$
$A^TA$ $= \begin{bmatrix} 2&1\\1&1\\\end{bmatrix}$
Eigenvalues of these matrices are $\lambda_1 = \frac{3+\sqrt{5}}{2}$, and $\lambda_2 = \frac{3-\sqrt{5}}{2}$. We notice that the eigenvalues of $AA^T$ and $A^TA$ are equal.
The corresponding (unnormalized) eigenvectors of $AA^T$ are $\begin{bmatrix} -1+ \sqrt{5}\\2\\\end{bmatrix}$ and $\begin{bmatrix} -1- \sqrt{5}\\2\\\end{bmatrix}$, and those of $A^TA$ are $\begin{bmatrix} 2\\-1+\sqrt{5}\\\end{bmatrix}$ and $\begin{bmatrix} 2\\-1-\sqrt{5}\\\end{bmatrix}$, respectively.
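A numerical sanity check, sketched with NumPy (an assumption), that $AA^T$ and $A^TA$ share the same eigenvalues:

```python
import numpy as np

# For square A, A A^T and A^T A have identical spectra.
A = np.array([[1.0, 0.0], [1.0, 1.0]])
w1 = np.linalg.eigvalsh(A @ A.T)  # eigvalsh: symmetric input, ascending order
w2 = np.linalg.eigvalsh(A.T @ A)

print(w1)  # ~[0.382, 2.618] = [(3-sqrt5)/2, (3+sqrt5)/2]
print(w2)  # identical
```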
3. Let $\lambda_1$ be the largest eigenvalue, and let ${\bf u_1}$ be the unit eigenvector of $AA^T$ corresponding to $\lambda_1$, let ${\bf v_1}$ be the unit eigenvector of $A^TA$ corresponding to $\lambda_1$. Let $\sigma_1$ be the square root of $\lambda_1$. Compute
$$A_1=\sigma_1{\bf u_1}{\bf v_1}^T$$
We have a messy set of computations to do. Using the normalized eigenvector $u_1$ of $AA^T$ and the normalized eigenvector $v_1$ of $A^TA$ (each has squared norm $10-2\sqrt{5}$):
$$A_1=\sqrt{\frac{3+\sqrt{5}}{2}}\cdot \frac{1}{\sqrt{10-2\sqrt{5}}}\begin{bmatrix} -1+ \sqrt{5}\\2\\\end{bmatrix}\cdot \frac{1}{\sqrt{10-2\sqrt{5}}}\begin{bmatrix} 2 & -1+\sqrt{5}\\\end{bmatrix}$$
$$= \frac{1}{10-2\sqrt{5}}\sqrt{\frac{3+\sqrt{5}}{2}}\begin{bmatrix}2\left(\sqrt{5}-1\right)&\left(\sqrt{5}-1\right)^2\\ 4&2\left(\sqrt{5}-1\right)\end{bmatrix}$$
$$A_1 =\begin{bmatrix}0.7236068 & 0.44721359\\1.1708204 & 0.7236068\\ \end{bmatrix}$$
*Final matrix found using Python.
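A sketch of that Python computation with NumPy (an assumption), building $A_1$ directly from the eigen-data rather than calling an SVD routine; here $u_1 = [\sqrt{5}-1,\ 2]^T$ is an eigenvector of $AA^T$ and $v_1 = [2,\ \sqrt{5}-1]^T$ an eigenvector of $A^TA$, both for the shared largest eigenvalue:

```python
import numpy as np

# A_1 = sigma_1 * u1 v1^T, with sigma_1 = sqrt(lambda_1).
lam1 = (3 + np.sqrt(5)) / 2
sigma1 = np.sqrt(lam1)

u1 = np.array([np.sqrt(5) - 1, 2.0]); u1 /= np.linalg.norm(u1)  # eigvec of A A^T
v1 = np.array([2.0, np.sqrt(5) - 1]); v1 /= np.linalg.norm(v1)  # eigvec of A^T A

A1 = sigma1 * np.outer(u1, v1)
print(A1)  # [[0.7236068 0.4472136] [1.1708204 0.7236068]]
```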
The largest eigenvalue of $AA^T$ and of $A^TA$ is the same, so either may be used; its square root is the singular value $\sigma_1$. The singular values are the analog, for non-symmetric (and non-square) matrices, of the eigenvalues in the spectral decomposition.
4. Compare $\tilde A$ and $A_1$
$$A_1 =\begin{bmatrix}0.7236068 & 0.44721359\\1.1708204 & 0.7236068\\ \end{bmatrix}$$
$$\tilde{A} = \begin{bmatrix} 0.7236068&0.4472136\\1.17082039&0.7236068\\\end{bmatrix}$$
$\tilde{A}$ = $A_1$
5. Prove that for any non-zero vectors ${\bf u}$ and ${\bf v}$, the matrix ${\bf u}{\bf v}^T$ has rank 1.
Let $u\in \mathbb{R}^m$ and $v\in \mathbb{R}^n$ with $u,v\neq 0$. Then ${\bf u}{\bf v}^T$ is an $m\times n$ matrix, which may be written as ${\bf u}{\bf v}^T = \begin{bmatrix} u_1\\u_2\\\vdots\\u_m\end{bmatrix}\begin{bmatrix} v_1 & v_2 & \cdots & v_n\end{bmatrix}$
Then the $j$-th column of ${\bf u}{\bf v}^T$ is $v_j{\bf u}$, so every column is a scalar multiple of ${\bf u}$ and $\text{rank}(uv^T)\leq 1$. Since neither $u$ nor $v$ is the zero vector, some entry $u_iv_j$ is non-zero, and thus $\text{rank}(uv^T)=1$.
6. Suppose you have an $m\times n$ matrix $M$. Suggest a linear algebra procedure for obtaining the matrix $M_1$, which is the closest rank $1$ matrix to $M$ in Frobenius norm.
We can use the matrix $M$'s singular value decomposition $M = U\Sigma V^T$. The singular values $\sigma_i$ of $M$ are the square roots of the eigenvalues of $M^TM$, and the closest rank-one matrix is $M_1 = \sigma_1 u_1 v_1^T$, keeping only the largest singular value.
By the Eckart–Young theorem this choice minimizes the error, with $||M - M_1||_F = \sqrt{\sum_{i=2}^r \sigma_i^2}$, where $r$ is the rank of $M$.
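That procedure can be sketched as a short function using NumPy (an assumption); the example matrix below is illustrative, chosen because its singular values are exactly 5 and 3:

```python
import numpy as np

def closest_rank1(M):
    """Closest rank-1 matrix to M in Frobenius norm (Eckart--Young)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return s[0] * np.outer(U[:, 0], Vt[0, :])

# Works for any m x n matrix M.
M = np.array([[3.0, 2.0, 2.0], [2.0, 3.0, -2.0]])
M1 = closest_rank1(M)

print(np.linalg.matrix_rank(M1))  # 1
print(np.linalg.norm(M - M1))     # 3.0 = sigma_2, the discarded singular value
```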