---
title: Team 2
---
***wonderful, 30/30***
Chris Schmidt, Steven White
### Problem 1
The four points are $v_1=(1,2), v_2=(2,1), v_3=(3,4), v_4=(4,3)$. They are in 2D. We want to reduce dimensionality to one. The result should be four points in $\mathbb {R}^1$: $p_1$, $p_2$, $p_3$, $p_4$.
**Predict the answer**
1. By visual inspection find a hyperplane $H$ that the four points are "closest to."
The hyperplane is $y = x$
2. Find projections of the four points on $H$.
Since each pair of points is symmetric about the line $y=x$, each point projects onto the midpoint of its pair:
$(1.5, 1.5)$ for $(1,2)$ and $(2,1)$,
$(3.5, 3.5)$ for $(3,4)$ and $(4,3)$.
3. Find $p_1$, $p_2$, $p_3$, $p_4$.
$p_1=p_2=\sqrt{1.5^2+1.5^2}=\frac{3}{\sqrt{2}}$ and $p_3 = p_4 = \sqrt{3.5^2 + 3.5^2}=\frac{7}{\sqrt{2}}$
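As a quick numerical check of the prediction, the sketch below (assuming `numpy` is available) projects the four points onto the line $y=x$ and reads off the 1-D coordinates.

```python
import numpy as np

# Four data points as rows of a matrix.
V = np.array([[1, 2], [2, 1], [3, 4], [4, 3]], dtype=float)
u = np.array([1.0, 1.0]) / np.sqrt(2)  # unit vector along the line y = x

p = V @ u               # 1-D coordinates along the line
proj = np.outer(p, u)   # the projected points back in R^2
print(proj)             # rows: (1.5,1.5), (1.5,1.5), (3.5,3.5), (3.5,3.5)
print(p)                # ≈ [2.1213 2.1213 4.9497 4.9497] = 3/sqrt(2), 3/sqrt(2), 7/sqrt(2), 7/sqrt(2)
```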
**Mathematize your approach**
1. $H$ is uniquely defined by a vector ${\bf x}:=(x_1,x_2)$ that we need to find. Let us assume that ${\bf x}$ has unit length. Set up a minimization problem that uses $v_1,v_2,v_3,v_4$ as knowns and ${\bf x}$ as an unknown. This should reflect the fact that $H$ is the hyperplane that the four points are "closest to." You should be able to recognize the mathematical objects that we have studied in this course. Name them. (Add-in: show how to obtain the rotated data shown as red squares in all figures.)
You will end up with
$$\min_{x_1,x_2} F(x_1, x_2)$$
subject to the constraint $x_1^2+x_2^2=1$
Note that the inner product can be written as $v \cdot u = v^Tu = \langle v, u \rangle$.
Subtracting the projection of $v_i$ onto the hyperplane $H$ leaves a residual vector $y_i$ that is orthogonal to $H$; its length is the distance from $v_i$ to $H$.
Let $y_i=v_i-(v_i \cdot x)x=v_i -(v_i^Tx)x$. Then minimize $\sum_{i=1}^4||y_i||^2$
$$=\sum_{i=1}^4(v_i-(v_i^T x)x)^T(v_i-(v_i^Tx)x)$$
$$=\sum_{i=1}^4(v_{i}^Tv_i - 2(v_i^Tx)(v_i^T x) + (v_i^T x)^2(x^T x)), \text{ noting } x^T x = 1 \text{ here.}$$
$$ =\sum_{i=1}^4 (v_{i}^T v_i - 2(v_i^Tx)^2 + (v_i^Tx)^2) $$
$$ =\sum_{i=1}^4 (v_{i}^T v_i - (v_i^Tx)^2) $$
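Before converting this to a maximization, we can sanity-check the algebra numerically: for any unit vector $x$, $\sum_i\|y_i\|^2$ should equal $\sum_i v_i^Tv_i - \sum_i(v_i^Tx)^2$. A minimal sketch (assuming `numpy`):

```python
import numpy as np

V = np.array([[1, 2], [2, 1], [3, 4], [4, 3]], dtype=float)
rng = np.random.default_rng(0)
x = rng.normal(size=2)
x /= np.linalg.norm(x)          # enforce the constraint x1^2 + x2^2 = 1

Y = V - np.outer(V @ x, x)      # residuals y_i = v_i - (v_i^T x) x
lhs = np.sum(Y**2)              # sum of ||y_i||^2
rhs = np.sum(V**2) - np.sum((V @ x)**2)
print(np.isclose(lhs, rhs))    # True
```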
2. Expand $F(x_1, x_2)$ and rewrite your minimization problem as a maximization one.
Since $\sum_{i=1}^4 v_i^T v_i$ is a constant that does not depend on $x$, minimizing $F$ is equivalent to
$$\max_{x_1,x_2} G(x_1, x_2)=\max_{x_1,x_2} \sum_{i=1}^4 (v_i^Tx)^2$$
subject to the constraint $x_1^2+x_2^2=1$
3. Recognize $G(x_1, x_2)$ as an object that we have studied in this course. You might want to explicitly write out $G(x_1, x_2)$
$$G(x_1,x_2)=(v_1^Tx)^2+(v_2^Tx)^2 + (v_3^Tx)^2 + (v_4^Tx)^2$$
$$=\left((1,2)\begin{bmatrix} x_{1} \\x_{2} \end{bmatrix}\right)^2 + \left((2, 1)\begin{bmatrix} x_{1} \\x_{2}\end{bmatrix}\right)^2 + \left((3, 4)\begin{bmatrix} x_{1} \\x_{2}\end{bmatrix}\right)^2 + \left((4, 3)\begin{bmatrix} x_{1} \\x_{2}\end{bmatrix}\right)^2$$
$$=(x_1+2x_2)^2 + (2x_1+x_2)^2 + (3x_1 + 4x_2)^2 + (4x_1 + 3x_2)^2$$
$$=(x_1^2 +4x_1x_2+4x_2^2)+(4x_1^2+4x_1x_2+x_2^2)+(9x_1^2+24x_1x_2+16x_2^2)+(16x_1^2+24x_1x_2+9x_2^2)$$
$$=30x_1^2+56x_1x_2+30x_2^2$$
$$= x^T\begin{bmatrix}
30 & 28\\
28 & 30
\end{bmatrix}x$$
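The expansion can be verified symbolically; a small sketch (assuming `sympy`):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
# G as the sum of squared inner products v_i^T x
G = sum((a*x1 + b*x2)**2 for a, b in [(1, 2), (2, 1), (3, 4), (4, 3)])
print(sp.expand(G))  # 30*x1**2 + 56*x1*x2 + 30*x2**2
```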
4. Write $G(x_1, x_2)$ in a vector-matrix form using ${\bf x}$ and the matrix $M$ that has $v_i$, $i=1,2,3,4$ as its rows
$$\text{ Let } M = \begin{bmatrix}
1 & 2\\
2 & 1\\
3 & 4\\
4 & 3
\end{bmatrix}$$
$$\sum_{i=1}^4(v_i^Tx)^2 = x^TAx, \text{ where $A$ is $M^T M$}$$
$$A = M^T M= \begin{bmatrix}
1 & 2 & 3 & 4\\
2 & 1 & 4 & 3
\end{bmatrix}
\begin{bmatrix}
1 & 2\\
2 & 1\\
3 & 4\\
4 & 3
\end{bmatrix}= \begin{bmatrix}
30 & 28\\
28 & 30
\end{bmatrix}$$
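The same product in `numpy` (a one-line check):

```python
import numpy as np

M = np.array([[1, 2], [2, 1], [3, 4], [4, 3]], dtype=float)
A = M.T @ M   # the symmetric matrix in the quadratic form
print(A)      # [[30. 28.] [28. 30.]]
```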
5. Write the constraint $x_1^2+x_2^2=1$ as a product of two vectors.
$$x\cdot x=1\text{ is equivalent to }x^Tx=1\text{ therefore, } x^Tx - 1= 0$$
6. Use Lagrange multipliers to solve the maximization problem. Google how to differentiate $G(x_1, x_2)$.
$$\text{Let } H(x_1,x_2,\lambda)= G(x_1,x_2) - \lambda(x^Tx - 1).$$
$$\text{Set the partial derivatives of } H \text{ to zero: }\frac{\partial H}{\partial x_1} = 0, \ \frac{\partial H}{\partial x_2} = 0, \ \frac{\partial H}{\partial \lambda} = 0.$$
$$\text{The first two conditions give } 2x^TA - 2 \lambda x^T = 0 \Rightarrow x^TA = \lambda x^T \Rightarrow Ax = \lambda x \ (\text{since } A = A^T),$$
$$\text{and the third recovers the constraint } x^Tx = 1.$$
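The stationarity conditions can also be solved symbolically. A sketch (assuming `sympy`; the symbol `lam` stands in for $\lambda$):

```python
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
G = 30*x1**2 + 56*x1*x2 + 30*x2**2
H = G - lam*(x1**2 + x2**2 - 1)   # the Lagrangian

# Solve dH/dx1 = dH/dx2 = dH/dlam = 0.
sols = sp.solve([sp.diff(H, v) for v in (x1, x2, lam)], [x1, x2, lam], dict=True)
for s in sols:
    print(s, ' G =', G.subs(s))
# Stationary points: x = ±(1,1)/sqrt(2) with lam = 58 (the max)
# and x = ±(-1,1)/sqrt(2) with lam = 2 (the min).
```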
7. Recognize your solutions as a mathematical object heavily studied in this course.
$$\text{Then, } G(x_1,x_2)=x^TAx=x^T\lambda x=\lambda x^Tx=\lambda$$
Then, the maximum of $G(x_1,x_2)$ is the maximum of the eigenvalues of $A$.
*The largest eigenvalue is the one we care about.*
8. Your solution will produce the desired ${\bf x}$. Find a simple matrix multiplication way of obtaining $p_1$, $p_2$, $p_3$, $p_4$.
$$\text{ Find eigenvalues and normalized eigenvectors of }A= M^TM = \begin{bmatrix}
30 & 28\\
28 & 30
\end{bmatrix}$$
$$ \text{det}(A - \lambda I)=0 \implies$$
$$ (30-\lambda)(30-\lambda)- 28^2= 0 \implies$$
$$ \lambda^2-60\lambda+116=0 \implies$$
$$ (\lambda - 58)(\lambda - 2)= 0$$
Then $\lambda_1 = 58$ and $\lambda_2 = 2$
Solving for the eigenvectors gives $v_1 = \begin{bmatrix} 1 \\1 \end{bmatrix}$ and $v_2 = \begin{bmatrix} -1 \\1 \end{bmatrix}$.
Normalizing these eigenvectors gives us a matrix, say $E$ = $\begin{bmatrix}
1/\sqrt{2} & -1/\sqrt{2}\\
1/\sqrt{2} & 1/\sqrt{2}
\end{bmatrix}$. Notice that it is a rotation matrix.
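The eigenpairs can be checked with `numpy` (a sketch; `eigh` is the routine for symmetric matrices and returns eigenvalues in ascending order):

```python
import numpy as np

A = np.array([[30.0, 28.0], [28.0, 30.0]])
evals, evecs = np.linalg.eigh(A)
print(evals)   # [ 2. 58.]
print(evecs)   # columns are unit eigenvectors of A (overall signs may differ from E)
```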
Now we can find the coordinates on our new axis, say $x'$, by multiplying the normalized first eigenvector on the left by our matrix $M$, i.e. computing $Mv_1$:
$$\begin{bmatrix}
p_1\\
p_2\\
p_3\\
p_4
\end{bmatrix}=
Mv_1 = \begin{bmatrix}
1 & 2\\
2 & 1\\
3 & 4\\
4 & 3
\end{bmatrix}
\begin{bmatrix}
\frac{1}{\sqrt{2}}\\
\frac{1}{\sqrt{2}}
\end{bmatrix}=
\begin{bmatrix}
\frac{3}{\sqrt{2}}\\
\frac{3}{\sqrt{2}}\\
\frac{7}{\sqrt{2}}\\
\frac{7}{\sqrt{2}}
\end{bmatrix}
$$
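The whole projection step in `numpy` (a minimal sketch):

```python
import numpy as np

M = np.array([[1, 2], [2, 1], [3, 4], [4, 3]], dtype=float)
v1 = np.array([1.0, 1.0]) / np.sqrt(2)   # top eigenvector of M^T M
p = M @ v1                               # 1-D coordinates p_1..p_4
print(p)                                 # ≈ [2.1213 2.1213 4.9497 4.9497] = [3 3 7 7]/sqrt(2)
print(np.outer(p, v1))                   # back in R^2: matches the predicted projections
```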
### Problem 2
#### Repeat the work (except the predictions part) for the new set of data:
$x=[4.1, -5.7, 1.4, -13.4, -7.6, -8.2, 5.2, -0.1, -11.6, -0.1]$;
$y=[-0.7, -0.5, 0.9, 0.1, 0.4, 0.8, 0.2, 0.6, 0.7, 1.2]$;
Show how to obtain the rotated data shown as red squares in all figs.
This data forms 10 points $v_1=(4.1, -0.7), v_2=(-5.7, -0.5), v_3=(1.4, 0.9), v_4=(-13.4, 0.1), v_5=(-7.6, 0.4),$
$v_6=(-8.2, 0.8), v_7=(5.2, 0.2), v_8=(-0.1, 0.6), v_9=(-11.6, 0.7), \text{ and }v_{10}=(-0.1, 1.2)$.
When plotted in $\mathbb{R}^2$, they appear as in the plot below.

These points are in 2D. We want to reduce the dimensionality to 1D; the result should be ten points in $\mathbb{R}^1$: $p_1$ through $p_{10}$.
Set up a minimization problem that uses $v_1, \ldots, v_{10}$ as knowns and ${\bf x}$ as an unknown. $H$ is the hyperplane that the ten points are “closest to.” Visually, it is hard to tell where that hyperplane should lie.
When we solve our problem, we will end up with $\min_{x_1, x_2}F(x_1, x_2)$ subject to the constraint $x_1^2 + x_2^2 = 1$.
- Following the work done in Problem 1, we note that subtracting the projection of $v_i$ onto the hyperplane $H$ leaves a residual vector $y_i$ orthogonal to $H$ whose length is the distance from $v_i$ to $H$.
Let $y_i=v_i-(v_i \cdot x)x=v_i -(v_i^Tx)x$. Then minimize $\sum_{i=1}^{10}||y_i||^2$
$$=\sum_{i=1}^{10}(v_i-(v_i^T x)x)^T(v_i-(v_i^Tx)x)$$
$$=\sum_{i=1}^{10}(v_{i}^Tv_i - 2(v_i^Tx)(v_i^T x) + (v_i^T x)^2(x^T x)), \text{ noting } x^T x = 1 \text{ here.}$$
$$=\sum_{i=1}^{10} (v_{i}^T v_i - 2(v_i^Tx)^2 + (v_i^Tx)^2)$$
$$=\sum_{i=1}^{10}(v_{i}^T v_i - (v_i^Tx)^2)$$
- From the previous problem we have $\max_{x_1, x_2}G(x_1, x_2) = \max \sum(v_i^Tx)^2$ subject to the constraint $x_1^2 + x_2^2 = 1$.
- Write the constraint $x_1^2+x_2^2=1$ as a product of two vectors.
$$x\cdot x=1\text{ is equivalent to }x^Tx=1\text{ therefore, } x^Tx - 1= 0$$
- From the expansion of $G(x_1, x_2)$ we have $x^T \begin{bmatrix} 517.44 & -16.96\\-16.96 & 4.69 \end{bmatrix} x$, which is our quadratic form $517.44x_1^2-33.92x_1x_2+4.69x_2^2$ (the matrix is derived below as $N^TN$).
- Use Lagrange multipliers to solve the maximization problem.
$$\text{Let } H(x_1,x_2,\lambda)= G(x_1,x_2) - \lambda(x^Tx - 1)$$
$$\text{and set the partial derivatives of } H \text{ to zero: }\frac{\partial H}{\partial x_1} = 0, \ \frac{\partial H}{\partial x_2} = 0, \ \frac{\partial H}{\partial \lambda} = 0.$$
$$\text{Then we arrive at } 2x^TA - 2 \lambda x^T = 0 \Rightarrow x^TA = \lambda x^T \Rightarrow Ax = \lambda x.$$
$$\text{Then, } G(x_1,x_2)=x^TAx=x^T\lambda x=\lambda x^Tx=\lambda$$
And the maximum of $G(x_1,x_2)$ is the maximum of the eigenvalues of $A$.
$$\text{ Create a matrix with our data, } N = \begin{bmatrix}
4.1 & -0.7\\
-5.7 & -0.5\\
1.4 & 0.9\\
-13.4 & 0.1\\
-7.6 & 0.4\\
-8.2 & 0.8\\
5.2 & 0.2\\
-0.1 & 0.6\\
-11.6 & 0.7\\
-0.1 & 1.2
\end{bmatrix}$$
$$\text{ Now we also have } N^T = \begin{bmatrix}
4.1 & -5.7 & 1.4 & -13.4 & -7.6 & -8.2 & 5.2 & -0.1 & -11.6 & -0.1\\
-0.7 & -0.5 & 0.9 & 0.1 & 0.4 & 0.8 & 0.2 & 0.6 & 0.7 & 1.2
\end{bmatrix}$$
- Using the form from Problem 1, we can create a new symmetric matrix $A$ that represents our function $G(x_1, x_2)$ as $\sum_{i=1}^{10}(v_i^Tx)^2 = x^TAx = x^T(N^TN)x$.
To find $A$, compute $N^TN$:
$$A = \begin{bmatrix}
4.1 & -5.7 & 1.4 & -13.4 & -7.6 & -8.2 & 5.2 & -0.1 & -11.6 & -0.1\\
-0.7 & -0.5 & 0.9 & 0.1 & 0.4 & 0.8 & 0.2 & 0.6 & 0.7 & 1.2
\end{bmatrix}
\begin{bmatrix}
4.1 & -0.7\\
-5.7 & -0.5\\
1.4 & 0.9\\
-13.4 & 0.1\\
-7.6 & 0.4\\
-8.2 & 0.8\\
5.2 & 0.2\\
-0.1 & 0.6\\
-11.6 & 0.7\\
-0.1 & 1.2
\end{bmatrix}$$
$$A = \begin{bmatrix} 517.44 & -16.96\\-16.96 & 4.69 \end{bmatrix}$$
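The same computation in `numpy`:

```python
import numpy as np

x = np.array([4.1, -5.7, 1.4, -13.4, -7.6, -8.2, 5.2, -0.1, -11.6, -0.1])
y = np.array([-0.7, -0.5, 0.9, 0.1, 0.4, 0.8, 0.2, 0.6, 0.7, 1.2])
N = np.column_stack([x, y])   # ten data points as rows

A = N.T @ N
print(A)                      # ≈ [[517.44 -16.96] [-16.96   4.69]]
```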
- Again, following the lead from the problem above, we want to find the eigenvalues and eigenvectors of $A$.
Solving $\det(A-\lambda I) = 0$ we find
$\lambda_1 \approx 518.00$ and $\lambda_2 \approx 4.1296$, with unit eigenvectors $u_1 = \begin{bmatrix}0.99945461\\ -0.03302242\end{bmatrix}$ and $u_2 = \begin{bmatrix}0.03302242\\ 0.99945461\end{bmatrix}$, each determined only up to an overall sign. (The second component of $u_1$ is negative because the off-diagonal entry of $A$ is negative.)
These come out of the computation already normalized, so our matrix of normalized eigenvectors is $\begin{bmatrix} 0.99945461 & 0.03302242\\ -0.03302242 & 0.99945461 \end{bmatrix}$.
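Checking the eigenpairs numerically (a sketch, assuming `numpy`):

```python
import numpy as np

A = np.array([[517.44, -16.96], [-16.96, 4.69]])
evals, evecs = np.linalg.eigh(A)   # ascending eigenvalues
print(evals)        # ≈ [  4.1296 518.0004]
print(evecs[:, 1])  # top eigenvector ≈ ±[0.9995, -0.0330]
```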
- Note that this matrix is a rotation matrix that we can use to find the points in $\mathbb{R}^1$ that lie on a new axis. The first (principal) eigenvector can be multiplied on the left by the matrix $N$.
$$N*u_1 = \begin{bmatrix}
4.1 & -0.7\\
-5.7 & -0.5\\
1.4 & 0.9\\
-13.4 & 0.1\\
-7.6 & 0.4\\
-8.2 & 0.8\\
5.2 & 0.2\\
-0.1 & 0.6\\
-11.6 & 0.7\\
-0.1 & 1.2
\end{bmatrix}
\begin{bmatrix} 0.99945461\\ -0.03302242 \end{bmatrix}
$$
giving $p_1, \ldots, p_{10} \approx (4.1209,\ -5.6804,\ 1.3695,\ -13.3960,\ -7.6091,\ -8.2219,\ 5.1906,\ -0.1198,\ -11.6168,\ -0.1396)$.
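The full pipeline for Problem 2 in `numpy` (a minimal sketch; the eigenvector sign is pinned by convention, since each eigenvector is determined only up to sign):

```python
import numpy as np

x = np.array([4.1, -5.7, 1.4, -13.4, -7.6, -8.2, 5.2, -0.1, -11.6, -0.1])
y = np.array([-0.7, -0.5, 0.9, 0.1, 0.4, 0.8, 0.2, 0.6, 0.7, 1.2])
N = np.column_stack([x, y])

evals, evecs = np.linalg.eigh(N.T @ N)
u1 = evecs[:, -1]            # eigenvector of the largest eigenvalue
u1 = u1 * np.sign(u1[0])     # pin the sign: first component positive
p = N @ u1                   # the ten 1-D coordinates
print(p)  # ≈ [ 4.1209 -5.6804  1.3695 -13.396 -7.6091 -8.2219  5.1906 -0.1198 -11.6168 -0.1396]
```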
We can see from the plot below that the data points we started with in $\mathbb{R}^2$ (in red) have been projected onto a new axis in $\mathbb{R}^1$ (in blue).
