---
title: Team_2_251
---
***score 15/15***
The four points are $v_1=(1,2)$, $v_2=(2,1)$, $v_3=(3,4)$, $v_4=(4,3)$, and they live in $\mathbb{R}^2$. We want to reduce the dimensionality to one, so the result should be four points $p_1$, $p_2$, $p_3$, $p_4$ in $\mathbb{R}^1$.
**Predict the answer**
1. By visual inspection find a hyperplane $H$ that the four points are "closest to."
The straight line $y=x$.
2. Find projections of the four points on $H$.
$(1.5, 1.5)$ is the projection of both $v_1=(1,2)$ and $v_2=(2,1)$ onto $H$.
$(3.5, 3.5)$ is the projection of both $v_3=(3,4)$ and $v_4=(4,3)$ onto $H$.
These can be found geometrically using the midpoint formula: $v_1$ and $v_2$ (and likewise $v_3$ and $v_4$) are reflections of each other across the line $y=x$, so each pair projects to its midpoint.
3. Find $p_1$, $p_2$, $p_3$, $p_4$.
$p_1=(\frac{3 \sqrt{2}}{2}), p_2=(\frac{3 \sqrt{2}}{2}), p_3=(\frac{7 \sqrt{2}}{2}), p_4=(\frac{7 \sqrt{2}}{2})$
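As a quick numerical sanity check of this prediction, here is a minimal `numpy` sketch (the variable names are only illustrative):

```python
import numpy as np

# The four data points, one per row.
V = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0]])

# Unit vector along the guessed line y = x.
u = np.array([1.0, 1.0]) / np.sqrt(2.0)

# Orthogonal projections onto the line (still 2D points).
print(np.outer(V @ u, u))   # ≈ [[1.5 1.5] [1.5 1.5] [3.5 3.5] [3.5 3.5]]

# 1D coordinates along the line: p_i = u · v_i.
print(V @ u)                # ≈ [2.1213 2.1213 4.9497 4.9497] = 3√2/2, 3√2/2, 7√2/2, 7√2/2
```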
**Mathematize your approach**
1. $H$ is uniquely defined by a vector ${\bf x}:=(x_1,x_2)$ that we need to find. Let us assume that ${\bf x}$ has unit length. Set up a minimization problem that uses $v_1,v_2,v_3,v_4$ as knowns and ${\bf x}$ as an unknown. This should reflect the fact that $H$ is the hyperplane that the four points are "closest to." You should be able to recognize the mathematical objects that we have studied in this course. Name them. You will end up with
$$\min_{x_1,x_2} F(x_1, x_2)$$
subject to the constraint $x_1^2+x_2^2=1$
The projection of $v_i$ onto the unit vector ${\bf x}$ is $({\bf x}\cdot v_i){\bf x}$ (an orthogonal projection), so "closest to" means minimizing the sum of the squared norms of the residuals $v_i-({\bf x}\cdot v_i){\bf x}$:
$$\min_{x_1,x_2} F(x_1, x_2) = \min_{x_1,x_2} \sum_{i=1}^n \|v_i-({\bf x} \cdot v_i){\bf x}\|^2$$
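To see concretely what $F$ measures, here is a small `numpy` sketch (the helper name `F` mirrors the notation above and is otherwise just illustrative) that evaluates the objective for a few candidate unit directions:

```python
import numpy as np

V = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])

def F(x):
    """Sum of squared distances from each v_i to the line spanned by the unit vector x."""
    x = x / np.linalg.norm(x)           # enforce the unit-length constraint
    residuals = V - np.outer(V @ x, x)  # rows are v_i - (x · v_i) x
    return np.sum(residuals ** 2)

print(F(np.array([1.0, 0.0])))   # 30.0 : project onto the x1-axis
print(F(np.array([0.0, 1.0])))   # 30.0 : project onto the x2-axis
print(F(np.array([1.0, 1.0])))   # 2.0  : project onto the line y = x (much smaller)
```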
2. Expand $F(x_1, x_2)$ and rewrite your minimization problem as a maximization one
$$\max_{x_1,x_2} G(x_1, x_2)$$
subject to the constraint $x_1^2+x_2^2=1$
$$\min_{x_1,x_2} F(x_1, x_2) = \min_{x_1,x_2} \sum_{i=1}^n \|v_i-({\bf x} \cdot v_i){\bf x}\|^2 =$$
$$\min_{x_1,x_2} \sum_{i=1}^n \|v_i-(x_1v_{i_1}+x_2v_{i_2}){\bf x}\|^2 =$$
$$\min_{x_1,x_2} \sum_{i=1}^n (v_i-({\bf x}\cdot v_i) {\bf x}) \cdot (v_i-({\bf x}\cdot v_i){\bf x}) =$$
$$\min_{x_1,x_2} \sum_{i=1}^n ((v_i \cdot v_i) -2({\bf x}\cdot v_i)(v_i \cdot {\bf x})+({\bf x}\cdot v_i)^2({\bf x} \cdot {\bf x})) =$$
$$\min_{x_1,x_2} \sum_{i=1}^n ((v_i \cdot v_i) -2({\bf x}\cdot v_i)^2+({\bf x}\cdot v_i)^2) =$$
$$\min_{x_1,x_2} \sum_{i=1}^n ((v_i \cdot v_i) -({\bf x}\cdot v_i)^2)$$
Since $\sum_{i=1}^n v_i \cdot v_i$ is a constant that does not depend on ${\bf x}$, minimizing $F(x_1, x_2)$ is equivalent to maximizing
$$\max_{x_1,x_2} \sum_{i=1}^n ({\bf x}\cdot v_i)^2 = \max_{x_1,x_2} G(x_1, x_2)$$
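A quick numerical check of this equivalence, as a `numpy` sketch (function names are illustrative): for any unit vector ${\bf x}$, $F({\bf x})+G({\bf x})$ equals the constant $\sum_i v_i\cdot v_i = 60$.

```python
import numpy as np

V = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])

def F(x):  # sum of squared residuals
    return np.sum((V - np.outer(V @ x, x)) ** 2)

def G(x):  # sum of squared projection coefficients
    return np.sum((V @ x) ** 2)

rng = np.random.default_rng(0)
for _ in range(3):
    x = rng.normal(size=2)
    x /= np.linalg.norm(x)   # random unit vector
    print(F(x) + G(x))       # ≈ 60.0 each time, i.e. sum_i ||v_i||^2
```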
3. Recognize $G(x_1, x_2)$ as an object that we have studied in this course. You might want to explicitly write out $G(x_1, x_2)$
$$G(x_1, x_2) = \sum_{i=1}^n ({\bf x}\cdot v_i)^2 =$$
$$\sum_{i=1}^n (x_1v_{i_1} + x_2v_{i_2})^2=$$
$$(x_1v_{1_1} + x_2v_{1_2})^2 + (x_1v_{2_1} + x_2v_{2_2})^2 + \cdots + (x_1v_{n_1} + x_2v_{n_2})^2$$
This is a Quadratic Form! The specific quadratic form for our example is:
$$(1x_1 + 2x_2)^2 + (2x_1 + 1x_2)^2 + (3x_1 + 4x_2)^2+(4x_1 + 3x_2)^2$$
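Expanding this particular quadratic form (a one-liner in `sympy`, included only as a check) collapses it to $30x_1^2 + 56x_1x_2 + 30x_2^2$, which foreshadows the matrix that appears in the next step.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
G = (x1 + 2*x2)**2 + (2*x1 + x2)**2 + (3*x1 + 4*x2)**2 + (4*x1 + 3*x2)**2
print(sp.expand(G))   # 30*x1**2 + 56*x1*x2 + 30*x2**2
```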
4. Write $G(x_1, x_2)$ in a vector-matrix form using ${\bf x}$ and the matrix $M$ that has $v_i$, $i=1,2,3,4$ as its rows
$$ M = \begin{bmatrix}
1 & 2\\
2 & 1\\
3 & 4\\
4 & 3
\end{bmatrix}$$
$$G({\bf x})={\bf x}^T (M^TM){\bf x}={\bf x}^TA{\bf x}$$
$$(1x_1 + 2x_2)^2 + (2x_1 + 1x_2)^2 + (3x_1 + 4x_2)^2+(4x_1 + 3x_2)^2={\bf x}^T(M^TM){\bf x}$$
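The identity can also be verified numerically; here is a minimal `numpy` sketch (illustrative names) comparing $\sum_i({\bf x}\cdot v_i)^2$ with ${\bf x}^T(M^TM){\bf x}$ for a random unit vector:

```python
import numpy as np

M = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
A = M.T @ M
print(A)                      # [[30. 28.] [28. 30.]]

rng = np.random.default_rng(1)
x = rng.normal(size=2)
x /= np.linalg.norm(x)        # random unit vector

print(np.sum((M @ x) ** 2))   # sum_i (x · v_i)^2
print(x @ A @ x)              # x^T (M^T M) x -- the same number
```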
5. Write the constraint $x_1^2+x_2^2=1$ as a product of two vectors.
$${\bf x} \cdot {\bf x} = {\bf x}^T{\bf x} = 1$$
6. Use Lagrange multipliers to solve the maximization problem. Google how to differentiate $G(x_1, x_2)$.
The Lagrange function is $L({\bf x},\lambda)=G({\bf x})-\lambda({\bf x}^TI{\bf x}-1)$.
Take the derivative with respect to ${\bf x}$ and set it equal to zero:
$$L({\bf x},\lambda)=G({\bf x})-\lambda({\bf x} ^TI{\bf x}-1) = {\bf x}^T (M^TM){\bf x}-\lambda({\bf x}^TI{\bf x}-1)$$
$$L'({\bf x},\lambda)=2{\bf x}^T(M^TM)-2\lambda {\bf x}^TI=0$$
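The differentiation rule used here ($\frac{\partial}{\partial {\bf x}}\,{\bf x}^TA{\bf x}=2{\bf x}^TA$ for symmetric $A$) can be sanity-checked with finite differences; a small `numpy` sketch using arbitrary illustrative values of ${\bf x}$ and $\lambda$:

```python
import numpy as np

M = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
A = M.T @ M
lam = 0.7                        # arbitrary multiplier, only to test the gradient formula
x0 = np.array([0.3, -0.8])       # arbitrary point (need not satisfy the constraint)

def L(x):
    return x @ A @ x - lam * (x @ x - 1.0)

# Gradient from the derivation, 2 x^T (M^T M) - 2 lambda x^T, written as a column vector.
grad_analytic = 2 * A @ x0 - 2 * lam * x0

# Central finite differences as an independent check.
h = 1e-6
grad_fd = np.array([(L(x0 + h * e) - L(x0 - h * e)) / (2 * h) for e in np.eye(2)])

print(np.allclose(grad_analytic, grad_fd))   # True
```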
7. Recognize your solutions as a mathematical object heavily studied in this course.
$$2{\bf x}^T(M^TM)-2\lambda {\bf x}^TI=0$$
$${\bf x}^T(M^TM)-\lambda {\bf x}^T=0$$
$${\bf x}^T(M^TM)=\lambda {\bf x}^T$$
$$[{\bf x}^T(M^TM)]^T=[\lambda {\bf x}^T]^T$$
$$(M^TM)^T{\bf x}=\lambda {\bf x}$$
$$(M^TM){\bf x}=\lambda {\bf x}$$
$\lambda$ is an eigenvalue of $M^TM$, and ${\bf x}$ is a corresponding eigenvector.
$$ M = \begin{bmatrix}
1 & 2\\
2 & 1\\
3 & 4\\
4 & 3
\end{bmatrix}$$
$$M^TM = \begin{bmatrix}
1 & 2 & 3 & 4\\
2 & 1 & 4 & 3
\end{bmatrix}
\begin{bmatrix}
1 & 2\\
2 & 1\\
3 & 4\\
4 & 3
\end{bmatrix}=
\begin{bmatrix}
30 & 28\\
28 & 30
\end{bmatrix}$$
In order to find the eigenvalues of $M^TM$, we solve: $$\det(M^TM-\lambda I)=0$$
$$(30-\lambda)^2 - 28^2=0$$
$$\lambda= 2, 58$$
Since we are maximizing $G({\bf x})$, we want the larger eigenvalue, so $\lambda=58$.
In order to find the corresponding eigenvector we solve:
$$(M^TM){\bf x}=\lambda {\bf x}$$
$$\begin{bmatrix}
30 & 28\\
28 & 30
\end{bmatrix} \begin{bmatrix}
x_1\\
x_2
\end{bmatrix}=58\begin{bmatrix}
x_1\\
x_2
\end{bmatrix}$$
$$30x_1+28x_2=58x_1\\
28x_1+30x_2=58x_2$$
Solving both equations tells us that $x_1=x_2$; combined with the constraint $x_1^2+x_2^2=1$ (taking the positive sign), the corresponding unit eigenvector is $${\bf x}=\begin{bmatrix}
\frac{1}{\sqrt{2}}\\
\frac{1}{\sqrt{2}}
\end{bmatrix}$$
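These hand computations agree with what an eigensolver returns; a minimal check with `numpy` (the sign of the eigenvector is arbitrary):

```python
import numpy as np

M = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
eigvals, eigvecs = np.linalg.eigh(M.T @ M)   # eigh: symmetric input, eigenvalues in ascending order

print(eigvals)          # [ 2. 58.]
print(eigvecs[:, -1])   # eigenvector for the largest eigenvalue: ±[0.7071 0.7071] = ±(1/√2, 1/√2)
```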
8. Your solution will produce the desired ${\bf x}$. Find a simple matrix multiplication way of obtaining $p_1$, $p_2$, $p_3$, $p_4$.
In order to find $p_i$ we must find the magnitude of the orthogonal projection of $v_i$ onto ${\bf x}$.
The projection of $v_i$ onto ${\bf x}$ is $({\bf x}\cdot v_i){\bf x}$, thus $p_i=\|({\bf x}\cdot v_i){\bf x}\|$.
$p_1 = \|({\bf x}\cdot v_1){\bf x}\|=\left\|\frac{3\sqrt{2}}{2}{\bf x}\right\|=\left\|\begin{bmatrix} \frac{3}{2}\\ \frac{3}{2} \end{bmatrix}\right\|=\sqrt{\left(\frac{3}{2}\right)^2 + \left(\frac{3}{2}\right)^2}=\frac{3 \sqrt{2}}{2}$
$p_2 = \|({\bf x}\cdot v_2){\bf x}\|=\left\|\frac{3\sqrt{2}}{2}{\bf x}\right\|=\left\|\begin{bmatrix} \frac{3}{2}\\ \frac{3}{2} \end{bmatrix}\right\|=\sqrt{\left(\frac{3}{2}\right)^2 + \left(\frac{3}{2}\right)^2}=\frac{3 \sqrt{2}}{2}$
$p_3 = \|({\bf x}\cdot v_3){\bf x}\|=\left\|\frac{7\sqrt{2}}{2}{\bf x}\right\|=\left\|\begin{bmatrix} \frac{7}{2}\\ \frac{7}{2} \end{bmatrix}\right\|=\sqrt{\left(\frac{7}{2}\right)^2 + \left(\frac{7}{2}\right)^2}=\frac{7 \sqrt{2}}{2}$
$p_4 = \|({\bf x}\cdot v_4){\bf x}\|=\left\|\frac{7\sqrt{2}}{2}{\bf x}\right\|=\left\|\begin{bmatrix} \frac{7}{2}\\ \frac{7}{2} \end{bmatrix}\right\|=\sqrt{\left(\frac{7}{2}\right)^2 + \left(\frac{7}{2}\right)^2}=\frac{7 \sqrt{2}}{2}$
Since $\|{\bf x}\| = 1$, each $p_i=\|({\bf x}\cdot v_i){\bf x}\|=({\bf x}\cdot v_i)\|{\bf x}\|={\bf x}\cdot v_i$: the magnitude of each orthogonal projection of $v_i$ onto ${\bf x}$ equals the dot product of $v_i$ and ${\bf x}$. Stacking these dot products gives the simple matrix multiplication that produces all four coordinates at once:
$$\begin{bmatrix} p_1\\ p_2\\ p_3\\ p_4 \end{bmatrix} = M{\bf x}$$
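The same computation as a one-line `numpy` sketch:

```python
import numpy as np

M = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
x = np.array([1.0, 1.0]) / np.sqrt(2.0)   # the unit eigenvector found above

print(M @ x)   # ≈ [2.1213 2.1213 4.9497 4.9497] = 3√2/2, 3√2/2, 7√2/2, 7√2/2
```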
9. (Extra Credit) Find a way (by rotation) to compute the points denoted by the red squares in data_toy.pdf. This does not affect any computations above, and is simply extra work to provide visual help.