# Least Squares Solutions
## Homogeneous Systems of Linear Equations
In a homogeneous system of linear equations, the constant term in every equation is 0.
Solving such a system can be broken into the following steps:
1. A homogeneous system has the form:
$$
Ax = 0
$$
2. We need to impose a constraint to rule out the trivial solution $x = 0$. A common choice is:
$$
\Vert x \Vert^{2} = 1
$$
3. Our aim is to make $Ax$ as close to 0 as possible with the constraint. This can be converted into an optimization problem
$$
\begin{align}
&\min\limits_{x} \Vert Ax \Vert^{2} \ \text{such that } \Vert x \Vert^{2} = 1 \\
&\Rightarrow \min\limits_{x}\, x^{T}A^{T}Ax \ \text{such that } x^{T}x = 1
\end{align}
$$
4. Then we can define a loss function by [Lagrange multiplier](https://www.youtube.com/watch?v=5A39Ht9Wcu0&t=1s):
$$
\mathcal{L}(x, \lambda) = x^{T}A^{T} Ax - \lambda(x^{T}x - 1)
$$
5. To find the minimum subject to the constraint, we set the partial derivative of $\mathcal{L}$ with respect to $x$ to zero:
$$
\begin{align}
&2A^{T}Ax - 2\lambda x = 0 \\
&\Rightarrow A^{T}Ax = \lambda x
\end{align}
$$
6. We have now converted the problem into finding the eigenvectors of $A^{T}A$. The solution is the **eigenvector** associated with the smallest eigenvalue of $A^{T}A$, which is equivalent to the last column of $V$ in the SVD $A = U\Sigma V^{T}$. For details, see my previous [SVD post](https://hackmd.io/86obEnjVRt-Pd9oRvW4Sfw?view).
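The steps above can be sketched with NumPy. The matrix $A$ below is a made-up random example (any $m \times n$ matrix with $m \ge n$ works the same way); the sketch computes the SVD solution and checks that it agrees with the smallest-eigenvalue eigenvector of $A^{T}A$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))  # hypothetical overdetermined system A x = 0

# Solution via SVD: the last row of V^T (i.e. the last column of V) is the
# right singular vector associated with the smallest singular value.
U, S, Vt = np.linalg.svd(A)
x_svd = Vt[-1]

# Equivalent solution: the eigenvector of A^T A with the smallest eigenvalue.
# eigh returns eigenvalues in ascending order, so take the first column.
eigvals, eigvecs = np.linalg.eigh(A.T @ A)
x_eig = eigvecs[:, 0]

# x_svd satisfies the unit-norm constraint, and the two answers
# agree up to sign.
print(np.linalg.norm(x_svd))            # ~1.0
print(np.allclose(np.abs(x_svd), np.abs(x_eig)))
```

Either route works; the SVD is usually preferred numerically because it avoids forming $A^{T}A$, which squares the condition number of $A$.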
## Non-Homogeneous Systems of Linear Equations
==TODO==