---
tags: Academic
---

# CV HW 2: Decompose Projection

## Problem 1

### Find P

A pair of corresponding points in homogeneous coordinates satisfies:

$$
\newcommand{\norm}[1]{\lVert #1 \rVert}
\newcommand{\bb}[1]{\mathbf{ #1 }}
\begin{align}
\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}
&\sim
\bb{P}
\begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}
\end{align}
$$

where $\sim$ means equality up to scale. We can derive two equations:

$$
\begin{align}
x = \frac
{p_{11} X + p_{12} Y + p_{13} Z + p_{14}}
{p_{31} X + p_{32} Y + p_{33} Z + p_{34}}
,\quad
y = \frac
{p_{21} X + p_{22} Y + p_{23} Z + p_{24}}
{p_{31} X + p_{32} Y + p_{33} Z + p_{34}}
\end{align}
$$

Rearranged in matrix form:

$$
\begin{pmatrix}
X & Y & Z & 1 & 0 & 0 & 0 & 0 & -xX & -xY & -xZ & -x \\
0 & 0 & 0 & 0 & X & Y & Z & 1 & -yX & -yY & -yZ & -y \\
&&&&&&\vdots\\
&&&&&&\vdots\\
&&&&&&\vdots\\
\end{pmatrix}
\begin{pmatrix} p_{11} \\ p_{12} \\ p_{13} \\ \vdots \\ p_{34} \end{pmatrix}
= \bb{0}
$$

This is a homogeneous system $\bb{A}\bb{p} = \bb{0}$. Since there are 12 unknowns but $\bb{p}$ has only 11 degrees of freedom (5 in $\bb{K}$, 3 in $\bb{R}$, 3 in $\bb{T}$), we add one constraint to the system: $\norm{\bb{p}} = 1$. Usually more than 6 correspondences are given, so the system has no exact solution. Instead, we look for the $\bb{p}$ that minimizes $\norm{\bb{Ap} - \bb{0}}$. The solution is the eigenvector of $\bb{A^T A}$ with the minimum eigenvalue; the proof is given in the Appendix. Another way to solve this system is the least-squares method.
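A minimal numpy sketch of the eigenvector route (the function name is illustrative, and `A` stands for the $2N \times 12$ matrix stacked from the rows above):

```python
import numpy as np

def solve_homogeneous(A):
    """Minimize ||A p|| subject to ||p|| = 1.

    The minimizer is the right singular vector of A for the smallest
    singular value, i.e. the eigenvector of A^T A with the smallest
    eigenvalue.
    """
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]  # rows of Vt are the right singular vectors of A
```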
However, the unit-norm constraint cannot be expressed inside the linear system itself, so we use another constraint instead:

$$
p_{11} + p_{12} + \ldots + p_{34} = 1
$$

After adding this constraint, the system becomes:

$$
\begin{pmatrix}
X & Y & Z & 1 & 0 & 0 & 0 & 0 & -xX & -xY & -xZ & -x \\
0 & 0 & 0 & 0 & X & Y & Z & 1 & -yX & -yY & -yZ & -y \\
& & & & & &\vdots\\
& & & & & &\vdots\\
1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 \\
\end{pmatrix}
\begin{pmatrix} p_{11} \\ p_{12} \\ p_{13} \\ \vdots \\ p_{34} \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}
$$

In fact, any constraint that removes one degree of freedom, say $p_{34} = 1$, should be fine. Using $p_{34} = 1$, the system we get is:

$$
\begin{pmatrix}
X & Y & Z & 1 & 0 & 0 & 0 & 0 & -xX & -xY & -xZ \\
0 & 0 & 0 & 0 & X & Y & Z & 1 & -yX & -yY & -yZ \\
&&&&&&\vdots\\
&&&&&&\vdots\\
&&&&&&\vdots\\
\end{pmatrix}
\begin{pmatrix} p_{11} \\ p_{12} \\ p_{13} \\ \vdots \\ p_{33} \end{pmatrix}
=
\begin{pmatrix} x \\ y \\ \vdots \\ \vdots \\ \vdots \end{pmatrix}
$$

In these two non-homogeneous systems, the objective $\min_{\bb{p}} \norm{\bb{Ap} - \bb{b}}$ has the solution $\bb{A}^{†}\bb{b}$, where $†$ denotes the Moore–Penrose pseudoinverse.

### K, R, T

$$
\begin{align}
\bb{P} &= \bb{K} [\bb{R} \mid \bb{T}] = [\bb{KR} \mid \bb{KT}] \\
&= [\bb{P_{3\times3}} \mid \bb{P_4}] = [\bb{M} \mid \bb{P_4}]
\end{align}
$$

That is,

$$
\begin{align}
\bb{M} &= \bb{KR} \\
\bb{P_4} &= \bb{KT}
\end{align}
$$

Since $\bb{K}$ is upper triangular and $\bb{R}$ is orthogonal, we can use **RQ-decomposition** to find $\bb{K}, \bb{R}$:

$$
\begin{align}
\bb{M} &= \bb{\hat R} \bb{\hat Q} = \bb{KR} \\
\therefore \bb{K} &= \bb{\hat R}\\
\bb{R} &= \bb{\hat Q}
\end{align}
$$

We can fall back to **QR-decomposition** in case an RQ-decomposition function is not available (as in MATLAB).
Decompose $\bb{M^{-1}}$ instead, since the inverse of an upper triangular matrix is upper triangular and the inverse of an orthogonal matrix is also orthogonal:

$$
\begin{align}
\bb{M^{-1}} &= \bb{\dot Q \dot R} \\
\bb{M} &= \bb{(\dot Q \dot R)^{-1}} = \bb{\dot{R}^{-1}\dot{Q}^{-1}} = \bb{KR} \\
\therefore \bb{K} &= \bb{\dot{R}^{-1}}\\
\bb{R} &= \bb{\dot{Q}^{-1}}
\end{align}
$$

And $\bb{T}$ is simply:

$$
\bb{T} = \bb{K^{-1} P_4}
$$

### Normalize

Notice that there are some properties that $\bb{K}, \bb{R}, \bb{T}$ must satisfy. In some cases, the $\bb{K}, \bb{R}, \bb{T}$ obtained in the previous step don't have these properties. For example, the focal lengths in $\bb{K}$ can have opposite signs, which is obviously wrong. The cause is the [opposite direction](http://ksimek.github.io/2012/08/14/decompose/) between the 2D coordinate system and the 3D coordinate system. We need to **normalize** the matrices.

#### $\bb{K}$ should have positive diagonal

To negate any negative values on $\bb{K}$'s diagonal, we insert a matrix and its inverse after RQ-decomposing $\bb{M}$:

$$
\begin{align}
\bb{M} &= \bb{KR} = \bb{\hat R \hat Q} = \bb{(\hat R D) (D^{-1} \hat Q)}\\
\end{align}
$$

where $\bb{D}$ is a diagonal sign matrix (entries $\pm 1$) used for normalization:

$$
\bb{D} =
\begin{pmatrix}
\mathrm{sign}({\bb{\hat R}}_{1,1}) & 0 & 0 \\
0 & \mathrm{sign}({\bb{\hat R}}_{2,2}) & 0 \\
0 & 0 & \mathrm{sign}({\bb{\hat R}}_{3,3})
\end{pmatrix}
$$

Since $\bb{D}^{-1} = \bb{D}$, we get:

$$
\begin{align}
\bb{K} &= \bb{\hat R D} \\
\bb{R} &= \bb{D \hat Q}
\end{align}
$$

#### $\bb{K_{3, 3}}$ should be $1$

Divide all the elements of $\bb{K}$ by $\bb{K_{3,3}}$; in numpy:

```python
K = K / K[-1, -1]
```

Be aware that after this operation, $\bb{K}[\bb{R} \mid \bb{T}]$ no longer equals $\bb{P}$, but using $\bb{K}[\bb{R} \mid \bb{T}]$ as the projection matrix gives the same result as using $\bb{P}$. How can this be possible? The reason is that we are operating in a **homogeneous** coordinate system.
Scalar multiplication in homogeneous coordinates has no effect:

$$
w
\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}
\sim
\begin{pmatrix} wx \\ wy \\ w \end{pmatrix}
\sim
\begin{pmatrix} (wx)/w \\ (wy)/w \\ w/w \end{pmatrix}
\sim
\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}
$$

For example, the 2D point $(2, 2)$ in Cartesian coordinates can be written as $(2, 2, 1)$ or $(4, 4, 2)$ in homogeneous coordinates. Notice that we are using $\sim$ instead of $=$, where $\sim$ means equality up to scale.

#### $\bb{R}$ should have determinant 1

A rotation matrix $\bb{R}$ should have determinant $1$, since:

1. A rotation matrix is orthogonal, thus $det(\bb{R}) = +1 \text{ or } -1$
2. Rotating by any angle can be seen as rotating by half the angle twice, $\bb{R}_{\theta} = \bb{R}_{\theta / 2} \bb{R}_{\theta / 2}$, thus $det(\bb{R}_\theta) = det(\bb{R}_{\theta / 2})^2 \ge 0$

So $det(\bb{R}) = 1$. But the $\bb{R}$ we obtain from RQ-decomposition is only guaranteed to be orthogonal. Can we simply negate $\bb{R}$ when $det(\bb{R}) = -1$? No, because $\bb{R}$ does not appear alone in our projection:

$$
\begin{align}
\bb{x}_{3 \times 1}^{2D} &\sim \bb{P} \bb{X}_{4 \times 1}^{3D} \\
&\sim \bb{K} [\bb{R} \mid \bb{T}] \bb{X}_{4 \times 1}^{3D}
\end{align}
$$

We should negate $[\bb{R} \mid \bb{T}]$, or we can negate $\bb{P}$ in the beginning: check whether $\bb{P_{3 \times 3}}$ has positive determinant, and negate $\bb{P}$ if not. The reason we can check $det(\bb{P_{3 \times 3}})$ instead of $det(\bb{R})$ is that $det(\bb{P_{3 \times 3}})$ has the same sign as $det(\bb{R})$:

$$
\begin{align}
det(\bb{P_{3 \times 3}}) = det(\bb{M}) = det(\bb{K}) det(\bb{R}) \ \ where\ \ det(\bb{K}) > 0
\end{align}
$$

### Full Algorithm

1. Compute $\bb{P}$
2. Negate $\bb{P}$ if $\bb{P_{3 \times 3}}$ has negative determinant
3. $\bb{K} = \bb{\hat R D}, \bb{R} = \bb{D \hat Q}$ where $\bb{M} = \bb{\hat R \hat Q}$
4. $\bb{T} = \bb{K^{-1} P_4}$
5. Scale $\bb{K}$ such that $\bb{K_{3, 3}} = 1$

### Projection

$$
\begin{align}
\bb{x}_{3 \times N}^{2D} &\sim \bb{P} \bb{X}_{4 \times N}^{3D} \\
&\sim \bb{K} [\bb{R} \mid \bb{T}] \bb{X}_{4 \times N}^{3D}
\end{align}
$$

## Problem 2

Similar to finding $\bb{P}$, a correspondence of points $\bb{x_{src}}, \bb{x_{dst}}$ satisfies:

$$
\begin{align}
\begin{pmatrix} x_{dst} \\ y_{dst} \\ 1 \end{pmatrix}
&\sim
\begin{pmatrix}
h_{11} & h_{12} & h_{13} \\
h_{21} & h_{22} & h_{23} \\
h_{31} & h_{32} & 1
\end{pmatrix}
\begin{pmatrix} x_{src} \\ y_{src} \\ 1 \end{pmatrix}
\end{align}
$$

Expanding it, we get:

$$
\begin{align}
x_{dst} = \frac
{h_{11} x_{src} + h_{12} y_{src} + h_{13}}
{h_{31} x_{src} + h_{32} y_{src} + 1}
,\quad
y_{dst} = \frac
{h_{21} x_{src} + h_{22} y_{src} + h_{23}}
{h_{31} x_{src} + h_{32} y_{src} + 1}
\end{align}
$$

That is,

$$
\begin{pmatrix}
x_{src} & y_{src} & 1 & 0 & 0 & 0 & -x_{dst} x_{src} & -x_{dst} y_{src} \\
0 & 0 & 0 & x_{src} & y_{src} & 1 & -y_{dst} x_{src} & -y_{dst} y_{src} \\
&&&&\vdots\\
&&&&\vdots\\
&&&&\vdots\\
\end{pmatrix}
\begin{pmatrix} h_{11} \\ h_{12} \\ h_{13} \\ \vdots \\ h_{32} \end{pmatrix}
=
\begin{pmatrix} x_{dst} \\ y_{dst} \\ \vdots \\ \vdots \\ \vdots \end{pmatrix}
$$

This is easily solved as $\bb{A}^{†}\bb{b}$, where $†$ denotes the Moore–Penrose pseudoinverse.
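A minimal numpy sketch of this pseudoinverse solve (the function and array names are illustrative; `src` and `dst` are assumed to be $N \times 2$ arrays of corresponding points):

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate H (with h_33 fixed to 1) by solving the
    non-homogeneous system A h = b with the pseudoinverse."""
    rows, b = [], []
    for (xs, ys), (xd, yd) in zip(src, dst):
        rows.append([xs, ys, 1, 0, 0, 0, -xd * xs, -xd * ys])
        rows.append([0, 0, 0, xs, ys, 1, -yd * xs, -yd * ys])
        b.extend([xd, yd])
    A = np.asarray(rows, dtype=float)
    h = np.linalg.pinv(A) @ np.asarray(b, dtype=float)  # h = A† b
    return np.append(h, 1.0).reshape(3, 3)
```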
### Forward Warping

### Backward Warping

## Appendix

To solve

$$
\min_{\norm{\bb{x}} = 1} \norm{\bb{Ax} - \bb{0}}
= \min_{\norm{\bb{x}} = 1} \norm{\bb{Ax}}
$$

apply the **SVD** to $\bb{A}$:

$$
\begin{align}
&\bb{A} = \bb{U \Sigma V^T} \ where \\
&\bb{U} \text{ is orthogonal}, \\
&\bb{\Sigma} \text{ has the singular values of } \bb{A} \text{ (the square roots of the eigenvalues of } \bb{A^T A} \text{) on its diagonal, with } \sigma_1 \ge \sigma_2 \ge \ldots \ge \sigma_n \gt 0, \\
&\bb{V} \text{ has the eigenvectors } \bb{v_i} \text{ of } \bb{A^T A} \text{ as its column vectors.}
\end{align}
$$

Using the property of orthogonal matrices that $\norm{\bb{Uv}} = \norm{\bb{v}}$ for any vector $\bb{v}$, we get

$$
\begin{align}
&= \min_{\norm{\bb{x}} = 1} \norm{\bb{U \Sigma V^Tx}} \\
&= \min_{\norm{\bb{x}} = 1} \norm{\bb{\Sigma V^T x}} \\
\end{align}
$$

By letting $\bb{y} = \bb{V^T x}$, we have $\norm{\bb{x}} = 1 \to \norm{\bb{y}} = 1$ (because $\bb{V^T}$ is also orthogonal). Since minimizing the norm is equivalent to minimizing its square, the objective is rewritten as:

$$
\begin{align}
&= \min_{\norm{\bb{y}} = 1} \norm{\bb{\Sigma y}}^2 \\
&= \min_{\norm{\bb{y}} = 1} \sigma_1^2 y_1^2 + \sigma_2^2 y_2^2 + \ldots + \sigma_n^2 y_n^2 \\
&= \min_{\norm{\bb{y}} = 1} \lambda_1 y_1^2 + \lambda_2 y_2^2 + \ldots + \lambda_n y_n^2 \\
&where \ \ \lambda_i = \sigma_i^2 \text{ are the eigenvalues of } \bb{A^T A} \text{ and } \lambda_1 \ge \lambda_2 \ge \ldots \ge \lambda_n
\end{align}
$$

This objective reaches its minimum $\lambda_n$ when:

$$
y_1 = y_2 = \ldots = y_{n - 1} = 0, \ y_n = 1
$$

At the same time, $\bb{x}$ can be derived from the previous equation:

$$
\begin{align}
\bb{y} &= \bb{V^T} \bb{x} \\
\bb{x} &= \bb{V} \bb{y} \\
&= \bb{v_1} y_1 + \bb{v_2} y_2 + \ldots + \bb{v_n} y_n \\
&= \bb{v_n}
\end{align}
$$

In conclusion, $\bb{x}$ is the eigenvector of $\bb{A^T A}$ with the minimum eigenvalue.
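As a closing sketch, the Problem 1 decomposition can be put together end-to-end in numpy, using the QR-of-the-inverse trick in place of RQ-decomposition (the function name is illustrative; `P` is assumed to be a $3 \times 4$ projection matrix):

```python
import numpy as np

def decompose_projection(P):
    """Split P ~ K [R | T] into intrinsics, rotation and translation."""
    # Make sure det(P_{3x3}) > 0 so that det(R) = +1 later.
    if np.linalg.det(P[:, :3]) < 0:
        P = -P
    M, P4 = P[:, :3], P[:, 3]
    # RQ-decomposition of M via QR of M^{-1}:
    # M^{-1} = Q' R'  =>  M = R'^{-1} Q'^{-1} = (upper triangular)(orthogonal)
    Q, Rtri = np.linalg.qr(np.linalg.inv(M))
    K, R = np.linalg.inv(Rtri), Q.T
    # Force K's diagonal positive with a diagonal +/-1 sign matrix D
    # (D^{-1} = D, so KR is unchanged).
    D = np.diag(np.sign(np.diag(K)))
    K, R = K @ D, D @ R
    # T = K^{-1} P_4, computed before rescaling K.
    T = np.linalg.solve(K, P4)
    # Scale K so that K_{3,3} = 1.
    return K / K[-1, -1], R, T
```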