---
title: "University of Tokyo, Graduate School of Information Science and Technology: Past Entrance Exam (Mathematics) Solutions, 2018 Problem 1"
tags: u-tokyo, exam
---
<style>
.markdown-body h1 > span:last-child{
font-size: 18pt;
float: right;
width: 100%;
text-align: right;
border-bottom: 1pt #EEE solid;
padding-bottom: 5pt;
margin-bottom: 5pt
}
.markdown-body h1{
border: none;
}
.markdown-body h2{
padding-top: 8px;
padding-left: 5px;
color: black;
background-color: #F2D7D9;
z-index: 1;
}
.markdown-body h3{
padding: 5px 0 5px 5px;
background-color: #D3CEDF
}
.markdown-body h4{
position: relative;
line-height: 28pt;
padding-left: 5px;
color: #FFFFFF;
font-size: 13pt;
background-color: #9CB4CC;
}
/* Palette #F2D7D9 #D3CEDF #9CB4CC #748DA6 */
</style>
<!-- Probability & Statistics Related -->
$$
\newcommand{\prob}[1]{\mathrm{P}(#1)}
\newcommand{\probsub}[2]{\mathrm{P}_{#1}(#2)}
\newcommand{\func}[2]{\mathrm{#1}(#2)}
\newcommand{\funcsub}[3]{\mathrm{#1}_{#2}(#3)}
\newcommand{\excp}[1]{\mathrm{E}(#1)}
$$
<!-- Linear Algebra Related -->
$$
\require{boldsymbol}
\newcommand{\vec}[1]{\boldsymbol{#1}}
\newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}}
\newcommand{\rank}[1]{\mathrm{rank}(#1)}
\newcommand{\nobj}[3]{#1_1{#3\;} #1_2{#3} \;\ldots\; {#3} #1_#2}
\newcommand{\nobjsuf}[4]{#1_1#3{#4\;} #1_2#3{#4\;} \;\ldots\; {#4\;} #1_#2#3}
\newcommand{\norm}[1]{\| #1 \|}
\newcommand{\trps}[1]{#1^{\moveleft{1.5px}{\raise{2px}{\mathrm{T}}}}}
\newcommand{\inv}[1]{#1^{-1}}
$$
# University of Tokyo, Graduate School of Information Science and Technology: Past Entrance Exam (Mathematics)<br><span style="float: right;">Solutions, 2018 Problem 1</span>
+ These solutions were written unofficially and are not endorsed by the graduate school. No guarantee of correctness is given.
+ The original problem statements are available on the school's website.
+ If you find the solutions satisfactory, a comment would be appreciated.
## 2018 Problem 1
### (1) Let $A = \bmat{ 1 & 0 & -1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 }, \vec{b} = \bmat{ 2 \\ 4 \\ 2 }, \overline{A} = \bmat{ A \mid \vec{b} } = \bmat{ | & | & | & | \\ \vec{a}_1 & \vec{a}_2 & \vec{a}_3 & \vec{a}_4 \\ | & | & | & | }$.
#### (i) Find the maximum number of linearly independent vectors among $\vec{a}_1 \sim \vec{a}_3$.
$\vec{a}_3 = \vec{a}_2 - \vec{a}_1$, so the three vectors are linearly dependent, while $\vec{a}_1 \neq k\vec{a}_2$ for any scalar $k$, so $\vec{a}_1$ and $\vec{a}_2$ are linearly independent. The maximum number of linearly independent vectors is therefore $2$.
#### (ii) Find $x_1, x_2$ for $\vec{a}_4 = x_1\vec{a}_1 + x_2\vec{a}_2 + \vec{a}_3$.
Comparing the first and third components of $\vec{a}_4 - \vec{a}_3 = x_1\vec{a}_1 + x_2\vec{a}_2$ gives $x_1 = 3$ and $x_2 = 1$; the second component ($x_1 + x_2 = 4$) is consistent. Thus:
$$
\vec{a}_4 = 3\vec{a}_1 + 1\vec{a}_2 + \vec{a}_3
$$
#### (iii) Find the maximum number of linearly independent vectors among $\vec{a}_1 \sim \vec{a}_4$.
Since $\vec{a}_4$ can be decomposed using $\vec{a}_1, \vec{a}_2$, and $\vec{a}_3$, it is linearly dependent on them. So the maximum number of linearly independent vectors among them is still $2$.
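As a quick numerical sanity check of part (1), the following sketch (assuming NumPy is available; it is not part of the solution itself) confirms the ranks and the coefficients:
```python
import numpy as np

A = np.array([[1.0, 0.0, -1.0],
              [1.0, 1.0,  0.0],
              [0.0, 1.0,  1.0]])
b = np.array([2.0, 4.0, 2.0])

# (i)/(iii): both A and the augmented matrix [A | b] have rank 2
print(np.linalg.matrix_rank(A))                        # 2
print(np.linalg.matrix_rank(np.column_stack([A, b])))  # 2

# (ii): solve x1*a1 + x2*a2 = a4 - a3 for x1, x2
x, *_ = np.linalg.lstsq(A[:, :2], b - A[:, 2], rcond=None)
print(x)  # [3. 1.]
```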
### (2) Prove that a solution exists for $A\vec{x}= \vec{b}$ when $\rank{\overline{A}} = \rank{A}$, where $A \in \mathcal{R}^{m\times n}, \vec{b} \in \mathcal{R}^m$.
Let $A = \bmat{ | & | & \cdots & | \\ \vec{a}_1 & \vec{a}_2 & \cdots & \vec{a}_n \\ | & | & \cdots & | }$ and $\vec{b} = \vec{a}_{n + 1}$.
If $\rank{\overline{A}} = \rank{A}$,
then appending $\vec{b}$ does not enlarge the column space, so $\vec{a}_{n + 1}$ must be linearly dependent on $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n$,
meaning that there exists a combination of $x_1, x_2, \ldots, x_n$ such that
$x_1\vec{a}_1 + x_2\vec{a}_2 + \ldots + x_n\vec{a}_n = \vec{a}_{n + 1}$, i.e., $\bmat{ | & | & \cdots & | \\ \vec{a}_1 & \vec{a}_2 & \cdots & \vec{a}_n \\ | & | & \cdots & | }\bmat{x_1 \\ x_2 \\ \vdots \\ x_n} = \vec{a}_{n + 1}$. Hence $\vec{x} = \bmat{x_1 \\ x_2 \\ \vdots \\ x_n}$ is a solution of $A\vec{x} = \vec{b}$.
### (3) Find $\vec{x}$ that minimizes the value of $\|{\vec{b} - A\vec{x}}\|^2$, <br>where $A \in \mathcal{R}^{m \times n}$, $m > n, \rank{A} = n$ and $\rank{\overline{A}} > \rank{A}$.
Since $\rank{\overline{A}} > \rank{A}$, $\vec{b}$ is not a linear combination of $\nobj{\vec{a}}{n}{,}$, meaning no $\vec{x}$ exists such that $A\vec{x} = \vec{b}$; that is, $\vec{b}$ is not in the column space of $A$.
Let $\hat{\vec{x}}$ denote the $\vec{x}$ that minimizes $\| \vec{b} - A\vec{x} \|^2$. Then:
+ $A\hat{\vec{x}}$ is in the column space of $A$
+ Since $\hat{\vec{x}}$ is defined to minimize $\|\vec{b} - A\hat{\vec{x}}\|^2$, $\vec{b} - A\hat{\vec{x}}$ has to be orthogonal to the column space of $A$
Thus, for every column vector among $\nobj{\vec{a}}{n}{,}$ we have $\trps{\vec{a}_i}(\vec{b} - A\hat{\vec{x}}) = 0$, or in matrix form:
$$
\begin{align}
\trps{A}(\vec{b} - A\hat{\vec{x}}) = 0 &\Rightarrow \trps{A}\vec{b} = \trps{A}A\hat{\vec{x}} \\[1ex]
&\Rightarrow \hat{\vec{x}} = \inv{(\trps{A}A)}\trps{A}\vec{b}
\end{align}
$$
> + Since $A$ has independent columns, we know $\trps{A}A$ is invertible.
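The normal-equation result can be verified numerically. The sketch below (the random $A$ and $\vec{b}$ are purely illustrative, assuming NumPy) checks it against NumPy's least-squares solver and confirms the orthogonality of the residual:
```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n))  # tall matrix; full column rank almost surely
b = rng.standard_normal(m)

# x_hat = (A^T A)^{-1} A^T b via the normal equations
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Matches NumPy's least-squares solver
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_ref))  # True

# The residual is orthogonal to the column space of A
print(np.allclose(A.T @ (b - A @ x_hat), 0.0))  # True
```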
### (4) Using Lagrange multipliers with the constraint $A\vec{x} = \vec{b}$, find $\vec{x}$ that minimizes $\| \vec{x} \|^2$, <br> where $A \in \mathcal{R}^{m \times n}$, $m < n$, and $\rank{A} = m$.
Since there are $m$ different linear equations as constraints, we let
+ $\nobj{\lambda}{m}{, }$ be the Lagrange multipliers
+ $f(\nobj{x}{n}{,}) = \|\vec{x}\|^2 = \nobjsuf{x}{n}{^2}{+ }$
+ The constraints are $g_i(\nobj{x}{n}{,}) = a_{i, 1}x_1 + a_{i, 2}x_2 + \ldots + a_{i, n}x_n - b_i = 0$ for $i = 1$ to $m$
And we have the equation:
$$
\begin{align}
&\color{transparent}{\Rightarrow} \nabla f(\nobj{x}{n}{,}) = \displaystyle \sum^{m}_{i = 1}\lambda_i\nabla g_i(\nobj{x}{n}{,}) \\[2ex]
&\Rightarrow \begin{cases}
2x_1 &= \lambda_1a_{1, 1} + \lambda_2a_{2, 1} + \ldots + \lambda_ma_{m, 1} \\
2x_2 &= \lambda_1a_{1, 2} + \lambda_2a_{2, 2} + \ldots + \lambda_ma_{m, 2} \\
&\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\vdots \\
2x_n &= \lambda_1a_{1, n} + \lambda_2a_{2, n} + \ldots + \lambda_ma_{m, n}
\end{cases}, \;\;\text{ Let } \vec{\lambda} = \bmat{\lambda_1 \\ \lambda_2 \\ \vdots \\ \lambda_m} \\[2ex]
&\Rightarrow 2\vec{x} = \trps{A}\vec{\lambda} \Rightarrow \vec{x} = \frac{\trps{A}\vec{\lambda}}{2}
\end{align}
$$
Substituting into $A\vec{x} = \vec{b}$, we get:
$$
\begin{align}
&\color{transparent}{\Rightarrow} \frac{1}{2} A\trps{A}\vec{\lambda} = \vec{b} \\[1ex]
&\Rightarrow A\trps{A}\vec{\lambda} = 2\vec{b} \\[1ex]
&\Rightarrow \vec{\lambda} = 2\inv{(A\trps{A})}\vec{b}
\end{align}
$$
> + Since $\trps{A}$ has independent columns, we know $A\trps{A}$ is invertible.
Substituting back into $2\vec{x} = \trps{A}\vec{\lambda}$, we obtain:
$$
2\vec{x} = \trps{A}\vec{\lambda} = \trps{A}2\inv{(A\trps{A})}\vec{b} \\
\Rightarrow \vec{x} = \trps{A}\inv{(A\trps{A})}\vec{b}
$$
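Similarly, the minimum-norm solution can be checked numerically (an illustrative random system, assuming NumPy; `np.linalg.lstsq` also returns the minimum-norm solution for an underdetermined system):
```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 6
A = rng.standard_normal((m, n))  # wide matrix; full row rank almost surely
b = rng.standard_normal(m)

# x = A^T (A A^T)^{-1} b
x = A.T @ np.linalg.solve(A @ A.T, b)

print(np.allclose(A @ x, b))  # True: the constraint A x = b holds

# lstsq returns the minimum-norm solution of an underdetermined system
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))  # True
```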
### (5) Show that there exists a unique $P \in \mathcal{R}^{n \times m}$ satisfying $\begin{cases} APA = A \\ PAP = P \\ \trps{(AP)} = AP \\ \trps{(PA)} = PA \end{cases}$ <br> where $A \in \mathcal{R}^{m \times n}$.
Take the compact singular value decomposition $A = U\Sigma\trps{V}$, where $U \in \mathcal{R}^{m \times r}$ and $V \in \mathcal{R}^{n \times r}$ have orthonormal columns (so $\trps{U}U = \trps{V}V = I$) and $\Sigma \in \mathcal{R}^{r \times r}$ is diagonal and invertible.
The first equation then suggests a candidate for $P$:
$$
\begin{align}
APA = A &\Rightarrow U\Sigma\trps{V}PU\Sigma\trps{V} = U\Sigma\trps{V} \\
&\Rightarrow \Sigma\trps{V}PU\Sigma = \Sigma \\
&\Rightarrow \trps{V}PU = \inv{\Sigma}\\
&\Rightarrow P = V\inv{\Sigma}\trps{U}
\end{align}
$$
One can check directly that $P = V\inv{\Sigma}\trps{U}$ satisfies all four equations, so such a $P$ exists.
For uniqueness, assume two matrices $X, Y$ both satisfy the equations. Then, we know:
$$
\begin{align}
X &= XAX &\tiny\text{ (From equation 2) } \\
&= X\trps{(AX)} &\tiny\text{ (From equation 3) } \\
&= X\trps{X}\trps{A} = X\trps{X}\trps{(AYA)} &\tiny\text{ (From equation 1) } \\
&= X\trps{X}\trps{A}\trps{(AY)} = X\trps{(AX)}\trps{(AY)} = (XAX)AY &\tiny\text{ (From equation 3) } \\
&= XAY = \trps{(XA)}Y &\tiny\text{ (From equation 4) } \\
&= \trps{(XA)}YAY &\tiny\text{ (From equation 2) } \\
&= \trps{A}\trps{X}\trps{(YA)}Y &\tiny\text{ (From equation 4) } \\
&= \trps{A}\trps{X}\trps{A}\trps{Y}Y = \trps{(XA)}\trps{A}\trps{Y}Y = \trps{[A(XA)]}\trps{Y}Y \\
&= \trps{(AXA)}\trps{Y}Y = \trps{A}\trps{Y}Y &\tiny\text{ (From equation 1) } \\
&= \trps{(YA)}Y = YAY &\tiny\text{ (From equation 4) } \\
&= Y &\tiny\text{ (From equation 2) } \\
\end{align}
$$
Thus $X = Y$, and such a $P$ must be unique.
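Both existence and uniqueness can be illustrated numerically: the candidate $P = V\inv{\Sigma}\trps{U}$ built from the compact SVD satisfies all four equations and coincides with NumPy's pseudoinverse (a sketch with an illustrative full-rank random matrix):
```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 3
A = rng.standard_normal((m, n))  # full rank almost surely, so Sigma is invertible

# Candidate P = V Sigma^{-1} U^T from the compact SVD A = U Sigma V^T
U, s, Vt = np.linalg.svd(A, full_matrices=False)
P = Vt.T @ np.diag(1.0 / s) @ U.T

# Check the four defining equations
print(np.allclose(A @ P @ A, A))      # A P A = A
print(np.allclose(P @ A @ P, P))      # P A P = P
print(np.allclose((A @ P).T, A @ P))  # (AP)^T = AP
print(np.allclose((P @ A).T, P @ A))  # (PA)^T = PA

# Uniqueness: it agrees with NumPy's Moore-Penrose pseudoinverse
print(np.allclose(P, np.linalg.pinv(A)))  # True
```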
### (6) Show that both $\vec{x}$ described in (3) and (4) satisfy $\vec{x} = P\vec{b}$.
From equations 1 and 3, we know that
$$
A = APA = \trps{(AP)}A = \trps{P}\trps{A}A
$$
If $A$, as in question (3), has independent columns, then $\trps{A}A$ is invertible, so we further get:
$$
\begin{align}
\trps{P}\trps{A}A = A &\Rightarrow \trps{P} = A\inv{(\trps{A}A)} \\
&\Rightarrow P = \trps{\big[ A\inv{(\trps{A}A)} \big]} = \trps{(\inv{(\trps{A}A)})}\trps{A}
\end{align}
$$
Since for any matrix $A$, $\trps{A}A$ is symmetric, we know $\inv{(\trps{A}A)}$ is also symmetric, thus:
$$
P = \inv{(\trps{A}A)}\trps{A}
$$
With the result in (3) where $\vec{x} = \inv{(\trps{A}A)}\trps{A}\vec{b}$, we can conclude $\vec{x} = P\vec{b}$.
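A short numerical check of this identity (illustrative random tall matrix, assuming NumPy):
```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 6, 3
A = rng.standard_normal((m, n))  # independent columns, as in (3)
b = rng.standard_normal(m)

P = np.linalg.pinv(A)
x_hat = np.linalg.solve(A.T @ A, A.T @ b)  # result from (3)

print(np.allclose(P, np.linalg.solve(A.T @ A, A.T)))  # P = (A^T A)^{-1} A^T
print(np.allclose(P @ b, x_hat))                      # x = P b
```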
----
From equations 1 and 4, we know that
$$
A = APA = A\trps{(PA)} = A\trps{A}\trps{P}
$$
If $A$, as in question (4), has independent rows (i.e., $\trps{A}$ has independent columns), then $A\trps{A}$ is invertible, so we further get:
$$
\begin{align}
A\trps{A}\trps{P} = A &\Rightarrow \trps{P} = \inv{(A\trps{A})}A \\
&\Rightarrow P = \trps{\big[ \inv{(A\trps{A})}A \big]} = \trps{A}\trps{\big[ \inv{(A\trps{A})} \big]}
\end{align}
$$
Since for any matrix $A$, $A\trps{A}$ is symmetric, we know $\inv{(A\trps{A})}$ is also symmetric, thus:
$$
P = \trps{A}\inv{(A\trps{A})}
$$
With the result in (4) where $\vec{x} = \trps{A}\inv{(A\trps{A})}\vec{b}$, we can again conclude $\vec{x} = P\vec{b}$.
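And the corresponding check for the underdetermined case (illustrative random wide matrix, assuming NumPy):
```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 3, 6
A = rng.standard_normal((m, n))  # independent rows, as in (4)
b = rng.standard_normal(m)

P = np.linalg.pinv(A)
x_mn = A.T @ np.linalg.solve(A @ A.T, b)  # result from (4)

print(np.allclose(P, A.T @ np.linalg.inv(A @ A.T)))  # P = A^T (A A^T)^{-1}
print(np.allclose(P @ b, x_mn))                      # x = P b
```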