# Max-cut
###### tags: `寒假課程`
Let $G = G(V,E)$, $|V|=n$
Goal : Find $S \subset V$ such that $|E(S, \bar S)|=|\{\{i,j\} \in E : i \in S, j \in \bar S\}|$ is maximized
* Max-cut is NP-hard
* Min-cut has a polynomial-time algorithm
## Thm1
> This is here to make the connection to SoS; otherwise the result of Thm 2 alone would already suffice
Let $G = G(V,E), \ opt(G) = maxcut(G), \ f_G(x)=\frac{1}{4} \sum\limits_{\{i,j\} \in E}{(x_i - x_j)^2}$
$\max\limits_{x \in \{-1, 1\}^n}{f_G(x)} = maxcut(G)=opt(G)$
$\Rightarrow \forall \ G$, $\frac{opt(G)}{0.878}-f_G(x)$ has a degree-2 SoS certificate, i.e. a certificate that $f_G \leq \frac{opt(G)}{0.878}$
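As a quick sanity check of why $\max_x f_G(x) = maxcut(G)$ (an illustrative example, not from the original notes): each cut edge contributes $\frac{1}{4}(\pm 2)^2 = 1$ to $f_G$ and each uncut edge contributes $0$. For the triangle $K_3$ with $x = (1, 1, -1)$, i.e. $S = \{1, 2\}$:
$$
f_G(x) = \tfrac{1}{4}\left[(1-1)^2 + (1-(-1))^2 + (1-(-1))^2\right] = 2 = |E(S, \bar S)|
$$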
### Proof of Thm 1 (by Thm 2)
Suppose $\frac{opt(G)}{0.878}-f_G$ is not SoS
$\Rightarrow \exists$ a degree-2 pseudo-distribution $\mu$ s.t. $\tilde E_{\mu}(\frac{opt(G)}{0.878} - f_G) < 0$, i.e. $\tilde E_{\mu}f_G > \frac{opt(G)}{0.878}$
$\Rightarrow \exists$ an actual distribution $\mu'$ s.t. $E_{\mu'}f_G \geq 0.878 \ \tilde E_{\mu}f_G > opt(G)$ (by Thm 2)
$\Rightarrow E_{\mu'}f_G > opt(G)$, a contradiction, since $E_{\mu'}f_G \leq \max\limits_{x}f_G(x) = opt(G)$ for any actual distribution
#### Thm 2 $\equiv$ the rounding step
> Pseudo-distribution (a special SDP): we can find the SoS optimum with it, but it is not an actual answer and we still need to recover a true value, so the conversion has to be efficient and we have to prove a bound on the loss
Goal : Pseudo Dist $\rightarrow$ Dist
Using the pseudo-distribution $\mu$, we have $\tilde E_{\mu}f_G(x) = opt_{SoS}$
$\Rightarrow$ We can find $\mu', \ E_{\mu'}f_G(x) \geq 0.878 \ opt_{SoS}$
Therefore, if the map $\mu \rightarrow \mu'$ can be computed efficiently, this gives us an algorithm approximating Max-cut
## Thm2
Let $\mu$ be a deg = 2 Pseudo Dist on $\{-1,1\}^n$
$\Rightarrow$ there is Actual Dist $\mu' \ \ni E_{\mu'}f_G(x) \geq 0.878 \cdot \tilde E_{\mu}f_G(x)$
### Proof of Thm 2
Let $\mu$ be the pseudo-distribution over $\{-1,1\}^n$ and $\mu'$ the actual distribution we construct
$\Rightarrow$ we want $\mu \rightarrow \mu'$ s.t. $E_{\mu'}(1,x)^{\otimes 2} = \tilde E_{\mu}(1,x)^{\otimes 2}$
Since we cannot convert directly, we need a rounding step
## Lemma (Gaussian Sampling)
For every degree-2 pseudo-distribution $\mu$
$\Rightarrow \exists$ an actual distribution over $\mathbb{R}^n$ with the same 1st and 2nd moments
### proof
* 1st moment:$\tilde E_{\mu}x$
* 2nd moment:$\tilde E_{\mu}xx^T \geq 0 \Rightarrow \ \tilde E_{\mu}(x-\tilde E_{\mu}x)(x-\tilde E_{\mu}x)^T \geq 0$
$\Rightarrow \tilde E_{\mu}1 = 1, \ \tilde E_{\mu}(1,x)(1,x)^T \geq 0$
* claim:
$\forall$ vector $m$ and $\forall$ PSD matrix $\Sigma$ ( e.g. $\tilde E_{\mu}(1,x)(1,x)^T$ )
$\Rightarrow \exists$ a Gaussian distribution with mean $m$ and covariance $\Sigma$
> Any principal submatrix of a PSD matrix is still PSD, and every PSD matrix is the covariance of some distribution (a Gaussian)
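A minimal numpy sketch of this lemma (the pseudo-moment values `m`, `M` below are made up for illustration): build the Gaussian from the pseudo-moments and check that its 1st and 2nd moments match.

```python
import numpy as np

# Hypothetical degree-2 pseudo-moments for n = 2 (illustrative values only).
m = np.array([0.2, -0.1])                  # first moments  E~[x]
M = np.array([[1.0, 0.5],                  # second moments E~[x x^T]
              [0.5, 1.0]])

# Covariance of the matching Gaussian: E~[x x^T] - E~[x] E~[x]^T (PSD by the argument above).
cov = M - np.outer(m, m)

# Sample g ~ N(m, cov); then E[g] = m and E[g g^T] = cov + m m^T = M.
rng = np.random.default_rng(0)
g = rng.multivariate_normal(mean=m, cov=cov, size=200_000)

print("empirical 1st moment:", g.mean(axis=0))         # close to m
print("empirical 2nd moment:\n", g.T @ g / len(g))     # close to M
```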
:::info
#### Supplementary explanation: Pseudo Dist
n = 3, d = 2
$\tilde E_{\mu}x_1 = ?$
$\tilde E_{\mu}x_2 = ?$
$\tilde E_{\mu}x_3 = ?$
$\tilde E_{\mu}{x_1}^2 = ?$
$\tilde E_{\mu}{x_2}^2 = ?$
$\tilde E_{\mu}{x_3}^2 = ?$
$\tilde E_{\mu}{x_1x_2} = ?$
$\tilde E_{\mu}{x_1x_3} = ?$
$\tilde E_{\mu}{x_2x_3} = ?$
$\Rightarrow \tilde E_{\mu}{\phi(x)}^2 \geq 0$ if $\deg(\phi) \leq 1$
> This is not a probability distribution; it only specifies the values of these monomials
:::
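To make the box above concrete, a small numpy sketch with made-up values (on $\{-1,1\}^3$ the squares $\tilde E_{\mu}{x_i}^2$ are $1$): the condition $\tilde E_{\mu}{\phi(x)}^2 \geq 0$ for all $\deg(\phi) \leq 1$ is exactly PSD-ness of the $4 \times 4$ moment matrix $\tilde E_{\mu}(1,x)(1,x)^T$.

```python
import numpy as np

# Made-up degree-2 pseudo-moments for n = 3: E~[x_i] = 0, E~[x_i x_j] = -1/2 for i != j.
Ex  = np.array([0.0, 0.0, 0.0])           # E~[x_1], E~[x_2], E~[x_3]
Exx = np.array([[ 1.0, -0.5, -0.5],       # E~[x_i x_j]; the diagonal is E~[x_i^2] = 1
                [-0.5,  1.0, -0.5],
                [-0.5, -0.5,  1.0]])

# Moment matrix of (1, x), rows/columns indexed by {1, x_1, x_2, x_3}.
Mmat = np.block([[np.ones((1, 1)), Ex[None, :]],
                 [Ex[:, None],     Exx]])

# For phi(x) = c_0 + c_1 x_1 + c_2 x_2 + c_3 x_3 we have E~[phi(x)^2] = c^T Mmat c,
# so the condition in the box holds iff Mmat is PSD.
print("eigenvalues:", np.linalg.eigvalsh(Mmat))   # all >= 0 here, so this is a valid pseudo-distribution
```

On the triangle this choice gives pseudo cut value $\frac{1}{4}\sum_{\{i,j\}}(2 - 2\cdot(-\tfrac{1}{2})) = 2.25$, more than any actual cut (at most $2$); that is exactly the sense in which it is only a pseudo-distribution.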
Suppose $\mu$ is an actual distribution
$\Rightarrow$ draw $x \sim \mu$ and output $x$ or $-x$ uniformly at random (symmetrization)
> sampling the sign uniformly makes the mean 0
Q: can $\tilde E_{\mu}xx^T$ be converted into $\tilde E_{\mu}f_G(x)$?
Suppose the pseudo-distribution has mean = 0 and 2nd moments $= \tilde E_{\mu}xx^T$
$\Rightarrow$
$$
\begin{aligned}
f_G(x) &= \frac{1}{4} \sum\limits_{\{i,j\} \in E} (x_i - x_j)^2 \\
& = \frac{1}{4} \sum\limits_{\{i,j\} \in E}({x_i}^2 + {x_j}^2 - 2x_ix_j) \\
& =\frac{1}{4} \sum\limits_{\{i,j\} \in E}(2 - 2x_ix_j)
\end{aligned}
$$
$\Rightarrow \tilde E_{\mu}f_G(x) = \frac{1}{4}\sum\limits_{\{i,j\} \in E}(2 - 2 \tilde E_{\mu}x_ix_j)$
WLOG: assume $\tilde E_{\mu}x = 0$
1. $g \sim N(0, \ \tilde E_{\mu}xx^T)$
2. $\hat x_i = sign(g_i), \ \hat x \in \{-1,1\}^n$
Let $\mu'$ be the distribution of $\hat x$ (a code sketch of these two steps follows below)
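A minimal numpy sketch of steps 1-2 (the triangle graph and the pseudo second-moment matrix are illustrative inputs, not from the notes):

```python
import numpy as np

# Illustrative input: triangle graph and pseudo second moments E~[x x^T] (mean 0, as in the WLOG step).
edges = [(0, 1), (1, 2), (0, 2)]
M = np.array([[ 1.0, -0.4, -0.4],
              [-0.4,  1.0, -0.4],
              [-0.4, -0.4,  1.0]])

# Pseudo-expectation of f_G: (1/4) * sum over edges of (2 - 2 * E~[x_i x_j]).
pseudo_val = sum(2 - 2 * M[i, j] for i, j in edges) / 4

# Step 1: g ~ N(0, E~[x x^T]).  Step 2: x_hat = sign(g).
rng = np.random.default_rng(0)
g = rng.multivariate_normal(np.zeros(3), M, size=100_000)
x_hat = np.sign(g)

# Monte Carlo estimate of E_{mu'} f_G: f_G(x_hat) counts the edges whose endpoints got different signs.
rounded_val = sum((x_hat[:, i] != x_hat[:, j]) for i, j in edges).mean()
print(pseudo_val, rounded_val, rounded_val / pseudo_val)   # the ratio stays above 0.878
```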
* claim 1 : $E_{\mu'}f_G \geq 0.878 \ \tilde E_{\mu}f_G$
$$
\begin{aligned}
& E_{\mu'}f_G(\hat x) = \frac{1}{4} \sum\limits_{\{i,j\} \in E} E_{\mu'}(\hat x_i - \hat x_j)^2 = \sum\limits_{\{i,j\} \in E} Pr[\hat x_i \neq \hat x_j] \\
& \tilde E_{\mu}f_G(x) = \frac{1}{4} \sum\limits_{\{i,j\} \in E} \tilde E_{\mu}(x_i - x_j)^2
\end{aligned}
$$
* claim 2 : Sheppard Lemma
$$
\begin{aligned}
& Pr[sign(g_i) \neq sign(g_j)] = \frac{\arccos(\rho)}{\pi} \ where \ \rho =\tilde E_{\mu}x_ix_j \\
& \Rightarrow \frac{Pr[\hat x_i \neq \hat x_j]}{\frac{1}{4} \tilde E_{\mu}(x_i - x_j)^2} = \frac{\arccos(\rho)}{\pi \cdot \frac{1-\rho}{2}} = \frac{2 \arccos(\rho)}{\pi \ (1 - \rho)} \geq 0.878
\end{aligned}
$$
> Many of these arguments run through Gaussians
> Gaussians come with many ready-made tools; the point is for students to make this connection so they can use it later
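A quick numeric check of where the constant $0.878$ comes from (a sketch; the grid resolution is arbitrary):

```python
import numpy as np

# min over rho in [-1, 1) of 2*arccos(rho) / (pi*(1 - rho)) is the 0.878... constant.
rho = np.linspace(-1, 1, 200_001)[:-1]             # drop rho = 1 to avoid 0/0
ratio = 2 * np.arccos(rho) / (np.pi * (1 - rho))
print(ratio.min(), rho[ratio.argmin()])            # about 0.8786, attained near rho = -0.69
```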
## Quadratic program
* OPT:
$\max \sum\limits_{\{i,j\} \in E}\frac{(x_i - x_j)^2}{4} = \max \sum\limits_{\{i,j\} \in E}\frac{1 - x_i x_j}{2} \ s.t. \ {x_i}^2 = 1 \ \forall \ i \ (i.e. \ x_i \in \{-1, 1\})$
* SDP:
$\max \sum\limits_{\{i,j\} \in E}\frac{1 - z_{ij}}{2} \ s.t. \ Z \succeq 0 \ (PSD), \ z_{ii} = 1 \ \forall \ i \ (unit \ vectors)$
> The constraint says all diagonal entries are 1
> Usually all we need is a feasible solution (a valid solution, not necessarily the optimum)
$\Rightarrow SDP \geq OPT$
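A minimal code sketch of this SDP, assuming the cvxpy package is available (the 5-cycle is just an example graph):

```python
import cvxpy as cp

# Example graph: the 5-cycle, whose max cut (OPT) is 4.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]

# Z plays the role of the matrix (z_ij): PSD with unit diagonal.
Z = cp.Variable((n, n), symmetric=True)
constraints = [Z >> 0, cp.diag(Z) == 1]
objective = cp.Maximize(sum((1 - Z[i, j]) / 2 for i, j in edges))

sdp_value = cp.Problem(objective, constraints).solve()
print(sdp_value)   # about 4.52 for the 5-cycle, illustrating SDP >= OPT
```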
Ex:
$Let \ x = (x_1, x_2, x_3)^T$
$\Rightarrow$
$$
xx^T =
\begin{pmatrix}
{x_{1}}^2 & x_{1}x_{2} & x_{1}x_{3}\\
x_{1}x_{2} & {x_{2}}^2 & x_{2}x_{3}\\
x_{1}x_{3} & x_{2}x_{3} & {x_{3}}^2
\end{pmatrix}
$$
> The diagonal entries are all 1, because we assumed ${x_i}^2 = 1$
> So the SDP feasible set is larger than that of the quadratic program; this is called a relaxation
Let $x$ be a feasible solution to the quadratic program
$\Rightarrow$ then $Z = xx^T$ is a feasible solution to the SDP
> A feasible solution of the quadratic program is not necessarily the optimum
> Even so, forming $xx^T$ from any such $x$ gives a matrix that satisfies the SDP constraints
Let $SDP^*$ : the cut value obtained by rounding the SDP solution into $y \in \{-1, 1\}^n$
To prove: $E[SDP^*] \geq 0.878 \ SDP \geq 0.878 \ OPT$
if $Z \succeq 0$ and $z_{ii} = 1, \ 1 \leq i \leq n$
$\Rightarrow Z = U^TU \ where \ U = [u_1, u_2,...,u_n], \ u_i \in S^{n-1} \ (unit \ vector)$
* rounding (a code sketch follows below)
1. We now have $n$ unit vectors $u_1, u_2,...,u_n$
2. Pick a uniformly random unit vector $r \in S^{n-1}$
$$
y_i =
\begin{cases}
1, & \text{if $r^T \ u_i \geq 0$} \\
-1, & \text{otherwise}
\end{cases}
$$
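A numpy sketch of this rounding step, assuming we already hold a feasible $Z$ (PSD, unit diagonal); the matrix below is illustrative:

```python
import numpy as np

# Illustrative feasible SDP solution for the triangle: z_ij = -1/2 off the diagonal.
Z = np.array([[ 1.0, -0.5, -0.5],
              [-0.5,  1.0, -0.5],
              [-0.5, -0.5,  1.0]])

# Factor Z = U^T U via the eigendecomposition; column i of U is the unit vector u_i.
w, V = np.linalg.eigh(Z)
U = (V * np.sqrt(np.clip(w, 0, None))).T

# Random hyperplane: draw a random direction r and set y_i = sign(r^T u_i).
# (A Gaussian r works: only its direction matters, and that direction is uniform on the sphere.)
rng = np.random.default_rng(0)
r = rng.standard_normal(len(Z))
y = np.sign(U.T @ r)
print(y)   # a cut in {-1, 1}^3
```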
So now
$SDP = \sum\limits_{\{i,j\} \in E}\frac{1-z_{ij}}{2}=\sum\limits_{\{i,j\} \in E}\frac{1-u_i^Tu_j}{2}$
$SDP^* = \sum\limits_{\{i,j\} \in E}\frac{1- y_{i}y_j}{2}$
$\Rightarrow$ we want to show $E \ [SDP^*] \geq 0.878 \ SDP$
$$
\because
y_iy_j =
\begin{cases}
1, & \text{if $y_i = y_j = 1$ or $y_i = y_j = -1$} \\
-1, & \text{if $y_i \neq y_j$, i.e. $(1,-1)$ or $(-1,1)$}
\end{cases}
$$
$\therefore$ We only care about the case $y_iy_j = -1$; when $y_iy_j = 1$ the edge contributes $0$ to $SDP^*$
$\Rightarrow \ Pr \ \{y_iy_j =-1\} = \frac{2\alpha_{ij}}{2 \pi} = \frac{\alpha_{ij}}{\pi}$, where $\alpha_{ij}$ is the angle between $u_i$ and $u_j$ (the hyperplane with normal $r$ separates $u_i$ and $u_j$ exactly when $r$ falls in one of two wedges of angle $\alpha_{ij}$)

$\Rightarrow$
$$
E \ [SDP^*]
= E \ [\sum\limits_{\{i,j\} \in E} \frac{1- y_{i}y_j}{2}]
= \sum\limits_{\{i,j\} \in E} Pr \ \{y_iy_j =-1\} = \sum\limits_{\{i,j\} \in E} \frac{\alpha_{ij}}{\pi}
$$
$$
\begin{aligned}
& \because u_i^Tu_j = |u_i| \ |u_j| \cos{\alpha_{ij}} = \cos{\alpha_{ij}} \\
& \therefore \alpha_{ij} = \arccos (u_i^Tu_j)
\end{aligned}
$$
$$\Rightarrow E \ [SDP^*]
= \sum\limits_{\{i,j\} \in E} \frac{\arccos \ (u_{i}^T u_{j})}{\pi}
\ \ \text{vs.} \ \
SDP = \sum\limits_{\{i,j\} \in E} \frac{1- u_{i}^Tu_j}{2}
$$
$\Rightarrow$
$$
\frac{2 \arccos \ (u_{i}^T u_{j})}{\pi \ (1- u_{i}^Tu_j)} \geq 0.878 \ \ \text{for every edge} \ \{i,j\}
\Rightarrow
E \ [SDP^*] \geq 0.878 \ SDP
$$
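Putting the two sketches above together, a rough end-to-end check of this inequality on the 5-cycle (again assuming cvxpy and numpy; the expectation $E[SDP^*]$ is estimated by Monte Carlo over many random hyperplanes):

```python
import cvxpy as cp
import numpy as np

n = 5
edges = [(i, (i + 1) % n) for i in range(n)]       # 5-cycle, OPT = 4

# Solve the SDP relaxation.
Z = cp.Variable((n, n), symmetric=True)
sdp = cp.Problem(cp.Maximize(sum((1 - Z[i, j]) / 2 for i, j in edges)),
                 [Z >> 0, cp.diag(Z) == 1]).solve()

# Factor Z = U^T U, then round with many random hyperplanes.
w, V = np.linalg.eigh(Z.value)
U = (V * np.sqrt(np.clip(w, 0, None))).T
rng = np.random.default_rng(0)
R = rng.standard_normal((10_000, n))               # one random direction per row
Y = np.sign(R @ U)                                 # Y[k, i] = sign(r_k^T u_i)
cut_vals = sum((1 - Y[:, i] * Y[:, j]) / 2 for i, j in edges)

print(sdp, cut_vals.mean(), cut_vals.mean() / sdp) # the ratio comes out above 0.878
```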