## $\ell_p$ hardness
Assume $p > 4$ is even and let $k$ be a power of $2$. Suppose we sample $m = ck$ columns, where $c\epsilon = \beta$ for some large $\beta$. Consider the matrix $A$ defined below,
$\begin{align*}A = \begin{bmatrix} \alpha H_1 & \alpha H_2 & \dots & \alpha H_k & e_1 & \dots & e_1 & e_2 & \dots & e_2 & \dots & e_k & \dots & e_k\\ 0 & 0 & \dots & 0 & \sigma_1\delta & \dots & \sigma_1\delta & \sigma_2\delta & \dots & \sigma_2\delta & \dots & \sigma_k\delta & \dots & \sigma_k\delta\end{bmatrix}\end{align*}$
where $H_i \in \mathbb{R}^k$ is the $i$th column of the normalized Hadamard matrix, so that $|H_{ij}| = \frac{1}{\sqrt{k}}$, $e_i$ is the $i$th standard basis vector, $\alpha = \frac{k^{1/2-1/p}c^{1/p}(1+\delta^p)^{1/p}}{(4c-1)^{1/p}} \ge \frac{k^{1/2-1/p}}{4^{1/p}}$, $\delta = \frac{6^{1/p}\beta^{1/p}}{k^{1/2-2/p}}$ and $\sigma_i =\begin{cases} +1 & w.p. \; 1/2\\ -1 & w.p. \; 1/2\end{cases}$. Each $e_i$ column is repeated $c$ times. Let $\beta \ll \frac{k^{p/2-2}}{6}$; then $\delta^p = \frac{6\beta}{k^{p/2-2}} \ll 1$, so in particular $\delta \ll 1$.
Writing $\|A\|_{F_p}^p = \sum_{i,j}|A_{ij}|^p$ for the entry-wise $p$-norm, we can see that,
$\|A\|_{F_p}^p = \frac{\alpha^p}{k^{p/2}}k^2+ck + c\delta^p k = \frac{c(1+\delta^p)}{4c-1} k+c(1+\delta^p) k = c(1+\delta^p) k\, \frac{4c}{4c-1} = 4ck\, \frac{\alpha^p k}{k^{p/2}}$
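As a sanity check, this closed form can be verified numerically by assembling $A$ explicitly; the parameter values below ($k$, $c$, $p$, $\delta$) are arbitrary illustrative choices, not ones mandated by the argument.

```python
import numpy as np

rng = np.random.default_rng(0)
k, c, p = 8, 3, 6.0                      # k a power of 2; illustrative values
delta = 0.1
alpha = (k**(0.5 - 1/p) * c**(1/p) * (1 + delta**p)**(1/p)) / (4*c - 1)**(1/p)
sigma = rng.choice([-1.0, 1.0], size=k)

# Sylvester construction of a k x k Hadamard matrix, scaled so |H_ij| = 1/sqrt(k)
H = np.array([[1.0]])
while H.shape[0] < k:
    H = np.block([[H, H], [H, -H]])
H /= np.sqrt(k)

# Assemble A: k Hadamard columns, then c copies of each e_i with sigma_i*delta below
top = np.hstack([alpha * H] + [np.tile(np.eye(k)[:, [i]], c) for i in range(k)])
bottom = np.concatenate([np.zeros(k), np.repeat(sigma * delta, c)])
A = np.vstack([top, bottom])

entrywise = np.sum(np.abs(A) ** p)                 # ||A||_{F_p}^p computed directly
closed_form = 4 * c * k * alpha**p * k / k**(p/2)  # the closed form above
print(entrywise, closed_form)
```

The two printed values agree, confirming that the Hadamard block contributes $\frac{\alpha^p}{k^{p/2}}k^2$ and the identity block contributes $ck + c\delta^p k$.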
For each Hadamard column we then have $\frac{\|\alpha H_i\|_p^p}{\|A\|_{F_p}^p} = \frac{1}{4ck}$, so each of the $ck$ draws selects column $\alpha H_i$ with probability $\frac{1}{4ck}$. Given this, we can bound the probability that a given column $H_i$ is never observed. Let $E_i$ be the event that we do not observe column $H_i$; then,
$\Pr(E_i) = \left(1-\frac{1}{4ck}\right)^{ck} \ge 1-ck\cdot\frac{1}{4ck} = 1-\frac{1}{4}$ by Bernoulli's inequality.
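A quick numeric check of this bound (the values of $k$ and $c$ are illustrative):

```python
k, c = 64, 4                          # illustrative values
pr_miss = (1 - 1/(4*c*k)) ** (c*k)    # Pr(E_i): H_i missed by all ck draws
print(pr_miss)                        # close to e^{-1/4} ~ 0.779, and >= 3/4
```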
#### Number of sampled Hadamard columns is at most $k/2$
Let $Z$ be the number of sampled columns from $H_i$ for $i \in [k]$.
$Z = \sum_{i=1}^k z_i$
where $z_i$ is 1 if $H_i$ is selected and $0$ otherwise.
Then $\Pr(z_i = 1) = 1-\left(1-\frac{1}{4ck}\right)^{ck} \le 1-\left(1-\frac{1}{4}\right) = 1/4$ and therefore,
$E(Z) = k\left(1-\left(1-\frac{1}{4ck}\right)^{ck}\right) \le k/4$
Using Markov's inequality, $\Pr(Z \ge 2E(Z)) \le 1/2$, so
$Z \le 2E(Z) \le k/2$ with probability $\ge 1/2$. Therefore, with constant probability we observe at most half of the Hadamard columns.
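The bound $E(Z) \le k/4$ can also be checked by simulating the column-sampling process, with each Hadamard column carrying mass exactly $\frac{1}{4ck}$ as derived above; the parameter and trial counts below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
k, c = 64, 4
m = c * k
# Each Hadamard column has sampling mass 1/(4ck); the remaining (4c-1)/(4c)
# of the mass is spread uniformly over the ck identity columns.
probs = np.concatenate([np.full(k, 1/(4*c*k)),
                        np.full(c*k, (4*c - 1)/(4*c*c*k))])
trials = 2000
Z = np.empty(trials)
for t in range(trials):
    picked = rng.choice(k + c*k, size=m, replace=True, p=probs)
    Z[t] = np.count_nonzero(np.bincount(picked, minlength=k)[:k])  # distinct H_i hit
print(Z.mean())   # concentrates near k(1 - e^{-1/4}) ~ 14.2, below k/4 = 16
```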
#### Lower bounding the expected error
Let $S$ be the set of unsampled Hadamard columns. For each $H \in S$, write $H_i$ for the $i$th entry of $H$ and consider reconstructing $H$'s column of $A$ from the $e_i$ columns:
$\begin{align*}\begin{bmatrix}\alpha H\\0\end{bmatrix} \approx \sum_{i=1}^k \frac{\alpha}{\sqrt{k}}\,sgn(H_i)\begin{bmatrix}e_i\\ \sigma_i\delta\end{bmatrix} = \begin{bmatrix}\alpha H\\ \sum_{i=1}^k\frac{\alpha\delta}{\sqrt{k}}sgn(H_i)\sigma_i\end{bmatrix}\end{align*}$
where the top $k$ coordinates match exactly since $H_i = sgn(H_i)/\sqrt{k}$.
The error incurred in this reconstruction is therefore $L_H = \left|\sum_{i=1}^k\frac{\alpha\delta}{\sqrt{k}}sgn(H_i)\sigma_i\right|^p = \frac{\alpha^p \delta^p}{k^{p/2}}\left|\sum_{i=1}^k sgn(H_i)\sigma_i\right|^p$
And therefore, the total error is,
$L = \sum_{H \in S} L_H = \frac{\alpha^p \delta^p}{k^{p/2}}\sum_{H \in S}|\sum_{i=1}^k sgn(H_i)\sigma_i|^p$
Using the Khintchine inequality (for $p \ge 2$, $E\left|\sum_i \sigma_i a_i\right|^p \ge \left(\sum_i a_i^2\right)^{p/2}$), we can see that,
$E(L) = \frac{\alpha^p \delta^p}{k^{p/2}}\sum_{H \in S}E\left|\sum_{i=1}^k sgn(H_i)\sigma_i\right|^p \ge \frac{\alpha^p \delta^p}{k^{p/2}}\sum_{H \in S}\left(\sum_{i=1}^k sgn(H_i)^2\right)^{p/2} = \sum_{H \in S}\frac{\alpha^p \delta^p}{k^{p/2}}k^{p/2} = |S|\,\alpha^p \delta^p$
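The Khintchine step can be checked empirically: since each $sgn(H_i)\sigma_i$ is again a uniform random sign, the $p$th moment of the sum is at least $k^{p/2}$ (and in fact close to the corresponding Gaussian moment). The values of $k$, $p$, and the sample count below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
k, p = 64, 6.0
# sgn(H_i) * sigma_i is a vector of i.i.d. uniform random signs
signs = rng.choice([-1.0, 1.0], size=(100_000, k))
moment = np.mean(np.abs(signs.sum(axis=1)) ** p)  # E|sum_i sgn(H_i) sigma_i|^p
print(moment / k**(p/2))  # ratio >= 1; near E|N(0,1)|^6 = 15 for large k
```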
Now, given our choice of $\alpha,\delta$ and assuming $k \ge 4$, we get that,
$\alpha^p \delta^p \ge \frac{k^{p/2-1}}{4}\cdot\frac{6\beta}{k^{p/2-2}} = \frac{3k}{2}\beta \ge 6c\epsilon = 6\beta$
Since $\frac{\alpha^p k}{k^{p/2}} = \frac{c(1+\delta^p)}{4c-1} < c$ and $c\delta^p < c$ (as $\delta \ll 1$), this implies
$\alpha^p \delta^p > 2\epsilon \left(\frac{\alpha^p}{k^{p/2}}k+c + c\delta^p \right)$
$\alpha^p \delta^p \frac{k}{2} > \epsilon \left(\frac{\alpha^p}{k^{p/2}}k^2+ck + c\delta^pk \right) = \epsilon \|A\|_{F_p}^p$
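Plugging the definitions of $\alpha$ and $\delta$ into this inequality shows it holds with room to spare: after cancellation, the ratio of the two sides works out to exactly $3k/4$. The concrete values of $k$, $p$, $\epsilon$, $\beta$ below are illustrative.

```python
k, p = 64, 8.0                        # k a power of 2, p > 4 even
eps, beta = 0.05, 1.0
c = beta / eps                        # m = ck columns sampled
delta_p = 6 * beta / k**(p/2 - 2)                       # delta^p
alpha_p = k**(p/2 - 1) * c * (1 + delta_p) / (4*c - 1)  # alpha^p
lhs = alpha_p * delta_p * k / 2
rhs = eps * 4 * c**2 * k * (1 + delta_p) / (4*c - 1)    # eps * ||A||_{F_p}^p
print(lhs > rhs, lhs / rhs)           # True, ratio 3k/4 = 48
```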
Therefore, with probability $\ge 1/2$ we have $|S| \ge k/2$, and hence
$E(L) \ge |S|\,\alpha^p \delta^p \ge \frac{k}{2}\alpha^p \delta^p > \epsilon \|A\|_{F_p}^p$
Therefore, if we only sample $m = \frac{\beta}{\epsilon} k \ll \frac{k^{p/2-1}}{6\epsilon}$ columns, then with constant probability we cannot achieve an additive error approximation of $\epsilon \|A\|_{F_p}^p$.