## Calculus
#### Squeeze Theorem
If $f(x)\le g(x) \le h(x)$ for all $x$ in an open interval containing $a$ (except possibly at $a$ itself), and $\lim \limits_{x \to a}f(x)=L=\lim \limits_{x\to a} h(x)$, then $\lim \limits_{x \to a} g(x)=L$.
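A minimal numerical sketch, assuming the classic example $g(x)=x^2\sin(1/x)$ squeezed between $-x^2$ and $x^2$ near $a=0$ (the example function is not from the notes):

```python
import math

# Assumed example: g(x) = x^2 * sin(1/x) satisfies -x^2 <= g(x) <= x^2 for x != 0.
# Both bounds tend to 0 as x -> 0, so the squeeze theorem forces g(x) -> 0 too.
def g(x):
    return x * x * math.sin(1 / x)

for x in [0.1, 0.01, 0.001]:
    assert -x * x <= g(x) <= x * x  # the squeeze holds at each sample point
```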
#### Rigorous Definition of a Limit
$\lim\limits_{x\to a}f(x)=L$ iff $\forall \varepsilon>0, \exists \delta>0$ s.t. $0<|x-a|<\delta \implies |f(x)-L|<\varepsilon$
#### Rigorous Definition of a Limit (Infinite Limit Version)
$\lim \limits_{x \to a}f(x)=\infty$ iff $\forall M>0, \exists \delta >0$ s.t. $0<|x-a|<\delta \implies f(x)>M$
#### Rigorous Definition of a Limit of a Multivariable Function
$\lim \limits_{(x, y) \to (a,b)}f(x,y)=l$ iff for all $\varepsilon>0$, there exists $\delta>0$ such that
whenever $0<\sqrt {(x-a)^2+(y-b)^2} <\delta$, $\fbox{$|f(x,y)-l|<\varepsilon$}$ holds.
:::info
In other words:
the limit of $f(x,y)$ as $(x,y)$ approaches $(a,b)$ along *every* path must equal $l$ ($\impliedby$ if the limit along even one path to $(a,b)$ differs from $l$, or two paths disagree, then $\lim \limits_{(x,y) \to (a,b)}f(x,y)$ does not exist).
:::
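To see path dependence concretely, here is a sketch with the standard counterexample $f(x,y)=\frac{xy}{x^2+y^2}$ (an assumed example, not from the notes): two straight-line paths into the origin give different values, so the two-variable limit at $(0,0)$ does not exist.

```python
# Assumed counterexample: f(x, y) = x*y / (x^2 + y^2) has no limit at (0, 0).
def f(x, y):
    return x * y / (x * x + y * y)

# Along the path y = x the value is constantly 1/2; along y = 0 it is constantly 0.
t = 1e-6
along_y_eq_x = f(t, t)    # -> 0.5
along_y_eq_0 = f(t, 0.0)  # -> 0.0
```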
#### Slant Asymptote
$y=mx+b$
$\lim\limits_{x \to \infty}\frac{f(x)}{x}=m$
$\lim\limits_{x \to \infty}f(x)-mx=b$
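These two limits can be estimated numerically; a sketch assuming the example $f(x)=\frac{2x^2+3x+1}{x}$, whose slant asymptote is $y=2x+3$ (function chosen only for illustration):

```python
# Assumed example f(x) = (2x^2 + 3x + 1)/x; its slant asymptote is y = 2x + 3.
def f(x):
    return (2 * x * x + 3 * x + 1) / x

x = 1e6
m_est = f(x) / x      # approximates lim f(x)/x       -> m = 2
b_est = f(x) - 2 * x  # lim f(x) - m*x, with slope m=2 -> b = 3
```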
#### Intermediate Value Theorem
Given $f(x)$ continuous on $[a,b]$: if $L$ lies strictly between $f(a)$ and $f(b)$, then there must $\exists c \in (a,b)$ s.t. $f(c)=L$.
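The IVT is what makes bisection root-finding work: a continuous function that changes sign on $[a,b]$ must hit $0$ somewhere inside. A sketch with the assumed example $f(x)=x^3-2$, whose root is $2^{1/3}$:

```python
# Bisection: repeatedly halve an interval on which f changes sign.
# The IVT guarantees a root stays inside the shrinking interval.
def bisect(f, a, b, tol=1e-12):
    assert f(a) * f(b) < 0  # IVT hypothesis: f changes sign on [a, b]
    while b - a > tol:
        mid = (a + b) / 2
        if f(a) * f(mid) <= 0:
            b = mid  # sign change is in the left half
        else:
            a = mid  # sign change is in the right half
    return (a + b) / 2

root = bisect(lambda x: x ** 3 - 2, 1.0, 2.0)  # assumed example f(x) = x^3 - 2
```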
#### Mean Value Theorem for Integrals
Let $f$ be continuous on $[a,b]$; then there is at least one point $c \in (a,b)$ satisfying:
$f(c)=\frac{1}{b-a}\int _a^bf(x)\,dx$
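A numerical check with the assumed example $f(x)=x^2$ on $[0,3]$: the average value is $\frac13\int_0^3 x^2\,dx = 3$, and it is attained at $c=\sqrt3\in(0,3)$.

```python
# Assumed example: f(x) = x^2 on [0, 3]; its average value is 3 = f(sqrt(3)).
def f(x):
    return x * x

a, b, n = 0.0, 3.0, 100_000
h = (b - a) / n
integral = sum(f(a + (i + 0.5) * h) for i in range(n)) * h  # midpoint rule
average = integral / (b - a)  # the value f(c) guaranteed by the theorem
c = average ** 0.5            # solving f(c) = average for this particular f
```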
#### Some Summation Formulas
1. $\sum \limits _{k=1} ^n k^2 = \frac{n(n+1)(2n+1)}{6}$
2. $\sum \limits _{k=1} ^n k^3 = \left[\frac{n(n+1)}{2}\right]^2$
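The closed forms can be spot-checked against brute-force sums:

```python
# Closed forms for sums of squares and cubes, checked against direct summation.
def sum_squares(n):
    return n * (n + 1) * (2 * n + 1) // 6

def sum_cubes(n):
    return (n * (n + 1) // 2) ** 2

for n in range(1, 50):
    assert sum_squares(n) == sum(k * k for k in range(1, n + 1))
    assert sum_cubes(n) == sum(k ** 3 for k in range(1, n + 1))
```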
#### Inverse Function Theorem and Properties
Let $f$ be one-to-one; then there exists $f^{-1}=g$ such that
$g[f(x)]=f[g(x)]=x$, and $g(x)$ has the following properties:
##### Properties:
1. $g'(x)=\frac{1}{f'[g(x)]}$
2. If $f$ is differentiable at $x=c$ and $f'(c)\neq 0$, then $f^{-1}$ is also differentiable at $f(c)$ (i.e., the left and right limits of its difference quotient agree there).
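Property 1 can be checked numerically; a sketch assuming $f=\exp$, so $g=f^{-1}=\ln$ and $g'(x)=1/x$ (the example pair is not from the notes):

```python
import math

# Check g'(x) = 1 / f'(g(x)) with the assumed pair f = exp, g = f^{-1} = log.
def num_deriv(fn, x, h=1e-6):
    return (fn(x + h) - fn(x - h)) / (2 * h)  # central difference

x = 5.0
lhs = num_deriv(math.log, x)                # g'(x) = 1/x = 0.2
rhs = 1 / num_deriv(math.exp, math.log(x))  # 1 / f'(g(x)) = 1/x = 0.2
```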
## Probability Theory
#### Inequalities
##### Markov's Inequality
Let $Y$ be a nonnegative random variable; then:
$P(Y\ge a)\le \frac {E(Y)}{a}$ for any constant $a>0$
:::info
:paperclip: proof:
For $a>0$, let $I=1$ if $Y\ge a$, and $I=0$ otherwise.
Then $aI\le Y$ always holds: if $Y\ge a$, then $aI=a\le Y$; otherwise $aI=0\le Y$.
Taking expectations of both sides:
$aE(I)\le E(Y)$, and since $E(I)=P(Y\ge a)$, we get $P(Y\ge a)\le \frac{E(Y)}{a}$.
:::
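A quick Monte Carlo sanity check, assuming $Y\sim\text{Exponential}(1)$ so that $Y\ge 0$ and $E(Y)=1$ (the distribution is chosen only for illustration):

```python
import random

# Empirical check of Markov's inequality: P(Y >= a) <= E(Y)/a.
# Assumption: Y ~ Exponential(1), nonnegative with E(Y) = 1.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
a = 3.0
empirical = sum(y >= a for y in samples) / len(samples)  # true value e^{-3} ≈ 0.05
bound = 1.0 / a                                          # E(Y)/a ≈ 0.333
assert empirical <= bound  # the bound holds (it is loose, as expected)
```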
##### Chebyshev's Inequality
###### Two-sided Chebyshev's Inequality
Let $X$ be a random variable with expected value $\mu$ and standard deviation $\sigma$; then:
$P(|X-\mu|\ge c) \le \frac {\sigma^2}{c^2}$ for any constant $c>0$
:::info
:paperclip: Apply Markov's inequality to the nonnegative random variable $(X-\mu)^2$:
$P[(X-\mu)^2\ge k^2]\le \frac{E[(X-\mu)^2]}{k^2}$
Since $(X-\mu)^2\ge k^2 \iff |X-\mu| \ge k$,
$P(|X-\mu|\ge k) \le \frac{E[(X-\mu)^2]}{k^2}= \frac{\sigma^2}{k^2}$
:::
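A Monte Carlo check of the two-sided bound, assuming $X\sim\text{Normal}(\mu=0,\sigma=1)$ for illustration, so the bound is $P(|X|\ge c)\le 1/c^2$:

```python
import random

# Empirical check of two-sided Chebyshev with assumed X ~ Normal(0, 1).
random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
c = 2.0
empirical = sum(abs(x) >= c for x in samples) / len(samples)  # true value ≈ 0.0455
bound = 1.0 / (c * c)                                         # sigma^2/c^2 = 0.25
assert empirical <= bound
```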
###### One-sided Chebyshev's Inequality
For any constant $c>0$,
$P(X \ge \mu +c)\le \frac {\sigma^2}{\sigma^2+c^2}$ and $P(X \le \mu -c)\le \frac {\sigma^2}{\sigma^2+c^2}$
:::info
:paperclip: Let $X$ be a random variable with mean $0$ and variance $\sigma^2< \infty$. For any $a>0$ and any $b>0$:
$X\ge a \implies X+b \ge a+b>0 \implies (X+b)^2 \ge (a+b)^2$
Hence,
$P(X \ge a)\le P[(X +b)^2 \ge (a+b)^2]$
Applying Markov's inequality,
$P(X\ge a)\le \frac{E[(X+b)^2]}{(a+b)^2} = \frac {\sigma^2+b^2}{(a+b)^2}$
The right side is minimized at $b=\sigma^2/a$, which gives
$P(X\ge a)\le \frac{\sigma^2+(\sigma^2/a)^2}{(a+ \sigma^2/a)^2}=\frac{\sigma^2}{\sigma^2+a^2}$
Applying this to $X-\mu$ (and to $\mu-X$) yields the stated bounds.
:::
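A Monte Carlo check of the one-sided bound, again assuming $X\sim\text{Normal}(0,1)$ for illustration, so the bound is $P(X\ge c)\le \frac{1}{1+c^2}$:

```python
import random

# Empirical check of one-sided Chebyshev with assumed X ~ Normal(0, 1).
random.seed(1)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
c = 1.5
empirical = sum(x >= c for x in samples) / len(samples)  # true value ≈ 0.0668
bound = 1.0 / (1.0 + c * c)                              # sigma^2/(sigma^2+c^2) ≈ 0.308
assert empirical <= bound
```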
##### Jensen's Inequality
If $f(x)$ is a convex function, then
$E[f(X)] \ge f(E[X])$
:::info
:paperclip: proof (assuming $f$ is differentiable):
By Taylor expansion, convexity gives the tangent-line bound
$f(x) \ge f(c)+f'(c)(x-c)$, where $c$ is any constant.
Take $c= E(X)$:
$f(X)\ge f[E(X)] + f'[E(X)][X-E(X)]$
Take expectations of both sides:
$E[f(X)] \ge f[E(X)] +0 = f[E(X)]$, $\because f[E(X)]$ is already a constant and $E[X-E(X)]=0$.
:::
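A Monte Carlo check with the convex function $f(x)=x^2$: here Jensen's inequality says $E[X^2]\ge (E[X])^2$, i.e. $\mathrm{Var}(X)\ge 0$. The choice $X\sim\text{Uniform}(0,1)$ below is an assumption for illustration.

```python
import random

# Jensen's inequality for the convex f(x) = x^2: E[X^2] >= (E[X])^2.
# Assumption: X ~ Uniform(0, 1), so E[X^2] = 1/3 and (E[X])^2 = 1/4.
random.seed(0)
samples = [random.random() for _ in range(100_000)]
mean = sum(samples) / len(samples)
e_of_f = sum(x * x for x in samples) / len(samples)  # E[f(X)] ≈ 1/3
f_of_e = mean * mean                                 # f(E[X]) ≈ 1/4
assert e_of_f >= f_of_e  # holds for any sample: the gap is the sample variance
```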