# PA notes

## Conditional Probability:
>- $P(A|B) = \frac{P(A \cap B)}{P(B)}$
>- If $A_1, A_2, ..., A_k$ are mutually exclusive: $P(A_1 \cup A_2 \cup ... \cup A_k | B) = P(A_1|B)+P(A_2|B)+...+P(A_k|B)$
>- A and B are independent iff $P(A \cap B) = P(A)P(B)$

## Bayes' Rule:
>- If two events J and M are conditionally independent given A $\Rightarrow$\
$P(J \cap M|A) = P(J|A)P(M|A)$
>- Bayes' Theorem (for a partition $B_1, ..., B_n$ of the sample space):\
$P(B_k|A) = \frac{P(A|B_k)P(B_k)}{\sum_{i=1}^{n} P(B_i)P(A|B_i)}$
>- $P(B|A) = \frac{P(A|B)P(B)}{P(A)}$

## PMF:
>- $\mu = E[X]$
>- $\sigma^2 = E[X^2] - (E[X])^2$
>- MGF: $\Rightarrow$\
$M(t) = E[e^{tX}]$\
$M'(0) = \sum_{x\in S} xf(x) = \mu$\
$M''(0) = E[X^2]$

## Poisson Distribution:
>- Meaning: observe the **number** of arrivals during a given period of time
>- $f(x) = \frac{\lambda^x e^{-\lambda}}{x!}$
>- $\mu = \lambda, \ \sigma^2 = \lambda$

## cdf & pdf:
>- Uniform Distribution:\
$$ F_X(x)=P(X \leq x)=\left\{
\begin{aligned}
0 && if && x < a \\
\frac{x-a}{b-a} && if && a \leq x \leq b \\
1 && if && x > b
\end{aligned}
\right. $$
$$ f_X(x) = \frac{dF_X(x)}{dx}=\left\{
\begin{aligned}
\frac{1}{b-a} && if && a \leq x \leq b \\
0 && otherwise
\end{aligned}
\right. $$
>- $\mu = \frac{a+b}{2}, \ \sigma^2 = \frac{(b-a)^2}{12}$
>- Exponential Distribution:\
Meaning: the waiting time W between two successive arrivals:\
$F(w) = P(W \leq w) = 1 - P(W > w) = 1 - P(\text{no arrival in } [0, w]) = 1 - e^{-\lambda w}$\
$f(x) = \lambda e^{-\lambda x}$\
let $\theta = \frac{1}{\lambda}$\
$M(t) = \frac{1}{1-\theta t}$\
$\mu = \theta, \ \sigma^2 = \theta^2$

## Gamma Distribution:
>- Meaning: waiting time until the $\alpha$-th occurrence of the event.
>- $F(w) = P(W \leq w) = 1 - P(W > w)$\
$= 1 - P(\text{fewer than } \alpha \text{ arrivals during } [0, w])$\
$= 1 - \sum_{k=0}^{\alpha - 1} \frac{(\lambda w)^k}{k!}e^{-\lambda w}$ for $w \geq 0$
>- $f(w) = \frac{\lambda (\lambda w)^{\alpha - 1}}{(\alpha - 1)!}e^{-\lambda w}$
>- $f(x) = \frac{\lambda^\alpha x^{\alpha - 1}e^{-\lambda x}}{(\alpha - 1)!}$
>- $\mu = \alpha \theta, \ \sigma^2 = \alpha \theta^2$
>- Chi-square Distribution:\
special case of the Gamma distribution with $\frac{1}{\lambda} = \theta = 2$ and $\alpha = r/2$, where r is a positive integer

## Normal Distribution:
>- $f_X(x) = \frac{1}{\sigma \sqrt{2\pi}}e^{\frac{-(x-\mu)^2}{2\sigma^2}}$
>- mgf: $M(t) = e^{\mu t + \sigma^2 t^2 / 2}$
>- Standard Normal Distribution:\
$\mu = 0$ and $\sigma = 1$\
p.d.f: $\frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2} x^2}$\
c.d.f: $\Phi(x) = P(X \leq x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}e^{\frac{-w^2}{2}}dw$
>- $P(a \leq X \leq b) = P(\frac{a - \mu}{\sigma} \leq \frac{X - \mu}{\sigma} \leq \frac{b - \mu}{\sigma}) = \Phi(\frac{b - \mu}{\sigma}) - \Phi(\frac{a - \mu}{\sigma})$

## Chebyshev's Inequality:
>- $P(|X - \mu| \geq k\sigma) \leq \frac{1}{k^2}$

## Correlation Coefficient:
>- $Cov(X_1, X_2) = E[(X_1-\mu_1)(X_2-\mu_2)] \Rightarrow \sigma_{12}$\
$= E[X_1X_2] - \mu_1 \mu_2$
>- correlation coefficient of $X_1$ and $X_2$:\
$\rho = \frac{Cov(X_1, X_2)}{\sigma_1 \sigma_2}$ (note: $-1 \leq \rho \leq 1$)
>- Independence $\Rightarrow$ zero covariance
>- Zero covariance $\nRightarrow$ independence

## Central Limit Theorem:
>- If $\bar{X}$ is the mean of a random sample $X_1, X_2, ..., X_n$ from a distribution with mean $\mu$ and variance $\sigma^2$, then:\
$\bar{X}$ is approximately $N(\mu, \frac{\sigma^2}{n})$ as $n \rightarrow \infty$
>- Equivalently, $Y = X_1 + X_2 + ... + X_n$ is approximately $N(n\mu, n\sigma^2)$ (see the simulation sketch below)
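Not part of the original notes: a minimal Python simulation sketch of the CLT statement above, checking that the sample mean of i.i.d. Exponential($\lambda$) draws is roughly $N(\mu, \frac{\sigma^2}{n})$. The parameter choices (`lam`, `n`, `n_trials`) are illustrative assumptions, not values from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0          # rate lambda; for Exponential(lambda): mu = 1/lam, sigma^2 = 1/lam^2
n = 50             # sample size per trial
n_trials = 10_000  # number of sample means to collect

# Draw n_trials independent samples of size n and take each sample's mean.
samples = rng.exponential(scale=1 / lam, size=(n_trials, n))
sample_means = samples.mean(axis=1)

# CLT prediction: mean mu and variance sigma^2 / n.
mu, var = 1 / lam, (1 / lam**2) / n
print(f"empirical mean {sample_means.mean():.4f}  vs theoretical {mu:.4f}")
print(f"empirical var  {sample_means.var():.6f} vs theoretical {var:.6f}")
```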
## Several Independent R.V.:
>- If $Y = \sum_{i=1}^{n} a_iX_i$ where $a_1, a_2, ..., a_n$ are real numbers and the $X_i$ are independent:\
$\mu_Y = \sum_{i=1}^{n} a_i\mu_i$, $\sigma_Y^2 = \sum_{i=1}^{n} a_i^2\sigma_i^2$
>- If $Y = a_1X_1 + a_2X_2 + ... + a_nX_n$:\
$M_Y(t) = \prod_{i=1}^{n}M_{X_i}(a_it)$

## Estimate:
>- Note: sample mean $\bar{x} = \frac{1}{n}\sum_{i}X_i$, sample variance $s^2 = \frac{1}{n-1}\sum_{i}(X_i - \bar{x})^2$

## Maximum Likelihood Estimator:
>- Let $\{x_1, x_2, ..., x_n\}$ be a set of independent random samples from a distribution $f(x;\theta_1, \theta_2, ..., \theta_m)$ with unknown parameters $\{\theta_1, \theta_2, ..., \theta_m\}$.\
Form the likelihood $L(\theta_1, ..., \theta_m) = \prod_{i=1}^{n} f(x_i;\theta_1, ..., \theta_m)$ and solve $\frac{\partial L(\theta_1, ..., \theta_m)}{\partial\theta_1} = 0$, $\frac{\partial L(\theta_1, ..., \theta_m)}{\partial\theta_2} = 0$, ... for the estimates (a worked sketch follows below)

## Methods of Moments:
>- Because $\frac{\sum X_i}{n} \approx E[X]$, $\frac{\sum X_i^2}{n} \approx E[X^2]$, ..., $\frac{\sum X_i^r}{n} \approx E[X^r]$,\
and, via the m.g.f., $M'(0) = E[X], M''(0) = E[X^2], ..., M^{(r)}(0) = E[X^r]$,\
set the sample moments equal to the theoretical moments and solve for the unknown parameters (see the sketch below)
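Not part of the original notes: a minimal Python sketch of the MLE and method-of-moments recipes above, assuming Exponential($\lambda$) data with unknown rate $\lambda$. For this distribution the two estimates happen to coincide at $\hat{\lambda} = 1/\bar{x}$, which makes a convenient sanity check; all names and parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
true_lam = 3.0
x = rng.exponential(scale=1 / true_lam, size=1_000)  # simulated sample

# MLE: L(lambda) = prod lambda * exp(-lambda * x_i);
# d/dlambda log L = n/lambda - sum(x_i) = 0  =>  lambda_hat = n / sum(x_i).
lam_mle = len(x) / x.sum()

# Method of moments: match the first moment E[X] = 1/lambda to the sample mean.
lam_mom = 1 / x.mean()

print(f"true lambda {true_lam:.3f}, MLE {lam_mle:.4f}, MoM {lam_mom:.4f}")
```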