---
title: 機率TP8
---

# Theory of Probability:<br>Continuous Random Variables - Basics

NTNU 機率論

##### [Back to Note Overview](https://reurl.cc/XXeYaE)
##### [Back to Theory of Probability](https://hackmd.io/@NTNUCSIE112/H1OPnkA4v)

###### tags: `NTNU` `CSIE` `必修` `Theory of Probability`

## Normality is Preserved by Linear Transformations
If $X$ is a normal random variable with mean $\mu$ and variance $\sigma^2$, and if $a$ (with $a\neq 0$) and $b$ are scalars, then the random variable
$$
Y = aX+b
$$
is also normal, with mean and variance
$$
E[Y] = a\mu+b\\
\text{var}(Y)=a^2\sigma^2
$$

## Standard Normal Random Variable
- A normal random variable $Y$ with zero mean $\mu = 0$ and unit variance $\sigma^2=1$ is said to be a **standard normal**
$$
f_Y(y) = {1\over \sqrt{2\pi}}e^{-{y^2\over 2}},\ -\infty < y < \infty
$$
- Normalization Property
$$
\int^\infty_{-\infty}\frac{1}{\sqrt{2\pi}}e^{-\frac{y^2}{2}}dy=1
$$
- The standard normal PDF is symmetric around $y=0$

### The PDF of a Random Variable Can Be Arbitrarily Large
![](https://i.imgur.com/rBs8KCP.png)
- Consider a random variable $X$ with PDF
$$
f_X(x)=\begin{cases}
\frac{1}{2\sqrt{x}},&\text{if }0<x\leq1,\\
0,&\text{otherwise},
\end{cases}
$$
- The PDF value becomes infinitely large as $x$ approaches zero
- Normalization Property
$$
\int^1_0f_X(x)dx=\int^1_0\frac{1}{2\sqrt{x}}dx=\sqrt{x}\,\Big|^1_0=1
$$

## Expectation of a Continuous Random Variable
- Let $X$ be a continuous random variable with PDF $f_X$
- The expectation of $X$ is defined by
$$
E[X]=\int^\infty_{-\infty}x\cdot f_X(x)dx
$$
- The expectation of a function $g(X)$ has the form
$$
E[g(X)]=\int^\infty_{-\infty}g(x)\cdot f_X(x)dx
$$
- The variance of $X$ is defined by
$$
\text{var}(X)=E\big[(X-E[X])^2\big]
$$
- We also have
$$
0\leq\text{var}(X)=E[X^2]-(E[X])^2
$$
- If $Y = aX+b$, where $a$ and $b$ are given scalars, then
$$
E[Y]=aE[X]+b\\
\text{var}(Y)=a^2\text{var}(X)
$$

### Exponential Random Variable
- An **exponential** random variable $X$ has a PDF of the form
$$
f_X(x)=\begin{cases}
\lambda e^{-\lambda x},&\text{if }x\geq 0,\\
0,&\text{otherwise}
\end{cases}
$$
- Normalization Property
$$
\int_{-\infty}^\infty f_X(x)dx=\int_0^\infty\lambda e^{-\lambda x}dx=-e^{-\lambda x}\Big|^\infty_0=1
$$
- The probability that $X$ exceeds a certain value decreases exponentially: for $a\geq 0$,
$$
P(X\geq a)=\int_a^\infty\lambda e^{-\lambda x}dx=e^{-\lambda a}
$$
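The two integrals above have simple closed forms, so they are easy to sanity-check numerically. The following Python sketch (not part of the original notes) integrates the exponential PDF with SciPy and compares the results against $1$ and $e^{-\lambda a}$; the values $\lambda = 2$ and $a = 1.5$ are arbitrary illustrative choices.

```python
# Minimal numerical check of the exponential PDF's normalization and tail
# probability; lambda = 2 and a = 1.5 are illustrative values only.
import math
from scipy.integrate import quad

lam = 2.0

def pdf(x):
    """f_X(x) = lambda * exp(-lambda * x) for x >= 0."""
    return lam * math.exp(-lam * x)

total, _ = quad(pdf, 0, math.inf)   # normalization: should be ~1
a = 1.5
tail, _ = quad(pdf, a, math.inf)    # P(X >= a) by numerical integration

print(total)                        # ~1.0
print(tail, math.exp(-lam * a))     # both ~ e^{-lambda * a}, about 0.0498
```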
## Properties of a CDF
- The CDF $F_X(x)$ is *monotonically(單調) non-decreasing*
    - if $x_i\leq x_j$, then $F_X(x_i)\leq F_X(x_j)$
- The CDF $F_X(x)$ tends to 0 as $x\to-\infty$, and to 1 as $x\to\infty$
- If $X$ is discrete, then $F_X(x)$ is a *piecewise constant* function(分段常值函數) of $x$.
![](https://i.imgur.com/fJGWDJE.png)
- If $X$ is continuous, then $F_X(x)$ is a *continuous* function of $x$.
![](https://i.imgur.com/67OqSz1.png)
- If $X$ is discrete and takes integer values, the PMF and the CDF can be obtained from each other by summing or differencing
$$
\begin{align}
F_X(k)&=P(X\leq k)=\sum_{i=-\infty}^kp_X(i),\\
p_X(k)&=P(X\leq k)-P(X\leq k-1)=F_X(k)-F_X(k-1)
\end{align}
$$
- If $X$ is continuous, the PDF and the CDF can be obtained from each other by integration or differentiation
$$
F_X(x)=P(X\leq x)=\int_{-\infty}^x f_X(t)dt,\\
f_X(x)=\frac{dF_X(x)}{dx}
$$
    - The second equality is valid for those $x$ at which the CDF has a derivative (or at which the PDF is continuous)

## Example (slide(TP8) p25)
![](https://i.imgur.com/NWAImQ9.png)
<!-- There are only three X's: X = min{X_1, X_2, X_3}, and P(X_1 > k), P(X_2 > k), P(X_3 > k) are independent -->

## CDF of the **Standard Normal**
- The CDF of the standard normal $Y$, denoted by $\Phi(y)$, is recorded in a table and is a very useful tool for calculating various probabilities involving normal random variables.
$$
\Phi(y)=P(Y\leq y)=P(Y<y)=\int_{-\infty}^y\frac{1}{\sqrt{2\pi}}e^{-\frac{t^2}{2}}dt
$$
- The table only provides the values of $\Phi(y)$ for $y\geq 0$
- Because of the symmetry of the PDF, the CDF at negative values of $y$ can be computed from the corresponding positive ones
$$
\begin{align}
\Phi(-0.5)&=P(Y\leq -0.5)=1-P(Y\leq 0.5)\\
&=1-\Phi(0.5)=1-0.6915\\
&=0.3085
\end{align}
$$
> $\Phi(-y)=1-\Phi(y), \forall y$

## Table of the CDF of **Standard Normal**
![](https://i.imgur.com/L7RXm5Y.png =x500)

## CDF Calculation of the **Normal**
- The CDF of a normal random variable $X$ with mean $\mu$ and variance $\sigma^2$ is obtained using the standard normal table as
$$
P(X\leq x)=P\left(\frac{X-\mu}{\sigma}\leq\frac{x-\mu}{\sigma}\right)=P\left(Y\leq\frac{x-\mu}{\sigma}\right)=\Phi\left(\frac{x-\mu}{\sigma}\right)
$$
- Let $Y=\frac{X-\mu}{\sigma}$; since $X$ is normal and $Y$ is a linear function of $X$, $Y$ is also normal (with mean 0 and variance 1)
$$
E[Y]=\frac{E[X]-\mu}{\sigma}=0, \quad\text{var}(Y)=\frac{\text{var}(X)}{\sigma^2}=1
$$

## Relation between the **Geometric** and **Exponential**
- The CDF of the geometric with parameter $p$
$$
F_{\text{geo}}(n)=\sum_{k=1}^n p(1-p)^{k-1}=1-(1-p)^n,\quad n=1,2,\ldots
$$
- The CDF of the exponential with parameter $\lambda$
$$
F_{\text{exp}}(x)=1-e^{-\lambda x},\quad x\geq 0
$$
- Compare the above two CDFs and let $\delta$ satisfy $e^{-\lambda\delta}=1-p$; then the two CDFs agree at the points $x=n\delta$
$$
F_{\text{exp}}(n\delta)=1-e^{-\lambda n\delta}=1-(1-p)^n=F_{\text{geo}}(n),\quad n=1,2,\ldots
$$
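As a concrete check of the comparison above, here is a small Python sketch (not from the lecture slides); the values $\lambda = 0.7$ and $\delta = 0.3$ are arbitrary, and $p$ is then chosen so that $e^{-\lambda\delta} = 1-p$.

```python
# Numerical check that the geometric and exponential CDFs agree at x = n*delta
# when e^{-lambda*delta} = 1 - p; lambda and delta below are illustrative only.
import math

lam, delta = 0.7, 0.3
p = 1 - math.exp(-lam * delta)            # choose p so the two CDFs match

def F_geo(n):
    """CDF of a geometric(p) random variable, n = 1, 2, ..."""
    return 1 - (1 - p) ** n

def F_exp(x):
    """CDF of an exponential(lambda) random variable, x >= 0."""
    return 1 - math.exp(-lam * x)

for n in range(1, 6):
    print(n, F_geo(n), F_exp(n * delta))  # the two columns coincide
```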