---
title: Probability TP9
---
# Theory of Probability<br>Continuous Random Variables: Joint PDFs, Conditioning, Expectation and Independence
NTNU Theory of Probability
##### [Back to Note Overview](https://reurl.cc/XXeYaE)
##### [Back to Theory of Probability](https://hackmd.io/@NTNUCSIE112/H1OPnkA4v)
###### tags: `NTNU` `CSIE` `必修` `Theory of Probability`
## Multiple Continuous Random Variables
- Two continuous random variables $X$ and $Y$ associated with a common experiment are *jointly continuous* if they can be described in terms of a joint PDF $f_{X,Y}$ satisfying
- $f_{X,Y}$ is a non-negative function
- Normalization property
$\int^\infty_{-\infty}\int^\infty_{-\infty} f_{X,Y}(x,y)dxdy= 1$
- Similarly, $f_{X,Y}(a,c)$ can be viewed as the "probability per unit area" in the vicinity of $(a,c)$
$$
P(a\leq X\leq a+\delta,\ c\leq Y\leq c+\delta)\\
=\int_a^{a+\delta}\int_c^{c+\delta}f_{X,Y}(x,y)dydx\approx f_{X,Y}(a,c)\cdot\delta^2
$$
- Where $\delta$ is a small positive number
- Marginal Probability
$$
\begin{align}
P(X\in A)&=P(X\in A \text{ and }Y\in(-\infty,\infty))\\
&=\int_{X\in A}\int^\infty_{-\infty}f_{X,Y}(x,y)dydx
\end{align}
$$
- We have already defined that
$$
P(X\in A)=\int_{X\in A}f_X(x)dx
$$
- We thus have the *marginal PDF*
$$
f_X(x)=\int^\infty_{-\infty}f_{X,Y}(x,y)dy
$$
- Similarly,
$$
f_Y(y)=\int^\infty_{-\infty}f_{X,Y}(x,y)dx
$$
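As a quick numerical sanity check of the formulas above (not part of the original slides), the sketch below discretizes the hypothetical joint PDF $f_{X,Y}(x,y)=x+y$ on the unit square and verifies normalization, the marginal PDF of $X$, and the probability-per-unit-area approximation.
```python
import numpy as np

# Hypothetical joint PDF on the unit square: f(x, y) = x + y (and 0 elsewhere).
def f_xy(x, y):
    return x + y

# Midpoint grid over [0, 1] x [0, 1].
n = 1000
dx = dy = 1.0 / n
xs = (np.arange(n) + 0.5) * dx
ys = (np.arange(n) + 0.5) * dy
X, Y = np.meshgrid(xs, ys, indexing="ij")
F = f_xy(X, Y)

# Normalization: the integral of f over the whole plane should be 1.
print("total mass  ~", F.sum() * dx * dy)              # ~ 1.0

# Marginal PDF of X at x = 0.3: integrate y out; analytically f_X(x) = x + 1/2.
i = int(0.3 / dx)
print("f_X(0.3)    ~", F[i, :].sum() * dy)             # ~ 0.8

# Probability of a small box vs. the f(a, c) * delta^2 approximation.
a, c, delta = 0.3, 0.6, 0.01
box = (X >= a) & (X <= a + delta) & (Y >= c) & (Y <= c + delta)
print("P(box)      ~", F[box].sum() * dx * dy)
print("f(a,c)*d^2  =", f_xy(a, c) * delta**2)
```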
## Joint CDFs
<!-- slide 5 -->
- The joint CDF of $X$ and $Y$ is defined as
$$
F_{X,Y}(x,y)=P(X\leq x, Y\leq y)=\int^x_{-\infty}\int^y_{-\infty}f_{X,Y}(s,t)dtds
$$
- If $F_{X,Y}$ is differentiable, the joint PDF can be recovered by
$$
f_{X,Y}(x,y)=\frac{\partial^2 F_{X,Y}}{\partial x\partial y}(x,y)
$$
## Conditional Expectation Given an Event
<!-- slide 17 -->
- The conditional expectation of a continuous random variable $X$, given an event $A$ with $P(A)>0$, is defined as
$$
E[X|A]=\int^\infty_{-\infty}x f_{X|A}(x)dx
$$
- where $f_{X|A}$ is the conditional PDF of $X$ given $A$ (see the numerical sketch below)
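A minimal numerical sketch of this definition, assuming the hypothetical example $X \sim \text{Uniform}(0,1)$ and $A = \{X > 0.5\}$, for which $E[X|A] = 0.75$:
```python
import numpy as np

# Hypothetical example: X ~ Uniform(0, 1), event A = {X > 0.5}.
n = 100_000
dx = 1.0 / n
xs = (np.arange(n) + 0.5) * dx
f_x = np.ones_like(xs)                       # PDF of Uniform(0, 1)

in_A = xs > 0.5
p_A = f_x[in_A].sum() * dx                   # P(A) = 0.5

# Conditional PDF: f_{X|A}(x) = f_X(x) / P(A) on A, and 0 elsewhere.
f_x_given_A = np.where(in_A, f_x / p_A, 0.0)

# E[X | A] = integral of x * f_{X|A}(x) dx (analytically 0.75 here).
print("P(A)   ~", p_A)
print("E[X|A] ~", (xs * f_x_given_A).sum() * dx)
```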
## Conditional Expectation Given a Random Variable
<!-- slide 19 -->
- The conditional expectation of $X$ given $Y=y$ is defined as
$$
E[X|Y=y]=\int^\infty_{-\infty}x f_{X|Y}(x|y)dx
$$
- The properties of unconditional expectation carry through, with the obvious modifications, to conditional expectation
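Continuing the hypothetical joint PDF $f_{X,Y}(x,y)=x+y$ on the unit square from the earlier sketch, the snippet below evaluates $E[X|Y=y]$ numerically; for this example the analytic value is $E[X|Y=y]=\dfrac{1/3+y/2}{y+1/2}$.
```python
import numpy as np

# Hypothetical joint PDF on the unit square: f(x, y) = x + y.
n = 100_000
dx = 1.0 / n
xs = (np.arange(n) + 0.5) * dx

def cond_expectation_x_given_y(y):
    """E[X | Y = y] = (integral of x * f(x, y) dx) / f_Y(y)."""
    f_slice = xs + y                 # f_{X,Y}(x, y) as a function of x
    f_y = f_slice.sum() * dx         # marginal f_Y(y) = y + 1/2
    return (xs * f_slice).sum() * dx / f_y

print("E[X | Y=0.5] ~", cond_expectation_x_given_y(0.5))   # analytic: 7/12 ~ 0.5833
```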
## Total Probability/Expectation Theorems
- Total Probability Theorem
- For any event $A$ and a continuous random variable $Y$
$$
P(A)=\int^\infty_{-\infty}P(A|Y=y)f_Y(y)dy
$$
- Total Expectation Theorem
- For any continuous random variables $X$ and $Y$
$$
E[X]=\int^\infty_{-\infty}E[X|Y=y]f_Y(y)dy
$$
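A numerical check of the total expectation theorem, again using the hypothetical joint PDF $f_{X,Y}(x,y)=x+y$ on the unit square (for which $E[X]=7/12$):
```python
import numpy as np

# Hypothetical joint PDF on the unit square: f(x, y) = x + y.
n = 1000
d = 1.0 / n
grid = (np.arange(n) + 0.5) * d
X, Y = np.meshgrid(grid, grid, indexing="ij")
F = X + Y

# Direct computation: E[X] = double integral of x * f(x, y).
e_x_direct = (X * F).sum() * d * d

# Via the total expectation theorem: E[X] = integral of E[X|Y=y] f_Y(y) dy.
f_y = F.sum(axis=0) * d                        # marginal of Y on the grid
e_x_given_y = (X * F).sum(axis=0) * d / f_y    # E[X | Y = y] for each grid y
e_x_total = (e_x_given_y * f_y).sum() * d

print("E[X] direct           ~", e_x_direct)   # ~ 7/12 ~ 0.5833
print("E[X] via total expect ~", e_x_total)    # should match
```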
## Independence
<!-- slide 22 -->
- Two continuous random variables $X$ and $Y$ are independent if
$$
f_{X,Y}(x,y) = f_X(x)f_Y(y), \forall x, y
$$
- Since
$$
f_{X,Y}(x,y)=f_Y(y)f_{X|Y}(x|y)=f_X(x)f_{Y|X}(y|x)
$$
- We therefore have
$$
f_{X|Y}(x|y)=f_X(x),\ \forall x\text{ and }\forall y\text{ with }f_Y(y)>0
$$
- Or, equivalently
$$
f_{Y|X}(y|x)=f_Y(y),\ \forall y\text{ and }\forall x\text{ with }f_X(x)>0
$$
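To make the factorization condition concrete (a hypothetical illustration, not from the slides), the snippet below compares two joint PDFs on the unit square: $f(x,y)=4xy$, which factors into $f_X(x)=2x$ and $f_Y(y)=2y$, and $f(x,y)=x+y$, which does not.
```python
import numpy as np

n = 1000
d = 1.0 / n
grid = (np.arange(n) + 0.5) * d
X, Y = np.meshgrid(grid, grid, indexing="ij")

def max_factorization_gap(F):
    """Largest pointwise gap between f_{X,Y}(x, y) and f_X(x) * f_Y(y)."""
    f_x = F.sum(axis=1) * d          # marginal of X
    f_y = F.sum(axis=0) * d          # marginal of Y
    return np.abs(F - np.outer(f_x, f_y)).max()

print("f = 4xy :", max_factorization_gap(4 * X * Y))   # ~ 0  -> independent
print("f = x+y :", max_factorization_gap(X + Y))       # > 0  -> dependent
```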
## More Facts about Independence
<!-- Slide 23 -->
- If two continuous random variables $X$ and $Y$ are independent, then
- Any two events of the form $\{X \in A\}$ and $\{Y \in B\}$ are independent
- It also implies that
$$
P(X\in A, Y\in B)=P(X\in A)P(Y\in B)
$$
- The converse statement is also true (See the end-of-chapter problem 32)
<!-- Slide 24 -->
- If two continuous random variables $X$ and $Y$ are independent, then
- $E[XY] = E[X]E[Y]$
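A small Monte Carlo sanity check of this property, assuming the hypothetical choice $X \sim \text{Exponential}(1)$ and $Y \sim \text{Uniform}(0,1)$ drawn independently:
```python
import numpy as np

rng = np.random.default_rng(0)
m = 1_000_000

# Hypothetical independent pair: X ~ Exponential(1), Y ~ Uniform(0, 1).
x = rng.exponential(1.0, size=m)
y = rng.uniform(0.0, 1.0, size=m)

# For independent X and Y, E[XY] should match E[X] * E[Y] (here 1 * 0.5 = 0.5).
print("E[XY]    ~", (x * y).mean())
print("E[X]E[Y] ~", x.mean() * y.mean())
```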
## Recall: the Discrete Bayes' Rule
- Let $A_1, A_2, \cdots, A_n$ be disjoint events that form a partition of the sample space, and assume that $P(A_i) > 0$ for all $i$. Then, for any event $B$ such that $P(B) > 0$, we have
$$
\begin{align}
P(A_i|B)&=\frac{P(A_i)P(B|A_i)}{P(B)}\\
&=\frac{P(A_i)P(B|A_i)}{\sum_{k=1}^nP(A_k)P(B|A_k)}\\
&=\frac{P(A_i)P(B|A_i)}{P(A_1)P(B|A_1)+\cdots+P(A_n)P(B|A_n)}
\end{align}
$$
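As a toy numerical illustration of the discrete rule (a hypothetical two-hypothesis example, not from the slides):
```python
# Hypothetical partition: A1 = "coin is fair", A2 = "coin is biased (P(heads) = 0.9)".
p_A = [0.8, 0.2]            # prior probabilities P(A1), P(A2)
p_B_given_A = [0.5, 0.9]    # P(B | Ai), where B = "the toss comes up heads"

# Denominator of Bayes' rule: P(B) = sum_k P(Ak) P(B | Ak).
p_B = sum(pa * pb for pa, pb in zip(p_A, p_B_given_A))

# Posterior P(Ai | B) for each hypothesis.
posterior = [pa * pb / p_B for pa, pb in zip(p_A, p_B_given_A)]
print("P(B) =", p_B)                      # 0.8*0.5 + 0.2*0.9 = 0.58
print("P(A1|B), P(A2|B) =", posterior)    # ~ 0.690, 0.310
```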
## Inference and Continuous Bayes' Rule
- Suppose we have a model of an underlying but unobserved phenomenon, represented by a random variable $X$ with PDF $f_X$, and we make a noisy measurement $Y$, which is modeled in terms of a conditional PDF $f_{Y|X}$. Once the experimental value of $Y$ is measured, what information does this provide about the unknown value of $X$?
- The answer is given by the conditional PDF $f_{X|Y}$, which can be computed by the continuous Bayes' rule
$$
f_{X|Y}(x|y)=\frac{f_X(x)f_{Y|X}(y|x)}{f_Y(y)}=\frac{f_X(x)f_{Y|X}(y|x)}{\int^\infty_{-\infty}f_X(t)f_{Y|X}(y|t)dt}
$$
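A hedged numerical sketch of the continuous Bayes' rule, assuming a hypothetical Gaussian model: prior $X \sim N(0,1)$ and measurement $Y = X + W$ with noise $W \sim N(0,1)$, so $f_{Y|X}(y|x)$ is the $N(x,1)$ density; for an observed value $y$ the analytic posterior mean is $y/2$.
```python
import numpy as np

def normal_pdf(z, mean, std):
    return np.exp(-0.5 * ((z - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

# Grid over a range that captures essentially all of the posterior mass.
xs = np.linspace(-6.0, 6.0, 10_001)
dx = xs[1] - xs[0]

y_obs = 2.0                                    # observed measurement value
prior = normal_pdf(xs, 0.0, 1.0)               # f_X(x): N(0, 1) prior
likelihood = normal_pdf(y_obs, xs, 1.0)        # f_{Y|X}(y_obs | x): N(x, 1) noise

# Continuous Bayes' rule: posterior = prior * likelihood / normalizing constant.
unnormalized = prior * likelihood
posterior = unnormalized / (unnormalized.sum() * dx)

print("posterior integrates to ~", posterior.sum() * dx)         # ~ 1.0
print("posterior mean          ~", (xs * posterior).sum() * dx)  # analytic: y/2 = 1.0
```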