---
tags: math, memo
---
# Probability Spaces
## probability measure
Suppose $\mathcal{F}$ is a $\sigma$-algebra on a set $\Omega$.
- A *probability measure* on $(\Omega, \mathcal{F})$ is a measure $P$ on $(\Omega, \mathcal{F})$ such that $P(\Omega) = 1$.
- $\Omega$ is called the *sample space*.
- An *event* is an element of $\mathcal{F}$.
- If $A$ is an event, then $P(A)$ is called the *probability* of $A$.
- If $P$ is a probability measure on $(\Omega, \mathcal{F})$, then the triple $(\Omega, \mathcal{F}, P)$ is called a *probability space*.
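For example, a fair coin toss gives the simplest nontrivial probability space: take $\Omega = \{H, T\}$, $\mathcal{F} = \{\varnothing, \{H\}, \{T\}, \Omega\}$, and
$$ P(\varnothing) = 0, \quad P(\{H\}) = P(\{T\}) = \tfrac{1}{2}, \quad P(\Omega) = 1. $$
Then $(\Omega, \mathcal{F}, P)$ is a probability space, and $\{H\}$ is an event with probability $\tfrac{1}{2}$.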
### almost surely
Suppose $(\Omega, \mathcal{F}, P)$ is a probability space. An event $A$ is said to happen *almost surely* if the probability of $A$ is $1$, or equivalently if $P(\Omega \setminus A) =0$.
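For instance, let $\Omega = [0,1]$, let $\mathcal{F}$ be the Borel subsets of $[0,1]$, and let $P$ be Lebesgue measure restricted to $[0,1]$. The event $A = [0,1] \setminus \{\tfrac{1}{2}\}$ happens almost surely, because
$$ P(\Omega \setminus A) = P\left(\{\tfrac{1}{2}\}\right) = 0, $$
even though $A \ne \Omega$.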
## random variable; expectation
Suppose $(\Omega, \mathcal{F}, P)$ is a probability space.
- A *random variable* on $(\Omega, \mathcal{F})$ is a measurable function from $\Omega$ to $\mathbb{R}$.
- If $X \in \mathcal{L}^1(P)$, then the *expectation* or *expected value* of the random variable $X$ is denoted $EX$ and is defined by
$$ EX = \int_{\Omega} X \, \text{d}P. $$
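Continuing the coin-toss example above, define $X(H) = 1$ and $X(T) = 0$. Then $X$ is a random variable and
$$ EX = \int_{\Omega} X \, \text{d}P = 1 \cdot P(\{H\}) + 0 \cdot P(\{T\}) = \tfrac{1}{2}, $$
since the integral of a simple function is the weighted sum of its values.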
### probability distribution and distribution function
Suppose $(\Omega, \mathcal{F}, P)$ is a probability space, $X$ is a random variable on $(\Omega, \mathcal{F})$, and $\mathcal{B}$ is the collection of Borel subsets of $\mathbb{R}$.
- The *probability distribution* of $X$ is the probability measure $P_X$ defined on $(\mathbb{R}, \mathcal{B})$ by
$$ P_X(B) = P(X \in B) = P\left(X^{-1}(B)\right).$$
- The *distribution function* of $X$ is the function $\tilde{X}: \mathbb{R} \to [0,1]$ defined by
$$ \tilde{X}(s) = P_X \left( (-\infty, s] \right) = P(X \le s)=P\left (\{\omega \in \Omega: X(\omega) \le s\}\right).$$
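For the coin-toss random variable $X$ above, $P_X(\{0\}) = P_X(\{1\}) = \tfrac{1}{2}$ and the distribution function is the step function
$$ \tilde{X}(s) = \begin{cases} 0 & \text{if } s < 0, \\ \tfrac{1}{2} & \text{if } 0 \le s < 1, \\ 1 & \text{if } s \ge 1. \end{cases} $$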
## characterization of distribution functions
Suppose $H: \mathbb{R} \to [0,1]$ is a function. Then there exists a probability space $(\Omega, \mathcal{F}, P)$ and a random variable $X$ on $(\Omega, \mathcal{F})$ such that $H = \tilde{X}$ if and only if the following conditions are satisfied:
1. $s<t \implies H(s) \le H(t)$. That is, $H$ is an increasing function;
2. $\lim_{t \to -\infty} H(t) = 0$;
3. $\lim_{t \to \infty} H(t)=1$;
4. $\lim_{t \downarrow s} H(t) = H(s)$ for every $s \in \mathbb{R}$. That is, $H$ is right continuous.
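Condition 4 is the one most easily violated. For example, the function $G$ defined by $G(t) = 0$ for $t \le 0$ and $G(t) = 1$ for $t > 0$ satisfies conditions 1–3 but not 4, since $\lim_{t \downarrow 0} G(t) = 1 \ne G(0)$; hence $G$ is not the distribution function of any random variable. By contrast, the step function $\tilde{X}$ in the coin-toss example satisfies all four conditions.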
## density function
Suppose $X$ is a random variable on some probability space. If there exists $h \in \mathcal{L}^1(\mathbb{R})$ such that
$$ \tilde{X}(s) = \int_{- \infty}^{s} h \, \text{d} \lambda$$
for all $s \in \mathbb{R}$, where $\lambda$ denotes Lebesgue measure on $\mathbb{R}$, then $h$ is called the *density function* of $X$.
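For example, taking $h = \mathbf{1}_{[0,1]}$ (the indicator function of $[0,1]$) gives
$$ \tilde{X}(s) = \int_{- \infty}^{s} h \, \text{d} \lambda = \begin{cases} 0 & \text{if } s < 0, \\ s & \text{if } 0 \le s \le 1, \\ 1 & \text{if } s > 1, \end{cases} $$
so $h$ is the density function of a random variable distributed uniformly on $[0,1]$.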
## graph
Let $I=[0,1]$ and let **B** be the collection of Borel subsets of $\mathbb{R}$. The spaces and maps defined above fit together as in the following diagram:
```graphviz
digraph world {
    size="7,7";
    // layout=neato
    // Ω: sample space; F: σ-algebra of events; R: the real line; I: [0,1]; B: Borel subsets of R
    {rank=same; Ω F;}
    {rank=same; R I;}
    Ω -> F [label = "σ-algebra"];
    Ω -> R [label = "X" fontsize = 20];
    F -> I [label = "P" fontsize = 20];
    R -> B [label = "σ-algebra"];
    B -> I [label = "P_X" fontsize = 20];
}
```
## Reference
Sheldon Axler. *Measure, Integration & Real Analysis.* Springer Nature, 2020.