# 16 - Infinite Probability Spaces, the Law of Total Probability
## Infinite probability spaces
<u>Recall</u>: A sample space is a *non-empty countable* set.
Countable: the set is finite, or there is a one-to-one correspondence between its elements and the natural numbers
Example: toss a coin until the first head
Set of outcomes
$S = \{H, TH, TTH, TTTH, ....\} = \{T^nH: n \in \mathbb{N}\}$
For any non-negative integer $n$, there is an outcome in which exactly $n$ tails are tossed before the first head.
$Pr(H) = \frac{1}{2}$
$Pr(TH) = \frac{1}{4}$
$Pr(TTH) = \frac{1}{8}$
$Pr(T^nH) = \frac{1}{2^{n+1}}$
$\sum_{\omega \in S}Pr(\omega) = \sum_{n=0}^{\infty} Pr(T^nH)$
$= \sum_{n=0}^\infty (\frac{1}{2})^{n+1} = \lim_{N \rightarrow \infty} \sum_{n=0}^N (\frac{1}{2})^{n+1}$ (by definition, the infinite sum is the limit of the partial sums)
<u>Claim:</u> $\sum_{i=0}^N x^i = \frac{1-x^{N+1}}{1-x}, (0 < x < 1)$
$x^0 + x^1 + x^2 +x^3 + ... x^N = \frac{1-x^{N+1}}{1-x}$
$(1-x)(x^0 + x^1 + x^2 + x^3 + ... + x^N) = 1-x^{N+1}$ (multiply both sides by $1-x$)
$(x^0 + x^1 + x^2 + x^3 + ... + x^N) - (x^1 + x^2 + x^3 + x^4 + ... + x^N + x^{N+1}) = x^0 - x^{N+1} = 1-x^{N+1}$ (all of the middle terms cancel)
<u>Therefore:</u> $\sum_{i=0}^\infty x^i = \lim_{N \rightarrow \infty} \frac{1-x^{N+1}}{1-x} = \frac{1}{1-x}$ (since $x^{N+1} \rightarrow 0$ when $0 < x < 1$)
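As a quick numerical sanity check (not from the lecture; the function names below are my own), here is a short Python sketch comparing the partial sums with the closed form and the limit $\frac{1}{1-x}$:
```python
# Check the finite geometric sum formula and its limit for 0 < x < 1.

def geometric_partial_sum(x, N):
    """Compute sum_{i=0}^{N} x^i directly."""
    return sum(x**i for i in range(N + 1))

def geometric_closed_form(x, N):
    """Closed form (1 - x^(N+1)) / (1 - x)."""
    return (1 - x**(N + 1)) / (1 - x)

x = 0.5
for N in [1, 5, 10, 50]:
    print(N, geometric_partial_sum(x, N), geometric_closed_form(x, N))

# As N grows, both columns approach 1/(1-x) = 2 for x = 1/2.
print("limit:", 1 / (1 - x))
```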
Back to our sum: $\sum_{n=0}^\infty (\frac{1}{2})^{n+1} = \frac{1}{2}\sum_{n=0}^\infty (\frac{1}{2})^{n} = \frac{1}{2} \times \frac{1}{1-(1/2)} = \frac{1}{2} \times \frac{1}{1/2} = 1$
We verified that if we add up the probabilities of all the outcomes in the sample space, we get 1.
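The same numbers can be estimated empirically. Here is a small Monte Carlo sketch (my own code, not part of the notes) that repeats the experiment many times and compares the observed frequency of $T^nH$ with $\frac{1}{2^{n+1}}$:
```python
import random

def tails_before_first_head():
    """Toss a fair coin until the first head; return the number of tails seen."""
    tails = 0
    while random.random() < 0.5:   # treat this as tossing a tail
        tails += 1
    return tails

trials = 100_000
counts = {}
for _ in range(trials):
    n = tails_before_first_head()
    counts[n] = counts.get(n, 0) + 1

for n in range(5):
    observed = counts.get(n, 0) / trials
    print(n, observed, 1 / 2**(n + 1))   # observed frequency vs Pr(T^n H)
```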
---
Consider the game "First head wins":
Two players, Alice and Bob, take turns tossing a coin, with Alice going first; the first one to toss a head wins.
$A = \text{Alice wins} = \{H, TTH, TTTTH, ...\}$
$= \{T^{2n}H : n \in \mathbb{N}\}$
$Pr(A) = \sum_{\omega \in A} Pr(\omega) = \sum_{n=0}^\infty Pr(T^{2n}H)$
$=\sum_{n=0}^\infty (\frac{1}{2})^{2n+1} = \frac{1}{2}\sum_{n=0}^\infty (\frac{1}{2})^{2n}$
$= \frac{1}{2} \sum_{n=0}^\infty (\frac{1}{4})^{n} = \frac{1}{2}(\frac{1}{1-(1/4)}) = \frac{1}{2}(\frac{1}{3/4}) = \frac{1}{2} \times \frac{4}{3}=\frac{2}{3}$
Takeaway: the probability that the first player wins is $\frac{2}{3}$.
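This $\frac{2}{3}$ can also be checked by simulation; the sketch below (my own code, with names of my choosing) plays the game repeatedly and reports how often the first player wins:
```python
import random

def first_player_wins():
    """Play one round of "first head wins" with alternating tosses.
    Return True if the head lands on one of the first player's tosses."""
    first_players_turn = True
    while True:
        if random.random() < 0.5:                     # head: current player wins
            return first_players_turn
        first_players_turn = not first_players_turn   # tail: switch players

trials = 100_000
wins = sum(first_player_wins() for _ in range(trials))
print(wins / trials, "vs", 2 / 3)
```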
---
"Second head wins"
$S = \{T^nHT^mH : n,m \in \mathbb{N}\}$
Exactly $n + m + 2$ coins are tossed in this outcome (the game ends as soon as the second head appears).
$Pr(T^nHT^mH) = (\frac{1}{2})^{n +m + 2}$
$A = \text{Alice wins}$
$A_1 = \text{Alice tosses the first head and later wins}$
$A_2 = \text{Bob tosses the first head and later Alice wins}$
$A_1$ and $A_2$ are disjoint
$Pr(A) = Pr(A_1 \cup A_2) = Pr(A_1) + Pr(A_2)$
For Alice to toss the first head and then also toss the second (winning) head, the first run of tails must have even length and the second run of tails must have odd length:
$A_1 = \{T^{2k}HT^{2l+1}H : k,l \in \mathbb{N}\}$
$Pr(A_1) = \sum_{k=0}^\infty \sum_{l=0}^\infty Pr(T^{2k}HT^{2l + 1}H)$
$= \sum_{k=0}^\infty (\sum_{l=0}^\infty (\frac{1}{2})^{2k +2l +3})$
$= \sum_{k=0}^\infty ((\frac{1}{2})^{2k+3} \sum_{l=0}^\infty (\frac{1}{2})^{2l})$
$= \sum_{k=0}^\infty (\frac{1}{2})^{2k+3} \times \frac{4}{3}$
$Pr(A_1)= \frac{4}{3} \times (\frac{1}{2})^3 \times \sum_{k=0}^\infty (\frac{1}{4})^{k} = (\frac{1}{2})^3 (\frac{4}{3})^2 = \frac{2}{9}$
$A_2 = \{T^{2k+1}HT^{2l}H : k,l \in \mathbb{N}\}$
$Pr(A_2) = \sum_{k=0}^\infty\sum_{l=0}^\infty (\frac{1}{2})^{2k +2l +3} = \frac{2}{9}$
$Pr(A) = Pr(A_1 \cup A_2) = \frac{2}{9} + \frac{2}{9} = \frac{4}{9} < \frac{1}{2}$, so Alice is at a disadvantage in this game.
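Both the series value $\frac{2}{9}$ and the overall answer $\frac{4}{9}$ can be double-checked numerically. The sketch below (my own code, not from the lecture) truncates the double sum for $Pr(A_1)$ and also simulates the game directly:
```python
import random

# Truncate the double sum for Pr(A_1); K = 60 terms per index is plenty.
K = 60
pr_A1 = sum(0.5**(2 * k + 2 * l + 3) for k in range(K) for l in range(K))
print(pr_A1, "vs", 2 / 9)   # Pr(A_2) has the same summand, so the same value

def alice_wins_second_head():
    """Alternate tosses (Alice first); whoever tosses the second head wins."""
    heads = 0
    alice_turn = True
    while True:
        if random.random() < 0.5:        # head
            heads += 1
            if heads == 2:
                return alice_turn        # game ends on the second head
        alice_turn = not alice_turn      # other player tosses next

trials = 200_000
wins = sum(alice_wins_second_head() for _ in range(trials))
print(wins / trials, "vs", 4 / 9)
```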
## Law of total probability
If $A$ and $B$ are two events with $0 < Pr(B) < 1$ (so that both $Pr(B)$ and $Pr(\bar{B})$ are positive), then:
$Pr(A) = Pr(B) \times Pr(A | B) + Pr(\bar{B}) \times Pr(A | \bar{B})$
Proof:
$Pr(B) \times Pr(A | B) + Pr(\bar{B}) \times Pr(A | \bar{B}) = Pr(B) \times \frac{Pr(A \cap B)}{Pr(B)} + Pr(\bar{B}) \times \frac{Pr(A \cap \bar{B})}{Pr(\bar{B})} = Pr(A \cap B) + Pr(A \cap \bar{B}) = Pr(A)$
The last step holds because $A \cap B$ and $A \cap \bar{B}$ are disjoint and their union is $A$.
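A tiny concrete check of the identity (my own example, not from the lecture): roll a fair die, let $A$ = "the result is even" and $B$ = "the result is at most 3".
```python
from fractions import Fraction

# Uniform probability function on a fair six-sided die.
S = set(range(1, 7))
pr = {s: Fraction(1, 6) for s in S}

A = {s for s in S if s % 2 == 0}   # result is even
B = {s for s in S if s <= 3}       # result is at most 3
B_bar = S - B

def prob(event):
    return sum(pr[s] for s in event)

def cond(event, given):
    """Conditional probability Pr(event | given)."""
    return prob(event & given) / prob(given)

lhs = prob(A)
rhs = prob(B) * cond(A, B) + prob(B_bar) * cond(A, B_bar)
print(lhs, rhs)   # both sides equal 1/2
```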

Back to the "Second head wins" game, where $A = \text{Alice wins}$:
$X = \text{Alice gets the first head}$
By the law of total probability: $Pr(A) = Pr(X) \times Pr(A | X) + Pr(\bar{X}) \times Pr(A | \bar{X})$
$Pr(X) = \frac{2}{3}$ (this is exactly the "First head wins" analysis)
$Pr(\bar{X}) = 1 - \frac{2}{3} = \frac{1}{3}$
$\bar{X} = \text{Bob tosses the first head}$: after Bob's head, Alice is the next to toss, so she gets the next (second) head, and therefore wins, with probability $\frac{2}{3}$; that is, $Pr(A | \bar{X}) = \frac{2}{3}$
$X = \text{Alice tosses the first head}$: after Alice's head, Bob is the next to toss, so Bob gets the second head with probability $\frac{2}{3}$, leaving Alice a probability of only $\frac{1}{3}$ of winning; that is, $Pr(A | X) = \frac{1}{3}$
$Pr(A) = Pr(X) \times Pr(A | X) + Pr(\bar{X}) \times Pr(A | \bar{X}) = \frac{2}{3} \times \frac{1}{3} + \frac{1}{3} \times \frac{2}{3} = \frac{4}{9}$
This matches the answer from the direct calculation.
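The three probabilities used in this calculation can also be estimated by simulation. The sketch below (my own code; the helper name is of my choosing) tallies who tosses the first head and, in each case, how often Alice goes on to win:
```python
import random

def play_second_head_game():
    """Return (alice_got_first_head, alice_won) for one game; Alice tosses first."""
    heads = 0
    first_head_was_alices = None
    alice_turn = True
    while True:
        if random.random() < 0.5:                # head
            heads += 1
            if heads == 1:
                first_head_was_alices = alice_turn
            else:                                # second head ends the game
                return first_head_was_alices, alice_turn
        alice_turn = not alice_turn

trials = 200_000
results = [play_second_head_game() for _ in range(trials)]

pr_X = sum(1 for f, _ in results if f) / trials
pr_A_given_X = (sum(1 for f, w in results if f and w)
                / sum(1 for f, _ in results if f))
pr_A_given_not_X = (sum(1 for f, w in results if not f and w)
                    / sum(1 for f, _ in results if not f))

print(pr_X, "vs", 2 / 3)                 # Pr(X)
print(pr_A_given_X, "vs", 1 / 3)         # Pr(A | X)
print(pr_A_given_not_X, "vs", 2 / 3)     # Pr(A | X-bar)
print(pr_X * pr_A_given_X + (1 - pr_X) * pr_A_given_not_X, "vs", 4 / 9)
```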
###### tags: `COMP2804` `Probability`