# 17 - Random variables and expected values
## Random variables
> A random variable is neither random nor a variable
<u>Definition:</u> Let $S$ be a sample space. Then a **random variable** on $S$ is a function $X:S \rightarrow \mathbb{R}$
That is, $X$ maps each outcome of the experiment to a real number.
Ex: Rolling two dice
$S = \{(i, j) : i,j \in \{1,2,3,4,5,6\}\}$
$X(i,j) = i + j$
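A minimal Python sketch of this example (the names `S` and `X` mirror the notes; the code is only illustrative):
```python
from itertools import product

# Sample space: all ordered pairs (i, j) of dice results
S = list(product(range(1, 7), repeat=2))

# The random variable X is just a function from outcomes to numbers
def X(outcome):
    i, j = outcome
    return i + j

print(len(S))     # 36
print(X((3, 4)))  # 7
```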
Ex: Flipping three coins
$S = \{HHH, HHT, HTH, HTT, THH, THT, TTH, TTT\}$
Random variables you could define:
$X(\omega ) = \text{The number of heads in } \omega$
$X(\omega ) = 0 \text{ if } \omega= TTT$
$X(\omega ) = 1 \text{ if } \omega \in \{HTT, THT, TTH\}$
$X(\omega ) = 2 \text{ if } \omega \in \{HHT, HTH, THH\}$
$X(\omega ) = 3 \text{ if } \omega= HHH$
$Y(\omega ) = 1 \text{ if outcome is all heads or all tails}$
$Y(\omega ) = 0 \text{ otherwise}$
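A small Python sketch of these two random variables, with `S`, `X`, `Y` named as in the notes (purely illustrative):
```python
from itertools import product

# Sample space for flipping three coins, e.g. 'HHT'
S = [''.join(w) for w in product('HT', repeat=3)]

def X(w):
    """Number of heads in the outcome w."""
    return w.count('H')

def Y(w):
    """1 if the outcome is all heads or all tails, 0 otherwise."""
    return 1 if w in ('HHH', 'TTT') else 0

print(S)         # ['HHH', 'HHT', 'HTH', 'HTT', 'THH', 'THT', 'TTH', 'TTT']
print(X('HTT'))  # 1
print(Y('TTT'))  # 1
```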
### Defining events in terms of random variables
We often describe an event using a random variable: "$X = x$" is shorthand for the event that the outcome gives $X$ the value $x$.
Ex:
$"X=0" = \{TTT\}$
$"X=1" = \{HTT, THT, TTH\}$
$"X = x" = \{\omega \in S: X(\omega)=x \}$
"$X$ equals some value is another way of saying: for every outcome $\omega \in S$ $X$ of $\omega$ equals that value"
### Independence of random variables
<u>Definition: </u> In a probability space $(S, Pr)$, two random variables $X$ and $Y$ are independent if for every $x,y \in \mathbb{R}$
$Pr((X=x) \cap (Y=y)) = Pr(X=x) \times Pr(Y=y)$
The coin-toss example above gives two random variables that are *not* independent: with $X$ (number of heads) and $Y$ (all heads or all tails),
$Pr((X=1) \cap (Y=1)) = Pr(\emptyset) = 0 \neq \frac{3}{8} \times \frac{2}{8}$
Another example, again with three coins:
$Z(\omega ) = 1 \text{ if the first coin is heads}$, i.e. $"Z=1" = \{HHH, HHT, HTH, HTT\}$
$Z(\omega ) = 0 \text{ if the first coin is tails}$, i.e. $"Z=0" = \{THH, THT, TTH, TTT\}$
Are $Y$ and $Z$ independent?
$Pr(Y=0) = \frac{6}{8}$
$Pr(Y=1) = \frac{2}{8}$
$Pr(Z=0) = \frac{1}{2}$
$Pr(Z=1) = \frac{1}{2}$
$Pr((Y=0) \cap (Z=0)) = Pr(\{THH, THT, TTH\}) = \frac{3}{8} = \frac{6}{8} \times \frac{1}{2}$
$Pr((Y=0) \cap (Z=1)) = Pr(\{HHT, HTH, HTT\}) = \frac{3}{8} = \frac{6}{8} \times \frac{1}{2}$
$Pr(Y=1 \cap Z=0) = ...$
For the remaining cases we can use the lemma that if events $A$ and $B$ are independent, then $A$ and $\bar{B}$ are also independent, so they don't need to be checked by hand.
They're independent. ~~Source: trust me bro~~
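A brute-force check of the definition over all eight outcomes confirms this; a sketch (function names are my own):
```python
from itertools import product
from fractions import Fraction

S = [''.join(w) for w in product('HT', repeat=3)]   # three coin flips

def X(w): return w.count('H')                       # number of heads
def Y(w): return 1 if w in ('HHH', 'TTT') else 0    # all heads or all tails
def Z(w): return 1 if w[0] == 'H' else 0            # first coin is heads

def Pr(A):
    """Probability of an event A under the uniform distribution on S."""
    return Fraction(len(A), len(S))

def independent(U, V):
    """Check Pr(U=u and V=v) == Pr(U=u) * Pr(V=v) for every pair of values."""
    return all(
        Pr({w for w in S if U(w) == u and V(w) == v})
        == Pr({w for w in S if U(w) == u}) * Pr({w for w in S if V(w) == v})
        for u in {U(w) for w in S} for v in {V(w) for w in S}
    )

print(independent(Y, Z))  # True:  Y and Z are independent
print(independent(X, Y))  # False: X and Y are not independent
```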
<u>Definition:</u> A sequence of random variables $X_1,...,X_m$ is pairwise/mutually independent if for all $x_1,...,x_m \in \mathbb{R}$ the events $X_1=x_1, X_2=x_2,..., X_m=x_m$ are pairwise/mutually independent.
## Expected values
The expected value of a random variable $X$ is
$E(X) = \sum_{\omega \in S} Pr(\omega ) \times X(\omega ) = \sum_{x \in \mathbb{R}} Pr(X=x)\times x$
In words: sum, over all outcomes, the value $X$ assigns to each outcome multiplied by the probability of that outcome.
Also known as the mean, expectation, ~~average~~
Example: Roll a die. $S=\{1,2,3,4,5,6\}$, $Pr(\omega ) = \frac{1}{6}$
Random variable: $X(i)=i$
The random variable $X$ is simply the result of the die roll
$E(X) = \sum_{\omega \in S} Pr(\omega) \times X(\omega) = \frac{1}{6}\sum_{i=1}^6 i = \frac{1}{6} \times \frac{6 * 7}{2} = \frac{7}{2} = 3.5$
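The same computation in a couple of lines of Python, just to double-check the arithmetic:
```python
from fractions import Fraction

# E(X) for a single die roll, straight from the definition
E = sum(Fraction(1, 6) * i for i in range(1, 7))
print(E)  # 7/2
```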
## Something difficult and painful
Rolling two dice
$X(i,j) = i + j$, $Pr(\omega ) = \frac{1}{36}$
What is the expected value of rolling two dice under this definition of $X$?
$E(X) = \sum_{\omega \in S} Pr(\omega) \times X(\omega)$
$= \sum_{i=1}^6 \sum_{j=1}^6 Pr(i,j) \times X(i,j) = \frac{1}{36} \times \sum_{i=1}^6 \sum_{j=1}^6 (i+j)$ (every outcome has the same probability, so it factors out of the sum)
$= \frac{1}{36} \sum_{i=1}^6(6i + \sum_{j=1}^6 j) = \frac{1}{36} \sum_{i=1}^6 (6i + \frac{6 * 7}{2})$
$= \frac{1}{36}(\frac{6 * 6 * 7}{2} + \frac{6 * 6 * 7}{2}) = \frac{7}{2} + \frac{7}{2} = 7$
Using the other form of the definition of expected value:
$E(X) = \sum_{x \in \mathbb{R}} Pr(X=x) \times x$
$= \frac{1}{36} \times 2 + \frac{2}{36} \times 3 + \frac{3}{36} \times 4 + \frac{4}{36} \times 5 + \frac{5}{36} \times 6 + \frac{6}{36} \times 7 + \frac{5}{36} \times 8 + \frac{4}{36} \times 9 + \frac{3}{36} \times 10 + \frac{2}{36} \times 11 + \frac{1}{36} \times 12 = 7$
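Both forms of the definition can be checked directly; a small sketch (variable names are my own):
```python
from fractions import Fraction
from itertools import product
from collections import Counter

S = list(product(range(1, 7), repeat=2))   # all 36 outcomes (i, j)

# First form: sum over the outcomes
E1 = sum(Fraction(1, 36) * (i + j) for (i, j) in S)

# Second form: sum over the possible values x, weighted by Pr(X = x)
counts = Counter(i + j for (i, j) in S)    # how many outcomes give each sum
E2 = sum(Fraction(c, 36) * x for x, c in counts.items())

print(E1, E2)  # 7 7
```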
## The linearity of expectation
Lemma (Linearity of expectation): For ***any*** two random variables $X$ and $Y$ (they do not need to be independent),
$E(X + Y) = E(X) + E(Y)$
$"X+Y" = \text{A random variable } Z(\omega ) = X(\omega ) + Y(\omega )$
Ex:
$A(i, j) = i$
$B(i, j) = j$
$\tilde{X}(i,j) = i+j = A(i,j) + B(i,j)$
In other words, $\tilde{X} = A + B$
$E(\tilde{X}) = E(A+B) = E(A) + E(B)$
$E(A) = \sum_{i=1}^6\sum_{j=1}^6 Pr(i,j) \times A(i,j) = \frac{1}{36}\sum_{i=1}^6\sum_{j=1}^6 i$
$= \frac{1}{36} \frac{6 * 6 * 7}{2} = \frac{7}{2}$
$E(B) = \frac{7}{2}$ by the same calculation, so $E(\tilde{X}) = E(A) + E(B) = \frac{7}{2} + \frac{7}{2} = 7$, matching the direct computation above
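A quick numeric check of linearity on this example (again only a sketch):
```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))   # two dice
pr = Fraction(1, 36)

E_A = sum(pr * i for (i, j) in S)          # expected value of the first die
E_B = sum(pr * j for (i, j) in S)          # expected value of the second die
E_sum = sum(pr * (i + j) for (i, j) in S)  # expected value of the sum

print(E_A, E_B, E_sum)       # 7/2 7/2 7
assert E_sum == E_A + E_B    # linearity of expectation, verified on this example
```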
###### tags: `COMP2804` `Probability`