# Week 2 Lecture 2
## Random Variable
For a given sample space Ω of some experiment, a **random variable** is any rule that associates a real number with each outcome in Ω.
A random variable is a **real-valued** function (its range is within the set of real numbers) whose domain is the sample space.

## Discrete and Continuous RV
A random variable X is called **discrete** if it can take only a finite number *k* of different values $x_1,\dots,x_k$ or, at most, an infinite sequence of different values $x_1, x_2, x_3,\dots$
A random variable X is called continuous if the following conditions hold:
* The set of all possible values for X is either a single interval on the real line, or a union of disjoint intervals on the real line.
* No possible value has positive probability: $Pr(X=x) = 0$ for any possible $x$
:::info
Pr(X=x) denotes the probability that the random variable X takes the value x.
:::
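As a concrete illustration of the definitions above, the following sketch (the names `omega` and `X` are illustrative, not from the lecture) treats the sample space of two coin flips and the random variable "number of heads" as an ordinary real-valued function on Ω:

```python
# Illustrative example: Ω = outcomes of two coin flips,
# X(ω) = number of heads, a real-valued function on Ω.
omega = ["HH", "HT", "TH", "TT"]

def X(outcome):
    # associates a real number with each outcome in Ω
    return outcome.count("H")

values = {w: X(w) for w in omega}
# X takes only the finite set of values {0, 1, 2}, so it is discrete
```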
### Examples
#### Discrete RV

#### Continuous RV


#### Question

There is a discrete component, the single point 2, and a continuous component, the interval [0, 1], so the random variable is neither purely discrete nor purely continuous.
## Probability Distribution
Let X be a random variable defined on the sample space Ω.
Thus, X(ω) is a real number for every outcome ω ∈ Ω.
Let C be a subset of the real line. We write {X ∈ C} to mean the set {ω ∈ Ω: X(ω) ∈ C} of all outcomes whose X-value is in C.

:::info
An event is a subset of outcomes contained in Ω. Hence, {X ∈ C} is an event.
:::
Every event is assigned a probability, so it makes sense to consider the probability of {X ∈ C}:
$Pr(X \in C ) = Pr(\{\omega\in\Omega:X(\omega)\in C\})$
The probability distribution of X is the collection of all probabilities of the form Pr(X∈C), for all sets C of real numbers.
In other words, for any set C of real numbers, the probability distribution of X gives the probability Pr(X∈C) that the random variable X takes a value in C.
## Probability Mass Function (pmf)
Let X be a discrete RV defined on the sample space Ω.
The pmf of X is a function p(x), defined for every real number x, such that
$$
\begin{eqnarray*}
p(x) &=& Pr(X=x) \\
&=& Pr(\{\omega\in\Omega:X(\omega)=x\})
\end{eqnarray*}
$$
The pmf of X completely describes the distribution of X, since
$$
Pr(X\in C)=\sum_{x\in C}p(x).
$$
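Since the pmf completely determines the distribution, $Pr(X \in C)$ can be computed by direct summation. A minimal sketch, assuming the number-of-heads pmf for two fair coin flips (the `pmf` dictionary and `prob` helper are illustrative, not from the lecture):

```python
from fractions import Fraction

# Illustrative pmf: X = number of heads in two fair coin flips
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def prob(C, pmf):
    # Pr(X in C) = sum of p(x) over all x in C
    return sum(p for x, p in pmf.items() if x in C)

at_least_one_head = prob({1, 2}, pmf)  # Pr(X >= 1) = 3/4
```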
### Example

### Bernoulli Distribution
An RV X is called a Bernoulli RV if it takes only 2 values: 0 and 1.
The pmf of X is given by
$Pr(X=1)=p$,
$Pr(X=0)=1-p$, for some 0 ≤ p ≤ 1.
We say that X is a Bernoulli RV with parameter p.
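The Bernoulli pmf can be written directly from the two probabilities above; a minimal sketch (the function name `bernoulli_pmf` is illustrative):

```python
def bernoulli_pmf(x, p):
    # Pr(X = 1) = p, Pr(X = 0) = 1 - p, zero elsewhere
    if x == 1:
        return p
    if x == 0:
        return 1 - p
    return 0.0

# Note: the two probabilities sum to 1 for any 0 <= p <= 1.
```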

### Bernoulli Process and Binomial RV
Consider an experiment (a Bernoulli process) with n repeated trials, such that:
* The trials are independent
* Each trial has only 2 outcomes: 1 (success) and 0 (failure)
* The success rate of the trials is the same (denoted by some probability p).
Let X be the number of successes among the n trials; X is called a **binomial RV**. The pmf of a binomial RV is given by
$$
p(x) = Pr(X=x) = \binom{n}{x}p^x(1-p)^{n-x}, \qquad x = 0, 1, \dots, n,
$$
where n is the number of trials, and p is the success rate of each trial.
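The binomial pmf $\binom{n}{x}p^x(1-p)^{n-x}$ can be evaluated directly with Python's standard library; a minimal sketch (the name `binomial_pmf` is illustrative):

```python
from math import comb

def binomial_pmf(x, n, p):
    # Pr(X = x) = C(n, x) * p^x * (1 - p)^(n - x), for x = 0, ..., n
    if not 0 <= x <= n:
        return 0.0
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# Sanity check: the pmf sums to 1 over x = 0, ..., n.
total = sum(binomial_pmf(x, 5, 0.3) for x in range(6))
```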

### Examples



## Probability Density Function (pdf)
The pdf of a continuous RV X is a function f(x) such that for any two numbers a and b with a ≤ b,
$$
P(a\le X \le b)= \int_a^b f(x) dx
$$
The pdf completely describes the distribution of X.

The graph of f(x) is also known as the density curve.
For a function f(x) to be a valid pdf of some continuous RV, it must satisfy the following conditions:
1. $f(x) \ge 0$ for all x. (Density cannot be negative)
2. $\int_{-\infty}^{\infty}f(x) dx= 1$ (Area under the curve of f(x) is 1)
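The two conditions can be checked numerically for a candidate density; a minimal sketch using a midpoint Riemann sum, assuming the illustrative density f(x) = 2x on [0, 1]:

```python
# Illustrative candidate density: f(x) = 2x on [0, 1], zero elsewhere.
def f(x):
    return 2 * x if 0 <= x <= 1 else 0.0

# Condition 1: f(x) >= 0 everywhere (clear from the definition).
# Condition 2: total area under f is 1 (midpoint Riemann sum).
n = 10_000
dx = 1.0 / n
area = sum(f((i + 0.5) * dx) * dx for i in range(n))
```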

### Uniform Distribution
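The standard uniform density on an interval [a, b] is $f(x) = \frac{1}{b-a}$ for $a \le x \le b$ and 0 otherwise; a minimal sketch (the helper names are illustrative):

```python
def uniform_pdf(x, a, b):
    # f(x) = 1 / (b - a) for a <= x <= b, 0 otherwise
    return 1.0 / (b - a) if a <= x <= b else 0.0

def uniform_prob(c, d, a, b):
    # P(c <= X <= d): integral of the constant density over the
    # overlap of [c, d] with [a, b]
    lo, hi = max(c, a), min(d, b)
    return max(hi - lo, 0.0) / (b - a)
```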

### Exponential Distribution
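The standard exponential density with rate λ > 0 is $f(x) = \lambda e^{-\lambda x}$ for $x \ge 0$ and 0 otherwise; integrating it over [a, b] gives $e^{-\lambda a} - e^{-\lambda b}$. A minimal sketch (helper names illustrative):

```python
from math import exp

def exponential_pdf(x, lam):
    # f(x) = lam * e^(-lam * x) for x >= 0, 0 otherwise
    return lam * exp(-lam * x) if x >= 0 else 0.0

def exponential_prob(a, b, lam):
    # P(a <= X <= b) = e^(-lam * a) - e^(-lam * b), for 0 <= a <= b
    return exp(-lam * a) - exp(-lam * b)
```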

### Examples


