Notes for Probabilistic Robotics

Probability review

Let \(X\) be a r.v.

  • \(p(X=x)\)
  • Probabilities always non-negative: \(\ p(X=x)\geq0\)
  • Normal (Gaussian) dist.: \(N(x;\ \mu,\sigma^2)\)
  • pdf: \(p(x)\ =\ (2\pi\sigma^2)^{-1/2}\ \exp\{-\frac{1}{2}\frac{(x-\mu)^2}{\sigma^2}\}\)
  • multivariate pdf: \(p(x)\ =\ \det(2\pi\Sigma)^{-1/2}\ \exp\{-\frac{1}{2}(x-\mu)^T\ \Sigma^{-1}({x-\mu})\}\)
  • The pdf must integrate to 1: \(\int{\ p(x)}\ dx\ =\ 1\)
  • joint dist.: \(p(x\ ,\ y)\ =\ p(X=x\ \ \text{and}\ \ Y=y)\)
  • independence: \(p(x\ ,\ y)\ =\ p(x)\ p(y)\)
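The univariate pdf above can be checked numerically. A minimal sketch (the grid and step size are arbitrary choices, not from the notes):

```python
import math

def gaussian_pdf(x, mu, sigma2):
    """Univariate normal density N(x; mu, sigma^2), as in the formula above."""
    return (2 * math.pi * sigma2) ** -0.5 * math.exp(-0.5 * (x - mu) ** 2 / sigma2)

# Density at the mean of a standard normal is 1/sqrt(2*pi) ≈ 0.3989.
peak = gaussian_pdf(0.0, 0.0, 1.0)

# Riemann-sum check that the pdf integrates to (approximately) 1
# over [-10, 10] with step 0.01; mass outside that range is negligible.
total = sum(gaussian_pdf(k * 0.01, 0.0, 1.0) * 0.01 for k in range(-1000, 1000))
```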
Conditional Probability
  • \(p(x\ |\ y)\ =\ p(X=x\ |\ Y=y)\)
    • if \(p(y)>0\),   \(p(x\ |\ y)\ =\ \dfrac{p(x\ ,\ y)}{p(y)}\)
    • if \(X,Y\) independent, \(p(x\ |\ y)\ =\ \dfrac{p(x)\ p(y)}{p(y)}\ =\ p(x)\)
Theorem of Total Probability
  • \(p(x)\ =\ \sum\limits_{y}{\ p(x\ |\ y)\ p(y)}\) (discrete)
  • \(p(x)\ =\ \int{\ p(x\ |\ y)\ p(y)}\ dy\) (continuous)
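As a toy illustration of the discrete case, marginalizing out a weather variable (all numbers below are made up for the example):

```python
# Hypothetical prior over a discrete state y and likelihood p(x | y);
# the values are illustrative, not from any real sensor model.
p_y = {"sunny": 0.7, "rainy": 0.3}
p_x_given_y = {"sunny": 0.9, "rainy": 0.2}   # p(bright reading | weather)

# Theorem of total probability: p(x) = sum_y p(x | y) p(y)
p_x = sum(p_x_given_y[y] * p_y[y] for y in p_y)  # 0.9*0.7 + 0.2*0.3
```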
Bayes Rule
  • \(\begin{split} p(x|y)\ =\ \dfrac{p(y\ |\ x)\ p(x)}{p(y)} &=\ \dfrac{p(y\ |\ x)\ p(x)}{\sum_{x'}\ p(y\ |\ x')\ p(x')}\ \ (discrete) \\ &= \ \dfrac{p(y\ |\ x)\ p(x)}{\int{\ p(y\ |\ x')\ p(x')}\ dx'}\ \ (continuous)\end{split}\)
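The discrete form of Bayes rule can be written out directly, e.g. for estimating whether a door is open from one sensor reading (the prior and likelihood values here are illustrative assumptions):

```python
# Prior over the state x and likelihood p(y | x); illustrative numbers.
p_x = {"open": 0.5, "closed": 0.5}
p_y_given_x = {"open": 0.6, "closed": 0.3}   # p(sensor says "open" | x)

# Denominator via the theorem of total probability, then Bayes rule.
p_y = sum(p_y_given_x[x] * p_x[x] for x in p_x)
posterior = {x: p_y_given_x[x] * p_x[x] / p_y for x in p_x}
# posterior["open"] = (0.6*0.5) / 0.45 = 2/3
```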

In probabilistic robotics, we want to infer a quantity based on an input. If \(x\) is what we want to infer from \(y\), then

\(p(x)\) is referred to as the prior probability distribution and

\(y\) is called the data (e.g., sensor measurements), where

\(p(x\ |\ y)\) is called the posterior probability distribution.

Bayes rule provides a convenient way for us to compute a posterior using the "inverse" probability \(p(y\ |\ x)\) along with the prior.

\(p(y\ |\ x)\) is often called the generative model, as it describes how the state variable \(X\) causes the sensor measurement \(Y\).

It is important to note that the denominator \(p(y)\) does not depend on \(x\). Therefore it is often written as a normalizer, \(\eta\).

  • \(p(x\ |\ y)\ =\ \eta\ p(y\ |\ x)\ p(x)\)
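In code, the normalizer trick amounts to computing the unnormalized products \(p(y\,|\,x)\,p(x)\) first and dividing by their sum at the end, without ever evaluating \(p(y)\) explicitly (the numbers below are again illustrative):

```python
p_x = {"open": 0.5, "closed": 0.5}           # prior (illustrative values)
p_y_given_x = {"open": 0.6, "closed": 0.3}   # likelihood p(y | x)

# Unnormalized posterior: p(y | x) p(x), with p(y) never computed directly.
unnormalized = {x: p_y_given_x[x] * p_x[x] for x in p_x}
eta = 1.0 / sum(unnormalized.values())       # eta = 1 / p(y)
posterior = {x: eta * u for x, u in unnormalized.items()}
```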

Conditional Bayes rule on \(Z=z\):

  • \(p(x\ |\ y)\ =\ \dfrac{p(y\ |\ x,z)\ p(x\ |\ z)}{p(y\ |\ z)}\), \(p(y\ |\ z)\ >\ 0\)
  • The idea is that there may exist a third factor that causes the correlation between the two events; this third factor is called a confounder.
  • i.e. \(\text{Future}\ \perp\ \text{Past}\ \mid\ \text{Present}\) (Markov assumption)
  • More notes: TowardsDataScience