# Notes for Probabilistic Robotics
## Probability review
Let \(X\) be a random variable (r.v.) and \(x\) a specific value it may take; we abbreviate \(p(X = x)\) as \(p(x)\).
### Conditional Probability
$$p(x \mid y) = \frac{p(x, y)}{p(y)}$$
### Theorem of Total Probability
$$p(x) = \sum_{y} p(x \mid y)\, p(y) \quad \text{(discrete case)}, \qquad p(x) = \int p(x \mid y)\, p(y)\, \mathrm{d}y \quad \text{(continuous case)}$$
### Bayes Rule
$$p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}$$
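One way to see where this comes from: the definition of conditional probability lets the joint distribution factor either way, so

$$p(x \mid y)\, p(y) = p(x, y) = p(y \mid x)\, p(x) \;\Longrightarrow\; p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}.$$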
In probabilistic robotics, we want to infer a quantity of interest from data. If \(x\) is what we want to infer from \(y\), then
- \(p(x)\) is referred to as the prior probability distribution,
- \(y\) is called the data (e.g., sensor measurements),
- \(p(x\ |\ y)\) is called the posterior probability distribution, and
- \(p(y\ |\ x)\) is often called the generative model, as it describes how the state variables \(X\) cause the sensor measurements \(Y\).
It is important to note that the denominator \(p(y)\) does not depend on \(x\). It is therefore often written as a normalizer \(\eta\), so that \(p(x\ |\ y) = \eta\, p(y\ |\ x)\, p(x)\).
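As a concrete illustration of the update \(p(x\ |\ y) = \eta\, p(y\ |\ x)\, p(x)\), the short Python sketch below applies Bayes rule to a two-state example; the state names and all probability values are hypothetical, chosen only to show how \(\eta\) is obtained by normalizing over all states.

```python
# Minimal discrete Bayes update: posterior ∝ p(y | x) · p(x), then normalize.
# The two states and all numbers below are made up for illustration only.

states = ["open", "closed"]                  # hypothetical states x
prior = {"open": 0.5, "closed": 0.5}         # p(x)
likelihood = {"open": 0.6, "closed": 0.2}    # p(y | x) for one fixed measurement y

# Unnormalized posterior: p(y | x) * p(x)
unnormalized = {x: likelihood[x] * prior[x] for x in states}

# Normalizer eta = 1 / p(y), with p(y) = sum_x p(y | x) p(x)  (total probability)
eta = 1.0 / sum(unnormalized.values())

# Posterior p(x | y) = eta * p(y | x) * p(x)
posterior = {x: eta * unnormalized[x] for x in states}
print(posterior)   # {'open': 0.75, 'closed': 0.25}
```

Because \(\eta\) only rescales, the posterior sums to one without ever computing \(p(y)\) explicitly beyond this normalization step.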
Bayes rule conditioned on \(Z = z\):