More improvements to Bayes' theorem coverage.
Mostly moving descriptions from chapter 2 to chapter 3, so they come after I cover probability distributions. Some more work is still needed; mainly, I go on too long about pdfs, using first cars and then temperatures to explain the same thing.
parent 3270a5b686
commit 70858d0e34
@@ -1703,15 +1703,15 @@
 "\n",
 "We implemented the `update()` function with this probability calculation:\n",
 "\n",
-"$$ \\mathtt{posterior} = \\frac{\\mathtt{likelihood}\\times \\mathtt{prior}}{\\mathtt{normalization}}$$ \n",
+"$$ \\mathtt{posterior} = \\frac{\\mathtt{likelihood}\\times \\mathtt{prior}}{\\mathtt{normalization\\, factor}}$$ \n",
 "\n",
-"We haven't developed the mathematics to discuss Bayes yet, but this is Bayes' theorem. Every filter in this book is an expression of Bayes theorem. In the next chapter we will develop the mathematics, but in many ways that obscures the simple idea expressed in this equation:\n",
+"We haven't developed the mathematics to discuss Bayes yet, but this is Bayes' theorem. Every filter in this book is an expression of Bayes' theorem. In the next chapter we will develop the mathematics, but in many ways that obscures the simple idea expressed in this equation:\n",
 "\n",
 "$$ updated\\,knowledge = \\big\\|likelihood\\,of\\,new\\,knowledge\\times prior\\, knowledge \\big\\|$$\n",
 "\n",
 "where $\\| \\cdot\\|$ expresses normalizing the term.\n",
 "\n",
-"We came to this with simple reasoning about a dog walking down a hallway. Yet, as we will see, the same equation applies to a universe of filtering problems. We will use this equation in every subsequent chapter.\n",
+"We came to this with simple reasoning about a dog walking down a hallway. Yet, as we will see the same equation applies to a universe of filtering problems. We will use this equation in every subsequent chapter.\n",
 "\n",
 "Likewise, the `predict()` step computes the total probability of multiple possible events. This is known as the *Total Probability Theorem* in statistics, and we will also cover this in the next chapter after developing some supporting math.\n",
 "\n",
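The changed text above describes two operations that take very little code. Here is a minimal sketch of both, assuming a discrete belief stored in a NumPy array; the function bodies, the circular hallway, and the `p_move` parameter are illustrative assumptions, not the book's exact implementation:

```python
import numpy as np

def update(likelihood, prior):
    # Bayes' theorem: posterior = likelihood * prior / normalization factor.
    # Dividing by the sum is the ||.|| normalization in the equation above.
    posterior = likelihood * prior
    return posterior / posterior.sum()

def predict(belief, p_move):
    # Total Probability Theorem: the probability of ending at position i
    # is the sum over every way of arriving there. Here a commanded move
    # of +1 succeeds with probability p_move, otherwise the dog stays
    # put, and the hallway wraps around.
    n = len(belief)
    prior = np.zeros(n)
    for i in range(n):
        prior[i] = belief[(i - 1) % n] * p_move + belief[i] * (1 - p_move)
    return prior

# Dog-in-a-hallway example: a reading that favors door positions sharpens
# the belief, then a movement step spreads it out again.
belief = np.full(10, 0.1)                                # uniform prior
likelihood = np.array([3., 1, 1, 3, 1, 1, 1, 1, 3, 1])   # doors at 0, 3, 8
posterior = update(likelihood, belief)
prior = predict(posterior, p_move=0.8)
```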
@@ -1792,7 +1792,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.6.5"
+"version": "3.6.4"
 },
 "widgets": {
 "application/vnd.jupyter.widget-state+json": {
File diff suppressed because one or more lines are too long
@@ -24,7 +24,7 @@
 "\n",
 "Introduces the discrete Bayes filter. From this you will learn the probabilistic (Bayesian) reasoning that underpins the Kalman filter in an easy to digest form.\n",
 "\n",
-"[**Chapter 3: Gaussian Probabilities**](http://nbviewer.ipython.org/urls/raw.github.com/rlabbe/Kalman-and-Bayesian-Filters-in-Python/master/03-Gaussians.ipynb)\n",
+"[**Chapter 3: Probabilities, Gaussians, and Bayes' Theorem**](http://nbviewer.ipython.org/urls/raw.github.com/rlabbe/Kalman-and-Bayesian-Filters-in-Python/master/03-Gaussians.ipynb)\n",
 "\n",
 "Introduces using Gaussians to represent beliefs in the Bayesian sense. Gaussians allow us to implement the algorithms used in the discrete Bayes filter to work in continuous domains.\n",
 "\n",
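The renamed chapter's description, Gaussians letting the discrete Bayes filter's algorithms work in continuous domains, can be made concrete with a small sketch: a continuous belief is carried as a Gaussian's mean and variance, and the product of two Gaussians (the continuous counterpart of the discrete `update()`) is again a Gaussian. This is an assumed illustration, not code from the book:

```python
from collections import namedtuple

Gaussian = namedtuple('Gaussian', ['mean', 'var'])

def update_gaussian(prior, measurement):
    # Product of two Gaussians, renormalized: the continuous-domain
    # analogue of multiplying a discrete belief array by a likelihood.
    s = prior.var + measurement.var
    mean = (prior.var * measurement.mean + measurement.var * prior.mean) / s
    var = prior.var * measurement.var / s
    return Gaussian(mean, var)

# A vague prior fused with a more confident measurement.
belief = update_gaussian(Gaussian(mean=10.0, var=4.0),
                         Gaussian(mean=12.0, var=1.0))
print(belief)  # Gaussian(mean=11.6, var=0.8)
```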
@@ -149,9 +149,7 @@
 {
 "cell_type": "code",
 "execution_count": 1,
-"metadata": {
-"collapsed": false
-},
+"metadata": {},
 "outputs": [
 {
 "data": {
@@ -417,9 +415,9 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.5.1"
+"version": "3.6.5"
 }
 },
 "nbformat": 4,
-"nbformat_minor": 0
+"nbformat_minor": 1
 }