From 70858d0e342ffff4c9cca901ccd908f662979e03 Mon Sep 17 00:00:00 2001
From: Roger Labbe
Date: Wed, 29 Aug 2018 19:39:14 -0700
Subject: [PATCH] More improvements to Bayes' theorem coverage.

Mostly rearranging descriptions from chapter 2 to 3, after I cover
probability distributions. I think some more work needs to be done,
mainly I think I go on for too long on pdfs, using first cars, then
temperatures, to explain the same thing.
---
 02-Discrete-Bayes.ipynb |   8 +-
 03-Gaussians.ipynb      | 289 ++++++++++++++++++++++++++--------------
 table_of_contents.ipynb |  10 +-
 3 files changed, 198 insertions(+), 109 deletions(-)

diff --git a/02-Discrete-Bayes.ipynb b/02-Discrete-Bayes.ipynb
index 752bc0f..1625c57 100644
--- a/02-Discrete-Bayes.ipynb
+++ b/02-Discrete-Bayes.ipynb
@@ -1703,15 +1703,15 @@
     "\n",
     "We implemented the `update()` function with this probability calculation:\n",
     "\n",
-    "$$ \\mathtt{posterior} = \\frac{\\mathtt{likelihood}\\times \\mathtt{prior}}{\\mathtt{normalization}}$$ \n",
+    "$$ \\mathtt{posterior} = \\frac{\\mathtt{likelihood}\\times \\mathtt{prior}}{\\mathtt{normalization\\, factor}}$$ \n",
     "\n",
-    "We haven't developed the mathematics to discuss Bayes yet, but this is Bayes' theorem. Every filter in this book is an expression of Bayes theorem. In the next chapter we will develop the mathematics, but in many ways that obscures the simple idea expressed in this equation:\n",
+    "We haven't developed the mathematics to discuss Bayes yet, but this is Bayes' theorem. Every filter in this book is an expression of Bayes' theorem. In the next chapter we will develop the mathematics, but in many ways that obscures the simple idea expressed in this equation:\n",
     "\n",
     "$$ updated\\,knowledge = \\big\\|likelihood\\,of\\,new\\,knowledge\\times prior\\, knowledge \\big\\|$$\n",
     "\n",
     "where $\\| \\cdot\\|$ expresses normalizing the term.\n",
     "\n",
-    "We came to this with simple reasoning about a dog walking down a hallway. Yet, as we will see, the same equation applies to a universe of filtering problems. We will use this equation in every subsequent chapter.\n",
+    "We came to this with simple reasoning about a dog walking down a hallway. Yet, as we will see the same equation applies to a universe of filtering problems. We will use this equation in every subsequent chapter.\n",
     "\n",
     "Likewise, the `predict()` step computes the total probability of multiple possible events. This is known as the *Total Probability Theorem* in statistics, and we will also cover this in the next chapter after developing some supporting math.\n",
     "\n",
@@ -1792,7 +1792,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.6.5"
+   "version": "3.6.4"
   },
   "widgets": {
    "application/vnd.jupyter.widget-state+json": {
diff --git a/03-Gaussians.ipynb b/03-Gaussians.ipynb
index 78fc8df..afc8af8 100644
--- a/03-Gaussians.ipynb
+++ b/03-Gaussians.ipynb
@@ -11,7 +11,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Gaussian Probabilities"
+    "# Probabilities, Gaussians, and Bayes' Theorem"
    ]
   },
   {
@@ -36,7 +36,7 @@
     "
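
Note: the update equation in the patched markdown above, posterior =
||likelihood x prior||, can be illustrated with a minimal sketch. This
is not code from the patch; the book's actual `update()` implementation
may differ, and the hallway numbers below are invented for illustration:

    import numpy as np

    def update(likelihood, prior):
        # Elementwise product of the likelihood and the prior, then divide
        # by the sum so the posterior sums to 1; that division plays the
        # role of the normalization factor in the quoted equation.
        posterior = likelihood * prior
        return posterior / np.sum(posterior)

    # Hypothetical example: a dog at one of three hallway positions, with
    # a uniform prior and a sensor reading favoring the middle position.
    prior = np.array([1/3, 1/3, 1/3])
    likelihood = np.array([0.1, 0.8, 0.1])
    print(update(likelihood, prior))  # -> [0.1 0.8 0.1]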