Commit Graph

1246 Commits

Author SHA1 Message Date
Roger Labbe
69cf5aebce I require that the Cholesky decomposition return an upper-triangular matrix,
but did not explain this in the text.
2016-03-26 10:09:51 -07:00
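For readers following along in the code, a minimal sketch of the constraint, assuming the matrix square root comes from scipy (scipy.linalg.cholesky returns the upper-triangular factor by default; the values below are made up):

    import numpy as np
    from scipy.linalg import cholesky

    P = np.array([[4., 1.],
                  [1., 3.]])

    # scipy returns the upper-triangular factor U with U.T @ U == P;
    # pass lower=True if a lower-triangular factor is wanted instead
    U = cholesky(P)
    assert np.allclose(U.T @ U, P)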
Roger Labbe
8fa2cf615c Regenerated a figure. 2016-03-26 09:38:51 -07:00
Roger Labbe
ba980de88f Typos. github issue #93. 2016-03-26 09:37:37 -07:00
Roger Labbe
57b420f919 Fixed equation for Gaussian mean.
Used incorrect subscript for variance in the numerator.
2016-03-26 09:27:43 -07:00
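This presumably refers to the mean of the product of two Gaussians N(mu_1, sigma_1^2) and N(mu_2, sigma_2^2), whose correct form is

    \mu = \frac{\sigma_2^2\mu_1 + \sigma_1^2\mu_2}{\sigma_1^2 + \sigma_2^2}, \qquad
    \sigma^2 = \frac{\sigma_1^2\sigma_2^2}{\sigma_1^2 + \sigma_2^2}

Note that the variance subscripts in the numerator of the mean are crossed relative to the means they multiply, which is exactly the kind of subscript error the commit describes.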
Roger Labbe
f7ee853583 Added example for 1D state vector.
I got a question about it, so I tried to add an example to clarify it.
2016-03-26 09:16:10 -07:00
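A minimal sketch of what such a 1D example might look like with filterpy (the setup and values here are illustrative, not the notebook's actual code):

    import numpy as np
    from filterpy.kalman import KalmanFilter

    # 1D state: track a single scalar with a single scalar measurement
    kf = KalmanFilter(dim_x=1, dim_z=1)
    kf.x = np.array([[0.]])      # initial state estimate
    kf.F = np.array([[1.]])      # state transition (constant model)
    kf.H = np.array([[1.]])      # measurement function
    kf.P = np.array([[500.]])    # initial state covariance
    kf.R = np.array([[5.]])      # measurement noise
    kf.Q = np.array([[0.1]])     # process noise

    for z in (1.1, 1.9, 3.2):    # made-up measurements
        kf.predict()
        kf.update(z)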
Roger Labbe
b492c258d2 Issue #88.
Misidentified the process matrices as being x, P, when they are
F, Q.
2016-03-15 17:03:05 -07:00
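For reference, the linear predict equations make the distinction clear: x and P are the state and its covariance, while F and Q parameterize the process model (control input omitted):

    \bar{\mathbf{x}} = \mathbf{F}\mathbf{x}, \qquad
    \bar{\mathbf{P}} = \mathbf{F}\mathbf{P}\mathbf{F}^\mathsf{T} + \mathbf{Q}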
Roger Labbe
38996b6d24 Merge pull request #90 from Gluttton/master
Unscented Kalman Filter: using "state sigmas" instead of "prior sigmas" in the cross-covariance equation at the predict step
2016-03-15 17:00:54 -07:00
Gluttton
ec30b65a47 Using "prior sigmas" instead of "state sigmas" in the cross-covariance equation at the predict step. 2016-03-15 22:00:28 +02:00
Roger Labbe
c32ca48542 Moved legends inside of plot.
The %matplotlib notebook backend does not account for things being
drawn outside of the plot, so they end up getting partially or
fully cut off.
2016-03-07 07:16:59 -08:00
Roger Labbe
6f1fd2f16f Updated to use absolute imports
I used to add .\code to the path, which was an absurd hack.
Now all code is imported with import code.foo.
2016-03-06 12:02:13 -08:00
Roger Labbe
fa62edccc4 Merge pull request #84 from Gluttton/master
Unscented Kalman Filter: using "state estimate" instead of "mean estimate" in the covariance equation at the predict step
2016-03-06 09:01:34 -08:00
Roger Labbe
f16b43c3b4 Reran with seaborn installed
I ran several notebooks after reinstalling Anaconda, but I hadn't
reinstalled seaborn, so the plots didn't have the right look.
2016-03-06 08:54:19 -08:00
Roger Labbe
fb21cf6313 Making book work in Python 2.7 2016-03-06 08:18:27 -08:00
Gluttton
35663774d7 Using "state estimate" instead of "mean estimate" in the covariance equation at the predict step. 2016-03-06 12:53:41 +02:00
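For reference, the predict-step covariance equation under discussion, written with the transformed sigma points Y_i = f(X_i) and the state estimate x-bar:

    \bar{\mathbf{P}} = \sum_{i=0}^{2n} w^c_i\,
        (\boldsymbol{\mathcal{Y}}_i - \bar{\mathbf{x}})
        (\boldsymbol{\mathcal{Y}}_i - \bar{\mathbf{x}})^\mathsf{T} + \mathbf{Q}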
Roger Labbe
f62fb8bbe8 Reran with tight_layout for interactive plots
This is just so everything looks nice in nbviewer. I added
plt.tight_layout() to the interactive_plot context manager,
which makes plots fill the output cell better.
2016-02-28 09:46:06 -08:00
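A sketch of the general shape of such a context manager (the book's actual helper lives in its support code and may differ in details):

    from contextlib import contextmanager
    import matplotlib.pyplot as plt

    @contextmanager
    def interactive_plot():
        # let the caller draw, then tighten the layout so the figure
        # fills the output cell before it is displayed
        try:
            yield
        finally:
            plt.tight_layout()
            plt.show()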
Roger Labbe
2594d8905c Added animated filtering
Rewrote some of the chapter examples to use animated plots. Makes
it so much easier to see how the filter is performing.
2016-02-28 08:35:51 -08:00
Roger Labbe
a647f96388 Made plots interactive 2016-02-27 21:40:21 -08:00
Roger Labbe
ef67326af6 Change plots to interactive plots 2016-02-27 20:52:36 -08:00
Roger Labbe
f3358f44d6 Switched to interactive plotting. 2016-02-27 20:04:40 -08:00
Roger Labbe
590af94807 Switching to interactive plots 2016-02-27 18:55:34 -08:00
Roger Labbe
a6c2b0ccc9 Interactive plots with %matplotlib inline
Need to make the plots animated, but the notebook is essentially
working.
2016-02-27 17:52:17 -08:00
Roger Labbe
6aea84f6b1 Switched to interactive plots!
Using %matplotlib notebook to render plots.

I made the g-h filter chapter work. There is a very good chance
I broke the other chapters. Need to push to really find out.
2016-02-27 17:10:09 -08:00
Roger Labbe
26cf805dc3 Explained std vs var in N(mu, var) formulation.
Some books use the standard deviation; I use the variance.
2016-02-18 08:50:12 -08:00
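In symbols: the book writes \mathcal{N}(\mu,\,\sigma^2), so the second parameter is the variance, whereas some texts write \mathcal{N}(\mu,\,\sigma) with the standard deviation as the second parameter.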
Roger Labbe
741d785e03 Explained biased vs unbiased estimators.
I had glossed over this difference, so it would have been confusing
why my equations for VAR and COV are incorrect for samples. I
distinguished between the two and gave the correct computation
for each.
2016-02-18 08:35:12 -08:00
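The two sample-variance formulas in question:

    \text{biased: } \sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2,
    \qquad
    \text{unbiased: } s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2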
Roger Labbe
b15968e5b1 Touched upon conditioning of variables.
Pointed out that the height of students probably has two means if the
population includes males and females. Did not go into Gaussian
mixtures or conditioning of the data.
2016-02-06 15:03:53 -08:00
Roger Labbe
5587dd0fda Completed IMM description
The text for the IMM filter was incomplete, and wrong in a few
places.
2016-02-06 15:03:19 -08:00
Roger Labbe
f6c83812a1 Github Issue #80 Invalid use of kappa
I used kappa in a couple of places for the square root, when I should
have been using lambda.
2016-02-05 07:51:43 -08:00
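For context, in the scaled sigma point formulation the square root uses lambda; kappa only appears inside the definition of lambda:

    \boldsymbol{\chi}_0 = \boldsymbol{\mu}, \qquad
    \boldsymbol{\chi}_i = \boldsymbol{\mu} \pm
        \left(\sqrt{(n+\lambda)\boldsymbol{\Sigma}}\right)_i, \qquad
    \lambda = \alpha^2(n+\kappa) - n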
Roger Labbe
f64aec3693 Copy editing.
2016-01-31 20:13:06 -08:00
Roger Labbe
1610678354 Worked on the IMM section.
It is more complete, but not finished.
2016-01-31 20:12:29 -08:00
Roger Labbe
57dda86f18 Copy editing. 2016-01-30 08:29:22 -08:00
Roger Labbe
ac2c27119b Added binder badge at top of readme
So users can learn about the online version without having to wade
through a lot of introductory text.
2016-01-30 07:58:04 -08:00
Roger Labbe
230539a3fd Fixed underlining of \emph in PDF.
Thought I had this fixed, and I'm not sure why it came back. Had to
add \normalem to the template file to force \emph to be italics,
which is the default behavior.
2016-01-28 17:48:42 -08:00
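For anyone rebuilding the PDF: \normalem is provided by the ulem package, which redefines \emph to underline unless reset, so the template needs something along these lines (a sketch, not the exact template contents):

    \usepackage{ulem}   % ulem makes \emph underline by default
    \normalem           % restore the standard italic rendering of \emph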
Roger Labbe
15ad94a5b3 Merge branch 'master' of https://github.com/rlabbe/Kalman-and-Bayesian-Filters-in-Python 2016-01-27 21:21:54 -08:00
Roger Labbe
e5fa61bbc5 Added links to source material.
Linked (mostly to Wikipedia) to web material for terms that are
newly defined.
2016-01-27 21:19:16 -08:00
Roger Labbe
9ae110077d Merge pull request #77 from Gluttton/master
Trying to fix broken layout.
2016-01-26 12:08:07 -08:00
Gluttton
0bece0a770 Trying to fix broken layout. 2016-01-26 21:46:44 +02:00
Roger Labbe
aae7bbb368 Copy edits. 2016-01-24 18:52:31 -08:00
Roger Labbe
53b058bbda Copy edit of math chapter.
I moved the conversion of the multivariate equations to univariate
equations to the supporting textbook. It's not terribly necessary,
especially since I converted the univariate equations to look like
the multivariate ones.
2016-01-24 15:34:57 -08:00
Roger Labbe
71da22ad8c Fixed eqn alignment issues. overline to bar.
Some equations used \\ without a gathered or aligned block.
They render fine in the notebook, but not in the PDF.

Also, switched my ill-chosen use of \overline back to \bar.
2016-01-24 14:08:07 -08:00
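The fix amounts to wrapping multi-line equations in an alignment environment so that \\ is valid in both the notebook and the PDF, e.g. (the equations here are illustrative):

    $$\begin{aligned}
    \bar{\mu} &= \mu + \mu_{f_x} \\
    \bar{\sigma}^2 &= \sigma^2 + \sigma^2_{f_x}
    \end{aligned}$$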
Roger Labbe
d5ea503fde More copy edits, updated for filterpy 0.1.2 2016-01-24 12:11:39 -08:00
Roger Labbe
49f604be52 Switched to stix fonts for PDF. 2016-01-23 08:37:59 -08:00
Roger Labbe
6ade9592ec Whitespace change to test git push
I got an error on my last push after some nasty Dropbox
befouling of my files. This is just to give me something to push
that isn't important.
2016-01-23 07:25:10 -08:00
Roger Labbe
c268a09f5f Rewrites for orthogonal version of eqns
The changes in the univariate chapter, where I derived the
eqn for K, continue to ripple through the other chapters.
2016-01-23 07:18:43 -08:00
Roger Labbe
d6becd7428 Ran notebooks to reflect css changes. 2016-01-18 18:41:39 -08:00
Roger Labbe
856775e906 Major rewrites due to discrete bayes changes.
I've derived the x + Ky form for the univariate Kalman filter.
I completely reordered material, cutting about 10 pages (PDF)
of material. I made the connection between the Bayesian form
and the orthogonal form more explicit.

Probably there are a lot of grammatical errors, but I wanted to get
these checked in.

I also altered the css - mainly the font.
2016-01-18 18:16:20 -08:00
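The x + Ky form referred to here, for the univariate filter, with residual y and Kalman gain K:

    y = z - \bar{\mu}, \qquad
    K = \frac{\bar{\sigma}^2}{\bar{\sigma}^2 + \sigma_z^2}, \qquad
    \mu = \bar{\mu} + Ky, \qquad
    \sigma^2 = (1 - K)\,\bar{\sigma}^2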
Roger Labbe
5240944dd4 Ran all notebooks to use new css settings. 2016-01-17 20:44:23 -08:00
Roger Labbe
0a41e78aeb Added likelihood and orthogonal projections
Added the likelihood equations/form from the discrete Bayes
chapter to better tie in that form of reasoning. Then I converted
the 1D equations to the orthogonal projection form to show how
the Kalman gain is computed and where the residual comes from
computationally. This should make the full KF equations much more
approachable.
2016-01-17 20:16:27 -08:00
Roger Labbe
005fe0618c Edits for conciseness. 2016-01-17 12:36:02 -08:00
Roger Labbe
daf7ae26e6 Reexecuted to incorporate css changes. 2016-01-17 12:06:31 -08:00
Roger Labbe
d0b4a1f4bc Generalized discrete Bayes with likelihood.
All my code in this chapter hard-coded the computation of the
likelihood inside the update() function, where it had no business.
Also, my treatment of the likelihood was rather hand-wavy. By
pulling it out of update() and making it explicit I have created
a firm foundation for the rest of the book.
2016-01-17 12:02:00 -08:00
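A sketch of the refactoring described, with the likelihood computed by the caller and update() reduced to Bayes' rule (names are illustrative; the book's actual code may differ):

    import numpy as np

    def update(likelihood, prior):
        # Bayes' rule: posterior is proportional to likelihood * prior
        posterior = likelihood * prior
        return posterior / np.sum(posterior)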