More work on multivariate Gaussians.

Not copy edited yet, but more work on making the material on
multivariate Gaussians more understandable (mostly by reordering
concepts).
Roger Labbe 2015-11-25 12:39:15 -08:00
parent a57fdfeaa9
commit 7cd8e11b57
4 changed files with 379 additions and 585 deletions

File diff suppressed because one or more lines are too long

View File

@@ -279,9 +279,9 @@
"source": [
"If you've gotten this far I hope that you are thinking that the Kalman filter's fearsome reputation is somewhat undeserved. Sure, I hand waved some equations away, but I hope implementation has been fairly straightforward for you. The underlying concept is quite straightforward - take two measurements, or a measurement and a prediction, and choose the output to be somewhere between the two. If you believe the measurement more your guess will be closer to the measurement, and if you believe the prediction is more accurate your guess will lie closer it it. That's not rocket science (little joke - it is exactly this math that got Apollo to the moon and back!). \n",
"\n",
"To be honest I have been choosing my problems carefully. For any arbitrary problem finding some of the matrices that we need to feed into the Kalman filter equations can be quite difficult. I haven't been *too tricky*, though. Equations like Newton's equations of motion can be trivially computed for Kalman filter applications, and they make up the bulk of the kind of problems that we want to solve. \n",
"To be honest I have been choosing my problems carefully. For an arbitrary problem designing the Kalman filter matrices can be extremely difficult. I haven't been *too tricky*, though. Equations like Newton's equations of motion can be trivially computed for Kalman filter applications, and they make up the bulk of the kind of problems that we want to solve. \n",
"\n",
"I have strived to illustrate concepts with code and reasoning, not math. But there are topics that do require more mathematics than I have used so far. In this chapter I will give you the math behind the topics that we have learned so far, and introduce the math that you will need to understand the topics in the rest of the book. Many topics are optional."
"I have illustrated the concepts with code and reasoning, not math. But there are topics that do require more mathematics than I have used so far. This chapter presents the math that you will need for the rest of the book."
]
},
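The blending idea described in the cell above is worth seeing as code. A minimal sketch with made-up numbers, where g plays the role of the gain that sets how far the estimate moves toward the measurement:

# minimal sketch of the blending idea: the estimate lies between
# the prediction and the measurement (all values are made up)
prediction = 10.0   # where we think the state is
measurement = 12.0  # what the sensor reports
g = 0.8             # gain: 1.0 trusts the measurement, 0.0 the prediction

estimate = prediction + g * (measurement - prediction)
print(estimate)     # 11.6 - closer to the measurement because g > 0.5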
{
@@ -290,16 +290,16 @@
"source": [
"## Computing Covariances\n",
"\n",
"You will not need to do this by hand for the rest of the book, but it is important to understand the computations behind these values. It only takes a few minutes to learn."
"You will not need to compute covariances by hand, but it is important to understand the computation. It only takes a few minutes to learn."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The **covariance** measures how much two random variables move in the same direction, and is defined as\n",
"The *covariance* measures how much two random variables move in the same direction, and is defined as\n",
"\n",
"$$ COV(x,y) = \\frac{1}{N}\\sum_{i=1}^N (x_i - \\mu_x)(y_i - \\mu_y)$$\n",
"$$ COV(x,y) = E[(X-E[X])(Y-E[Y])] = \\frac{1}{N}\\sum_{i=1}^N (x_i - \\mu_x)(y_i - \\mu_y)$$\n",
"\n",
"If we compare this to the formula for the variance of a single variable we can see where this definition came from.\n",
"\n",
@@ -2480,7 +2480,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.4.3"
"version": "3.5.0"
}
},
"nbformat": 4,

View File

@@ -37,6 +37,24 @@ def plot_height_std(x, lw=10):
    plt.show()

def plot_correlated_data(X, Y, xlabel=None,
                         ylabel=None, equal=True):
    plt.scatter(X, Y)
    if xlabel is not None:
        plt.xlabel(xlabel)
    if ylabel is not None:
        plt.ylabel(ylabel)

    # fit a least squares line through the data and plot it
    m, b = np.polyfit(X, Y, 1)
    plt.plot(X, np.asarray(X)*m + b, color='k')

    if equal:
        plt.gca().set_aspect('equal')
    plt.show()

def plot_gaussian (mu, variance,
                   mu_line=False,
                   xlim=None,
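A quick, hypothetical way to exercise the new plot_correlated_data helper with synthetic data (the slope, offset, and noise level below are made up):

import numpy as np
np.random.seed(13)
X = np.random.normal(66, 3, size=50)           # heights in inches
Y = 3.5*X - 120 + np.random.normal(0, 8, 50)   # loosely correlated weights
plot_correlated_data(X, Y, xlabel='Height (in)', ylabel='Weight (lbs)')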

View File

@@ -404,12 +404,16 @@ def plot_3_covariances():
    P = [[2, 0], [0, 2]]
    plt.subplot(131)
    plt.gca().grid(b=False)
    plt.gca().set_xticks([0,1,2,3,4])
    plot_covariance_ellipse((2, 7), cov=P, facecolor='g', alpha=0.2,
                            title='|2 0|\n|0 2|', axis_equal=False)
    plt.ylim((4, 10))
    plt.gca().set_aspect('equal', adjustable='box')

    plt.subplot(132)
    plt.gca().grid(b=False)
    plt.gca().set_xticks([0,1,2,3,4])
    P = [[2, 0], [0, 9]]
    plt.ylim((4, 10))
    plt.gca().set_aspect('equal', adjustable='box')
@@ -417,6 +421,8 @@ def plot_3_covariances():
                            axis_equal=False, title='|2 0|\n|0 9|')

    plt.subplot(133)
    plt.gca().grid(b=False)
    plt.gca().set_xticks([0,1,2,3,4])
    P = [[2, 1.2], [1.2, 2]]
    plt.ylim((4, 10))
    plt.gca().set_aspect('equal', adjustable='box')
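The same helper can be called on its own to see how a positive off-diagonal term tilts the ellipse. A sketch reusing the call pattern above; the covariance is made up, and the import location is an assumption:

# sketch: a single tilted ellipse (covariance values are made up)
from filterpy.stats import plot_covariance_ellipse  # assumed import
import matplotlib.pyplot as plt

P = [[2.0, 1.4], [1.4, 2.0]]   # positive off-diagonal -> tilted ellipse
plot_covariance_ellipse((2, 7), cov=P, facecolor='g', alpha=0.2,
                        title='|2.0 1.4|\n|1.4 2.0|')
plt.show()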
@@ -437,7 +443,7 @@ def plot_correlation_covariance():
    plt.gca().autoscale(tight=True)
    plt.axvline(7.5, ls='--', lw=1)
    plt.axhline(12.5, ls='--', lw=1)
    plt.scatter(7.5, 12.5, s=2000, alpha=0.5)
    plt.scatter(7.5, 12.5, s=1500, alpha=0.5)
    plt.title('|4.0 3.9|\n|3.9 4.0|')
    plt.show()