Merge pull request #183 from SOVIETIC-BOSS88/patch-16

Update 17_foundations.ipynb

Small typo
This commit is contained in:
Jeremy Howard 2020-08-16 07:18:00 -07:00 committed by GitHub
commit 5e169675cf
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23


@@ -1774,7 +1774,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"We've seen that PyTorch computes all the gradient we need with a magic call to `loss.backward`, but let's explore what's happening behind the scenes.\n",
"We've seen that PyTorch computes all the gradients we need with a magic call to `loss.backward`, but let's explore what's happening behind the scenes.\n",
"\n",
"Now comes the part where we need to compute the gradients of the loss with respect to all the weights of our model, so all the floats in `w1`, `b1`, `w2`, and `b2`. For this, we will need a bit of math—specifically the *chain rule*. This is the rule of calculus that guides how we can compute the derivative of a composed function:\n",
"\n",