Update 17_foundations.ipynb

This commit is contained in:
SOVIETIC-BOSS88 2020-06-03 22:08:32 +02:00 committed by GitHub
parent 62ac21d085
commit 7d2ae8e167
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23


@@ -1774,7 +1774,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We've seen that PyTorch computes all the gradient we need with a magic call to `loss.backward`, but let's explore what's happening behind the scenes.\n",
+"We've seen that PyTorch computes all the gradients we need with a magic call to `loss.backward`, but let's explore what's happening behind the scenes.\n",
 "\n",
 "Now comes the part where we need to compute the gradients of the loss with respect to all the weights of our model, so all the floats in `w1`, `b1`, `w2`, and `b2`. For this, we will need a bit of math—specifically the *chain rule*. This is the rule of calculus that guides how we can compute the derivative of a composed function:\n",
 "\n",
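The chain rule mentioned in the edited cell can be illustrated with a small standalone sketch (not from the notebook itself): for a composition z = f(g(x)), the derivative is dz/dx = f'(g(x)) * g'(x). The functions `g`, `f`, and their derivatives below are hypothetical examples chosen for illustration, checked against a finite-difference estimate.

```python
import math

def g(x):
    # Inner function: g(x) = x^2
    return x ** 2

def dg(x):
    # Derivative of g: g'(x) = 2x
    return 2 * x

def f(y):
    # Outer function: f(y) = sin(y)
    return math.sin(y)

def df(y):
    # Derivative of f: f'(y) = cos(y)
    return math.cos(y)

def dz_dx(x):
    # Chain rule: derivative of the composition f(g(x))
    return df(g(x)) * dg(x)

# Sanity check against a central finite-difference approximation
x0 = 3.0
h = 1e-6
numeric = (f(g(x0 + h)) - f(g(x0 - h))) / (2 * h)
print(abs(dz_dx(x0) - numeric) < 1e-5)  # → True
```

This is the same rule that `loss.backward` applies automatically, layer by layer, through the composed functions of the model.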