Fix ch05 typos. (#287)

Jakub Duchniewicz 2020-10-12 11:00:31 +02:00 committed by GitHub
parent b794e9bd45
commit a333dfd5e9

@@ -987,7 +987,7 @@
     "\n",
     "Intuitively, the softmax function *really* wants to pick one class among the others, so it's ideal for training a classifier when we know each picture has a definite label. (Note that it may be less ideal during inference, as you might want your model to sometimes tell you it doesn't recognize any of the classes that it has seen during training, and not pick a class because it has a slightly bigger activation score. In this case, it might be better to train a model using multiple binary output columns, each using a sigmoid activation.)\n",
     "\n",
-    "Softmax is the first part of the cross-entropy loss—the second part is log likeklihood. "
+    "Softmax is the first part of the cross-entropy loss—the second part is log likelihood. "
    ]
   },
   {
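
The sentence corrected by this hunk says that cross-entropy loss is softmax followed by log likelihood. As a quick illustration of that decomposition, here is a minimal PyTorch sketch; the tensor names, shapes, and values are made up for the example and are not taken from the notebook:

import torch
import torch.nn.functional as F

# Illustrative only: activations for a batch of 4 items over 3 classes,
# plus integer target labels (shapes/values chosen for this sketch).
acts = torch.randn(4, 3)
targs = torch.tensor([0, 2, 1, 2])

# First part: softmax turns raw activations into probabilities summing to 1.
probs = torch.softmax(acts, dim=1)

# Second part: negative log likelihood of the probability assigned to the
# correct class, averaged over the batch.
loss_two_step = F.nll_loss(probs.log(), targs)

# PyTorch's cross_entropy fuses both steps (via log_softmax internally).
loss_fused = F.cross_entropy(acts, targs)

print(torch.isclose(loss_two_step, loss_fused))  # tensor(True)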