From a333dfd5e9ba76c99460411b15e1d0e6b5b0dda2 Mon Sep 17 00:00:00 2001
From: Jakub Duchniewicz
Date: Mon, 12 Oct 2020 11:00:31 +0200
Subject: [PATCH] Fix ch05 typos. (#287)

---
 05_pet_breeds.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/05_pet_breeds.ipynb b/05_pet_breeds.ipynb
index a6b68ad..866049f 100644
--- a/05_pet_breeds.ipynb
+++ b/05_pet_breeds.ipynb
@@ -987,7 +987,7 @@
     "\n",
     "Intuitively, the softmax function *really* wants to pick one class among the others, so it's ideal for training a classifier when we know each picture has a definite label. (Note that it may be less ideal during inference, as you might want your model to sometimes tell you it doesn't recognize any of the classes that it has seen during training, and not pick a class because it has a slightly bigger activation score. In this case, it might be better to train a model using multiple binary output columns, each using a sigmoid activation.)\n",
     "\n",
-    "Softmax is the first part of the cross-entropy loss—the second part is log likeklihood. "
+    "Softmax is the first part of the cross-entropy loss—the second part is log likelihood. "
    ]
   },
  {
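The notebook text corrected above describes cross-entropy loss as softmax followed by log likelihood. A minimal plain-Python sketch of that composition (an illustration for this note, not the book's fastai/PyTorch code; function names are our own):

```python
import math

def softmax(logits):
    # Exponentiate each activation and normalize so the outputs
    # form a probability distribution (they sum to 1).
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, target):
    # Cross-entropy = softmax, then the negative log likelihood
    # of the correct (target) class.
    probs = softmax(logits)
    return -math.log(probs[target])

# Example: three class activations, correct class is index 0.
loss = cross_entropy([2.0, 1.0, 0.1], target=0)
```

Because softmax normalizes across all classes, raising one activation necessarily lowers the probability of the others, which is exactly the "really wants to pick one class" behavior the quoted paragraph describes.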