Fix ch05 typos. (#287)
commit a333dfd5e9
parent b794e9bd45
@@ -987,7 +987,7 @@
 "\n",
 "Intuitively, the softmax function *really* wants to pick one class among the others, so it's ideal for training a classifier when we know each picture has a definite label. (Note that it may be less ideal during inference, as you might want your model to sometimes tell you it doesn't recognize any of the classes that it has seen during training, and not pick a class because it has a slightly bigger activation score. In this case, it might be better to train a model using multiple binary output columns, each using a sigmoid activation.)\n",
 "\n",
-"Softmax is the first part of the cross-entropy loss—the second part is log likeklihood. "
+"Softmax is the first part of the cross-entropy loss—the second part is log likelihood. "
 ]
 },
 {
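As context for the line being corrected, here is a minimal sketch (assuming PyTorch, which the notebook uses) of the relationship it states: cross-entropy loss is the softmax (taken in log space) followed by the negative log likelihood.

# Sketch only: illustrates that cross-entropy = log-softmax + negative log likelihood.
import torch
import torch.nn.functional as F

acts = torch.randn(4, 3)              # raw activations: 4 samples, 3 classes
targets = torch.tensor([0, 2, 1, 2])  # the correct class index for each sample

log_sm = F.log_softmax(acts, dim=1)   # first part: softmax, computed in log space
nll = F.nll_loss(log_sm, targets)     # second part: negative log likelihood
ce = F.cross_entropy(acts, targets)   # both steps combined in a single call

print(torch.isclose(nll, ce))         # tensor(True)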