
Commit

Clarify definition of cross-entropy metric in the documentation (clean up PR #7291) (#7365)

* [R] switch order of LRN and pooling layer

The original paper (section 3.5) applies local response normalization to the ReLU output.

* clarify definition of cross entropy

* fix small typo

* fix lint errors and trailing whitespace
hesseltuinhof authored and piiswrong committed Aug 9, 2017
1 parent 251ae71 commit 44c2bfe
Showing 1 changed file with 6 additions and 2 deletions: python/mxnet/metric.py
@@ -854,10 +854,14 @@ def update(self, labels, preds):
 
 class CrossEntropy(EvalMetric):
     """Computes Cross Entropy loss.
-    The cross entropy is given by
+    The cross entropy over a batch of sample size :math:`N` is given by
 
     .. math::
-       -y\\log \\hat{y} + (1-y)\\log (1-\\hat{y})
+       -\\sum_{n=1}^{N}\\sum_{k=1}^{K}t_{nk}\\log (y_{nk}),
+
+    where :math:`t_{nk}=1` if and only if sample :math:`n` belongs to class :math:`k`.
+    :math:`y_{nk}` denotes the probability of sample :math:`n` belonging to
+    class :math:`k`.
 
     Parameters
     ----------
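For readers of the updated docstring, a minimal NumPy sketch of the quantity it describes may help: the batch cross entropy -sum_n sum_k t_nk * log(y_nk), where the one-hot indicator t_nk reduces the inner sum to picking each sample's predicted probability for its true class. The function name cross_entropy and the eps guard below are illustrative assumptions for this sketch, not MXNet's own implementation of the metric.

import numpy as np

def cross_entropy(labels, preds, eps=1e-12):
    """Batch cross entropy for integer labels of shape (N,) and
    predicted class probabilities of shape (N, K)."""
    n = labels.shape[0]
    # Pick y_{n, t_n}: the probability assigned to each sample's true class.
    prob_true_class = preds[np.arange(n), labels.astype(np.int64)]
    # -sum_n log y_{n, t_n}; eps guards against log(0).
    return -np.log(prob_true_class + eps).sum()

# Example: two samples, three classes.
labels = np.array([0, 2])
preds = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.3, 0.6]])
print(cross_entropy(labels, preds))  # -(log 0.7 + log 0.6), about 0.867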
