Cross Entropy

The cross entropy between two distributions over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set when the coding scheme is optimized for an assumed probability distribution \(q\) instead of the true distribution \(p\).

The cross entropy of a distribution \(q\) relative to a distribution \(p\) is defined as \[ H(p, q) = -\mathrm{E}_{x \sim p}\left[\log q(x)\right] \]
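As a minimal sketch of this definition, the snippet below computes the cross entropy (in bits, using \(\log_2\)) between two discrete distributions given as probability vectors; the function name `cross_entropy` and the example distributions are illustrative, not from the original text.

```python
import numpy as np

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -E_{x~p}[log2 q(x)], in bits.

    p, q: probability vectors over the same discrete set of events.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only events with p(x) > 0 contribute to the expectation under p.
    mask = p > 0
    return -np.sum(p[mask] * np.log2(q[mask]))

# Hypothetical example: a code optimized for q instead of the true p.
p = np.array([0.5, 0.25, 0.25])   # true distribution
q = np.array([0.25, 0.5, 0.25])   # assumed distribution
print(cross_entropy(p, p))  # 1.5 bits: the entropy of p, the optimum
print(cross_entropy(p, q))  # 1.75 bits: the penalty for coding with q
```

Note that \(H(p, q) \ge H(p, p)\), with equality only when \(q = p\); the gap between the two is the extra cost of using a code optimized for the wrong distribution.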
