Conditional Entropy

The conditional entropy measures the amount of information needed to describe the outcome of a random variable \(Y\) given that the value of another random variable \(X\) is known. The entropy of \(Y\) conditioned on \(X\) is defined as \[ H(Y|X) = -\sum_{(x,y) \in \mathcal{X} \times \mathcal{Y}} p_{(X,Y)}(x,y) \log \frac{p_{(X,Y)}(x,y)}{p_X(x)} \] Since \(p_{(X,Y)}(x,y)/p_X(x) = p_{Y|X}(y|x)\), this is the expected value of \(-\log p_{Y|X}(y|x)\) under the joint distribution; terms with \(p_{(X,Y)}(x,y) = 0\) are taken to contribute zero by convention.
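
As a concrete illustration, the sum above can be evaluated directly from a joint probability table. The Python sketch below uses a made-up joint distribution (the numbers are hypothetical, not from the text) and takes logarithms base 2, so the result is in bits; the formula itself leaves the base unspecified.

    import math

    # Hypothetical joint pmf p_{(X,Y)}(x, y); values are illustrative only.
    p_xy = {
        (0, 0): 0.25, (0, 1): 0.25,
        (1, 0): 0.40, (1, 1): 0.10,
    }

    # Marginal p_X(x) = sum over y of p_{(X,Y)}(x, y).
    p_x = {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p

    # H(Y|X) = -sum over (x,y) of p(x,y) * log( p(x,y) / p_X(x) ),
    # skipping zero-probability pairs (0 log 0 = 0 by convention).
    h_y_given_x = -sum(
        p * math.log2(p / p_x[x])
        for (x, y), p in p_xy.items()
        if p > 0
    )

    print(f"H(Y|X) = {h_y_given_x:.4f} bits")  # 0.8610 bits for this table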
