Definition
The conditional entropy $H(Y \mid X)$ quantifies the average uncertainty remaining about $Y$ after $X$ has been observed. It is defined as the expected value of the conditional information content $-\log p(y \mid x)$:

$$H(Y \mid X) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y \mid x) = \mathbb{E}\big[-\log p(Y \mid X)\big].$$

Alternatively, $H(Y \mid X)$ is the weighted average of the entropies of $Y$ conditioned on each possible value $x$ of $X$:

$$H(Y \mid X) = \sum_{x \in \mathcal{X}} p(x)\, H(Y \mid X = x), \qquad H(Y \mid X = x) = -\sum_{y \in \mathcal{Y}} p(y \mid x) \log p(y \mid x).$$

This identity shows that the conditional entropy is the expected value of the entropy of the conditional distribution $p(Y \mid X = x)$, averaged over all possible values $x$ of $X$; the two proofs below establish it.
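As a numerical illustration of the definition, here is a minimal Python/NumPy sketch; the joint distribution `p_xy` is a hypothetical example chosen only for illustration, and base-2 logarithms are used so the result is in bits.

```python
import numpy as np

# Hypothetical joint distribution p(x, y): rows index x in {0, 1},
# columns index y in {0, 1, 2}. Entries are nonnegative and sum to 1.
p_xy = np.array([[0.20, 0.10, 0.10],
                 [0.05, 0.25, 0.30]])

p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
p_y_given_x = p_xy / p_x                # conditional p(y | x)

# Definition: H(Y|X) = -sum_{x,y} p(x, y) * log2 p(y | x),
# with the convention 0 log 0 = 0 (zero-probability terms are skipped).
mask = p_xy > 0
H_Y_given_X = -np.sum(p_xy[mask] * np.log2(p_y_given_x[mask]))
print(f"H(Y|X) = {H_Y_given_X:.4f} bits")
```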
Proof 1: Algebraic Expansion
By expanding the joint probability as $p(x, y) = p(x)\, p(y \mid x)$, the definition is rewritten as:

$$H(Y \mid X) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x)\, p(y \mid x) \log p(y \mid x) = \sum_{x \in \mathcal{X}} p(x) \left[ -\sum_{y \in \mathcal{Y}} p(y \mid x) \log p(y \mid x) \right] = \sum_{x \in \mathcal{X}} p(x)\, H(Y \mid X = x).$$
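The expansion can be checked numerically. Continuing the earlier sketch (reusing `p_xy`, `p_x`, and `p_y_given_x`), the weighted-average form yields the same value as the direct definition:

```python
# Weighted-average form: H(Y|X) = sum_x p(x) * H(Y | X=x).
with np.errstate(divide="ignore", invalid="ignore"):
    # 0 log 0 terms are replaced by 0, matching the usual convention
    terms = np.where(p_y_given_x > 0,
                     p_y_given_x * np.log2(p_y_given_x), 0.0)
H_Y_given_x = -terms.sum(axis=1)                 # H(Y | X=x) for each x
H_weighted = float(np.sum(p_x.ravel() * H_Y_given_x))
print(f"sum_x p(x) H(Y|X=x) = {H_weighted:.4f} bits")  # equals H(Y|X) above
```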
Proof 2: Law of Iterated Expectations
A more concise proof uses the property that for any function $g(X, Y)$, the global expectation is the expectation of the conditional expectations:

$$\mathbb{E}\big[g(X, Y)\big] = \mathbb{E}_X\Big[\mathbb{E}\big[g(X, Y) \mid X\big]\Big].$$

Let $g(X, Y) = -\log p(Y \mid X)$. Then:

$$H(Y \mid X) = \mathbb{E}\big[-\log p(Y \mid X)\big] = \mathbb{E}_X\Big[\mathbb{E}\big[-\log p(Y \mid X) \mid X\big]\Big].$$

For a fixed $X = x$, the inner expectation represents the entropy of the conditional distribution $p(Y \mid X = x)$:

$$\mathbb{E}\big[-\log p(Y \mid X) \mid X = x\big] = -\sum_{y \in \mathcal{Y}} p(y \mid x) \log p(y \mid x) = H(Y \mid X = x).$$

Substituting this back into the outer expectation yields:

$$H(Y \mid X) = \mathbb{E}_X\big[H(Y \mid X = x)\big] = \sum_{x \in \mathcal{X}} p(x)\, H(Y \mid X = x).$$
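Since Proof 2 is an expectation identity, it can also be illustrated by sampling: drawing pairs $(X, Y)$ from the joint distribution and averaging $-\log_2 p(Y \mid X)$ should approach $H(Y \mid X)$ as the sample grows. A sketch under the same hypothetical distribution as above:

```python
# Monte Carlo check of E[-log2 p(Y|X)] = H(Y|X):
# sample joint outcomes (x, y) with probability p(x, y), then average.
rng = np.random.default_rng(0)                      # fixed seed for reproducibility
flat = p_xy.ravel()
idx = rng.choice(flat.size, size=200_000, p=flat)   # draw indices of joint outcomes
xs, ys = np.unravel_index(idx, p_xy.shape)
estimate = np.mean(-np.log2(p_y_given_x[xs, ys]))
print(f"Monte Carlo estimate ≈ {estimate:.4f} bits")  # close to H(Y|X)
```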