Derivation of the Differential Entropy of a Gaussian Random Vector
Assume $X$ is an $n$-dimensional Gaussian random vector, denoted as $X \sim \mathcal{N}(\mu, \Sigma)$, where $\mu \in \mathbb{R}^n$ is the mean vector and $\Sigma \in \mathbb{R}^{n \times n}$ is the symmetric positive-definite covariance matrix. The multivariate probability density function (PDF) is defined as:

$$f(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\!\left( -\frac{1}{2} (x - \mu)^\top \Sigma^{-1} (x - \mu) \right)$$
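As a quick numerical sanity check of this density (a sketch assuming NumPy and SciPy are available; the specific values of $n$, $\mu$, $\Sigma$, and $x$ below are arbitrary illustrations), the closed-form expression can be compared against `scipy.stats.multivariate_normal.pdf`:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary illustrative parameters; Sigma is symmetric and
# diagonally dominant, hence positive-definite.
n = 3
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.5, 0.2],
                  [0.1, 0.2, 1.0]])
x = np.array([0.5, -1.0, 1.0])

# Closed-form PDF, term by term as in the formula above
diff = x - mu
quad = diff @ np.linalg.inv(Sigma) @ diff
pdf_manual = np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** n * np.linalg.det(Sigma))

# Reference value computed by SciPy
pdf_scipy = multivariate_normal(mean=mu, cov=Sigma).pdf(x)
print(pdf_manual, pdf_scipy)  # the two values agree to floating-point precision
```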
The differential entropy (in nats) is derived by evaluating the expected value of the negative log-likelihood:

$$h(X) = -\mathbb{E}[\ln f(X)] = \frac{n}{2} \ln(2\pi) + \frac{1}{2} \ln|\Sigma| + \frac{1}{2} \mathbb{E}\!\left[ (X - \mu)^\top \Sigma^{-1} (X - \mu) \right]$$
To evaluate the expectation of the quadratic form, the trace trick is applied. Since the quadratic form is a scalar, it is equal to its own trace ($a = \operatorname{tr}(a)$ for scalar $a$), and the trace is invariant under cyclic permutation:

$$\mathbb{E}\!\left[ (X - \mu)^\top \Sigma^{-1} (X - \mu) \right] = \mathbb{E}\!\left[ \operatorname{tr}\!\left( \Sigma^{-1} (X - \mu)(X - \mu)^\top \right) \right] = \operatorname{tr}\!\left( \Sigma^{-1} \Sigma \right) = \operatorname{tr}(I_n) = n$$
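The trace-trick identity can be verified empirically with a Monte Carlo estimate (a sketch assuming NumPy; the covariance matrix and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
mu = np.zeros(n)
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.5, 0.2],
                  [0.1, 0.2, 1.0]])

# Draw samples and estimate E[(X - mu)^T Sigma^{-1} (X - mu)]
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Sigma_inv = np.linalg.inv(Sigma)
# Per-sample quadratic form: sum_jk (X-mu)_j * Sigma_inv_jk * (X-mu)_k
quad_forms = np.einsum('ij,jk,ik->i', X - mu, Sigma_inv, X - mu)
print(quad_forms.mean())  # close to n = 3, as the trace trick predicts
```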
Substituting this result back into the entropy expression:

$$h(X) = \frac{n}{2} \ln(2\pi) + \frac{1}{2} \ln|\Sigma| + \frac{n}{2}$$
The final expression for the differential entropy of an $n$-dimensional Gaussian vector is:

$$h(X) = \frac{n}{2} \ln(2\pi e) + \frac{1}{2} \ln|\Sigma| = \frac{1}{2} \ln\!\left( (2\pi e)^n |\Sigma| \right)$$
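The closed form can be checked against SciPy's built-in entropy computation (a sketch assuming NumPy and SciPy; the covariance matrix is an arbitrary illustration):

```python
import numpy as np
from scipy.stats import multivariate_normal

n = 3
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.5, 0.2],
                  [0.1, 0.2, 1.0]])

# h(X) = (n/2) ln(2*pi*e) + (1/2) ln|Sigma|, in nats
h_formula = 0.5 * n * np.log(2 * np.pi * np.e) + 0.5 * np.log(np.linalg.det(Sigma))

# SciPy computes the same quantity directly; note the mean does not enter
h_scipy = multivariate_normal(mean=np.ones(n), cov=Sigma).entropy()
print(h_formula, h_scipy)  # the two values agree to floating-point precision
```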
Note
This result highlights that the entropy of a multivariate Gaussian depends solely on the dimension $n$ and the determinant of the covariance matrix $\Sigma$, reinforcing the principle that mean shifts ($\mu$) do not affect the information content.
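The mean-invariance can be demonstrated directly (a sketch assuming NumPy and SciPy; the covariance and the two mean vectors are arbitrary illustrations):

```python
import numpy as np
from scipy.stats import multivariate_normal

Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.5]])

# Same covariance, very different means: the entropies coincide
h_zero = multivariate_normal(mean=np.zeros(2), cov=Sigma).entropy()
h_shift = multivariate_normal(mean=np.array([100.0, -50.0]), cov=Sigma).entropy()
print(h_zero, h_shift)  # identical values: a mean shift does not change h(X)
```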