The Kullback-Leibler divergence has a closed-form solution for two Gaussians. Here I will briefly derive this solution. If you need an intuitive refresher on the KL divergence, you can have a look at a previous post of mine.
Assume two Gaussian distributions

$$p(x) = \mathcal{N}(x \mid \mu_1, \sigma_1^2) = \frac{1}{\sqrt{2\pi\sigma_1^2}}\exp\left(-\frac{(x-\mu_1)^2}{2\sigma_1^2}\right)$$

and

$$q(x) = \mathcal{N}(x \mid \mu_2, \sigma_2^2) = \frac{1}{\sqrt{2\pi\sigma_2^2}}\exp\left(-\frac{(x-\mu_2)^2}{2\sigma_2^2}\right)$$
The Kullback-Leibler divergence between two Gaussians is defined as

$$D_{KL}(p \,\|\, q) = \mathbb{E}_p\left[\log\frac{p(x)}{q(x)}\right] = \underbrace{\mathbb{E}_p\left[\log p(x)\right]}_{\text{Term 1}} - \underbrace{\mathbb{E}_p\left[\log q(x)\right]}_{\text{Term 2}} \tag{1}$$
Preparation
The log of a Gaussian is

$$\log \mathcal{N}(x \mid \mu, \sigma^2) = -\frac{1}{2}\log\left(2\pi\sigma^2\right) - \frac{(x-\mu)^2}{2\sigma^2}$$
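As a quick numerical sanity check of this formula, one can compare it against scipy.stats.norm.logpdf. This is just a minimal sketch; the function name log_gaussian and the parameter values are my own:

```python
import numpy as np
from scipy.stats import norm

def log_gaussian(x, mu, sigma):
    # log N(x; mu, sigma^2) = -1/2 log(2 pi sigma^2) - (x - mu)^2 / (2 sigma^2)
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

x = np.linspace(-3.0, 3.0, 7)
print(np.allclose(log_gaussian(x, 0.5, 1.2), norm.logpdf(x, loc=0.5, scale=1.2)))  # True
```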
Term 1
Let’s simplify the first term of equation (1). Using the log of a Gaussian from above,

$$\mathbb{E}_p\left[\log p(x)\right] = \mathbb{E}_p\left[-\frac{1}{2}\log\left(2\pi\sigma_1^2\right) - \frac{(x-\mu_1)^2}{2\sigma_1^2}\right] = -\frac{1}{2}\log\left(2\pi\sigma_1^2\right) - \frac{1}{2\sigma_1^2}\,\mathbb{E}_p\left[(x-\mu_1)^2\right]$$

In the last step we used $\mathbb{E}_p\left[(x-\mu_1)^2\right] = \sigma_1^2$, which is just the definition of the variance of $p$. The first term therefore simplifies to

$$\mathbb{E}_p\left[\log p(x)\right] = -\frac{1}{2}\log\left(2\pi\sigma_1^2\right) - \frac{1}{2} \tag{2}$$
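If you want to convince yourself of this result numerically, a simple Monte Carlo estimate of $\mathbb{E}_p\left[\log p(x)\right]$ should match equation (2). A minimal sketch with arbitrarily chosen parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, sigma1 = 1.0, 2.0                  # arbitrary parameters for p
x = rng.normal(mu1, sigma1, 1_000_000)  # samples from p

# Monte Carlo estimate of E_p[log p(x)]
log_p = -0.5 * np.log(2 * np.pi * sigma1**2) - (x - mu1)**2 / (2 * sigma1**2)
closed_form = -0.5 * np.log(2 * np.pi * sigma1**2) - 0.5  # equation (2)
print(log_p.mean(), closed_form)  # both should be close to -2.112
```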
Term 2
Now we need to simplify the second term of equation (1). Plugging in the log of a Gaussian again,

$$\mathbb{E}_p\left[\log q(x)\right] = -\frac{1}{2}\log\left(2\pi\sigma_2^2\right) - \frac{1}{2\sigma_2^2}\,\mathbb{E}_p\left[(x-\mu_2)^2\right]$$
Let’s now just take a look at the expectation in the last equation:

$$\begin{aligned}
\mathbb{E}_p\left[(x-\mu_2)^2\right] &= \mathbb{E}_p\left[x^2\right] - 2\mu_2\,\mathbb{E}_p\left[x\right] + \mu_2^2 \\
&= \sigma_1^2 + \mu_1^2 - 2\mu_2\,\mathbb{E}_p\left[x\right] + \mu_2^2 \\
&= \sigma_1^2 + \mu_1^2 - 2\mu_2\mu_1 + \mu_2^2 \\
&= \sigma_1^2 + (\mu_1 - \mu_2)^2
\end{aligned}$$
In line 2 we used the definition of variance, $\mathbb{E}_p\left[x^2\right] = \sigma_1^2 + \mu_1^2$.
In line 4 we used the binomial theorem.
Now let’s insert the result back so that we get

$$\mathbb{E}_p\left[\log q(x)\right] = -\frac{1}{2}\log\left(2\pi\sigma_2^2\right) - \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} \tag{3}$$
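The key step here, $\mathbb{E}_p\left[(x-\mu_2)^2\right] = \sigma_1^2 + (\mu_1 - \mu_2)^2$, is also easy to verify by sampling. Again a minimal sketch with made-up parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, sigma1 = 1.0, 2.0                  # parameters of p (the sampling distribution)
mu2 = -0.5                              # mean of q

x = rng.normal(mu1, sigma1, 1_000_000)  # samples from p

mc_estimate = np.mean((x - mu2)**2)          # Monte Carlo estimate of E_p[(x - mu2)^2]
closed_form = sigma1**2 + (mu1 - mu2)**2     # sigma1^2 + (mu1 - mu2)^2
print(mc_estimate, closed_form)              # both should be close to 6.25
```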
Final
Finally we can insert the results from equation (2) and equation (3) into equation (1):

$$\begin{aligned}
D_{KL}(p \,\|\, q) &= \mathbb{E}_p\left[\log p(x)\right] - \mathbb{E}_p\left[\log q(x)\right] \\
&= -\frac{1}{2}\log\left(2\pi\sigma_1^2\right) - \frac{1}{2} + \frac{1}{2}\log\left(2\pi\sigma_2^2\right) + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} \\
&= \log\frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} - \frac{1}{2}
\end{aligned}$$
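Translating the closed form into code and comparing it against a direct numerical integration of $\int p(x)\,\log\frac{p(x)}{q(x)}\,dx$ makes for a nice end-to-end check. A minimal sketch; the function names are my own:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def kl_gaussians(mu1, sigma1, mu2, sigma2):
    # Closed form: log(sigma2/sigma1) + (sigma1^2 + (mu1 - mu2)^2) / (2 sigma2^2) - 1/2
    return (np.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

def kl_numerical(mu1, sigma1, mu2, sigma2):
    # Integrate p(x) * (log p(x) - log q(x)) over the real line
    integrand = lambda x: norm.pdf(x, mu1, sigma1) * (
        norm.logpdf(x, mu1, sigma1) - norm.logpdf(x, mu2, sigma2))
    return quad(integrand, -np.inf, np.inf)[0]

print(kl_gaussians(1.0, 2.0, -0.5, 1.5))   # closed form
print(kl_numerical(1.0, 2.0, -0.5, 1.5))   # should agree up to integration error
```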
Check
Let’s check whether this reduces to zero when we compare two identical Gaussian distributions. If $p = q$, then $\mu_1 = \mu_2$ and $\sigma_1 = \sigma_2$, so that

$$D_{KL}(p \,\|\, q) = \log\frac{\sigma_1}{\sigma_1} + \frac{\sigma_1^2 + 0}{2\sigma_1^2} - \frac{1}{2}$$

and finally

$$D_{KL}(p \,\|\, q) = 0 + \frac{1}{2} - \frac{1}{2} = 0$$
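The same check in code: plugging identical parameters into the closed form gives exactly zero. A trivial sketch with arbitrary values:

```python
import numpy as np

mu, sigma = 1.0, 2.0  # any values; p and q share them
kl = np.log(sigma / sigma) + (sigma**2 + (mu - mu)**2) / (2 * sigma**2) - 0.5
print(kl)  # 0.0
```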