The Kullback-Leibler divergence has a closed-form solution for two Gaussians. Here I will briefly derive this solution. If you need an intuitive refresher on the KL divergence, have a look at a previous post of mine.
Assume two Gaussian distributions

$$p(x) = \mathcal{N}(x; \mu_1, \sigma_1^2) = \frac{1}{\sqrt{2\pi\sigma_1^2}} \exp\left(-\frac{(x-\mu_1)^2}{2\sigma_1^2}\right)$$

and

$$q(x) = \mathcal{N}(x; \mu_2, \sigma_2^2) = \frac{1}{\sqrt{2\pi\sigma_2^2}} \exp\left(-\frac{(x-\mu_2)^2}{2\sigma_2^2}\right).$$
The Kullback-Leibler divergence between two Gaussians is defined as

$$D_{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx = \mathbb{E}_p[\log p(x)] - \mathbb{E}_p[\log q(x)]. \tag{1}$$
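Before deriving the exact value, we can approximate the integral in equation (1) numerically. Here is a minimal sketch using only the standard library (the helper names `gauss_pdf` and `kl_numeric` are my own), which evaluates the integrand on a midpoint grid:

```python
import math

def gauss_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) at x.
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def kl_numeric(mu1, sigma1, mu2, sigma2, lo=-30.0, hi=30.0, n=200_000):
    # Midpoint Riemann-sum approximation of the integral
    # over p(x) * log(p(x) / q(x)) from equation (1).
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = gauss_pdf(x, mu1, sigma1)
        q = gauss_pdf(x, mu2, sigma2)
        if p > 0.0:
            total += p * math.log(p / q) * dx
    return total
```

This brute-force estimate is only a sanity check; the rest of the post derives the exact closed form, and the two should agree to several decimal places for moderate parameters.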
The log of a Gaussian is

$$\log \mathcal{N}(x; \mu, \sigma^2) = -\frac{1}{2} \log(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2}.$$
Let’s simplify the first term of equation (1).

$$\begin{aligned}
\mathbb{E}_p[\log p(x)] &= \mathbb{E}_p\left[-\frac{1}{2}\log(2\pi\sigma_1^2) - \frac{(x-\mu_1)^2}{2\sigma_1^2}\right] \\
&= -\frac{1}{2}\log(2\pi\sigma_1^2) - \frac{1}{2\sigma_1^2}\,\mathbb{E}_p\left[(x-\mu_1)^2\right] \\
&= -\frac{1}{2}\log(2\pi\sigma_1^2) - \frac{1}{2}.
\end{aligned}$$

In the last line we used $\mathbb{E}_p[(x-\mu_1)^2] = \sigma_1^2$.
Now we need to simplify the second term of equation (1).

$$\mathbb{E}_p[\log q(x)] = -\frac{1}{2}\log(2\pi\sigma_2^2) - \frac{1}{2\sigma_2^2}\,\mathbb{E}_p\left[(x-\mu_2)^2\right].$$

Let’s now take a look at the expectation in the last equation.

$$\begin{aligned}
\mathbb{E}_p\left[(x-\mu_2)^2\right] &= \mathbb{E}_p[x^2] - 2\mu_2\,\mathbb{E}_p[x] + \mu_2^2 \\
&= \sigma_1^2 + \mu_1^2 - 2\mu_1\mu_2 + \mu_2^2 \\
&= \sigma_1^2 + (\mu_1 - \mu_2)^2.
\end{aligned}$$

In line 2 we used the definition of variance, $\mathbb{E}_p[x^2] = \operatorname{Var}(x) + \mathbb{E}_p[x]^2 = \sigma_1^2 + \mu_1^2$. In line 3 we used the binomial theorem, $\mu_1^2 - 2\mu_1\mu_2 + \mu_2^2 = (\mu_1 - \mu_2)^2$.
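The expectation identity above can be sanity-checked with a quick Monte Carlo draw. A sketch using only the standard library (the helper name `second_moment_mc` is my own):

```python
import random

def second_moment_mc(mu1, sigma1, mu2, n=200_000, seed=0):
    # Monte Carlo estimate of E_p[(x - mu2)^2] for x ~ N(mu1, sigma1^2),
    # which the derivation says equals sigma1^2 + (mu1 - mu2)^2.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu1, sigma1)
        total += (x - mu2) ** 2
    return total / n
```

For $\mu_1 = 0$, $\sigma_1 = 1$, $\mu_2 = 1$ the estimate should hover around $\sigma_1^2 + (\mu_1 - \mu_2)^2 = 2$.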
Now let’s insert the result back so that we get

$$\begin{aligned}
D_{KL}(p \,\|\, q) &= -\frac{1}{2}\log(2\pi\sigma_1^2) - \frac{1}{2} + \frac{1}{2}\log(2\pi\sigma_2^2) + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} \\
&= \log\frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} - \frac{1}{2}.
\end{aligned}$$
Let’s check whether this reduces to zero if we compare the same two Gaussian distributions.
If $p = q$, then $\mu_1 = \mu_2$ and $\sigma_1 = \sigma_2$, so that

$$D_{KL}(p \,\|\, q) = \log\frac{\sigma_1}{\sigma_1} + \frac{\sigma_1^2 + 0}{2\sigma_1^2} - \frac{1}{2} = 0 + \frac{1}{2} - \frac{1}{2} = 0.$$
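Putting the closed form into code makes this zero check immediate. A minimal sketch (the function name `kl_gauss` is my own):

```python
import math

def kl_gauss(mu1, sigma1, mu2, sigma2):
    # Closed-form KL(N(mu1, sigma1^2) || N(mu2, sigma2^2)) derived above:
    # log(sigma2/sigma1) + (sigma1^2 + (mu1 - mu2)^2) / (2 sigma2^2) - 1/2.
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

print(kl_gauss(0.0, 1.0, 0.0, 1.0))  # identical Gaussians -> prints 0.0
```

Note the asymmetry: `kl_gauss(a, b, c, d)` and `kl_gauss(c, d, a, b)` generally differ, as expected for a KL divergence.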