Press "Enter" to skip to content

Category: General

Closed-Form Solution of the Kullback-Leibler Divergence between two Gaussians

The Kullback-Leibler Divergence has a closed-form solution for two Gaussians. Here I will briefly derive this solution. If you need an intuitive refresher on the KL divergence, have a look at a previous post of mine. Assume two Gaussian distributions – $q(x) \sim \mathcal{N}(\mu_q, \sigma_q)$ and…
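(For reference, a minimal sketch of the univariate case under the parameterisation $\mathcal{N}(\mu, \sigma)$; the function name and the SciPy-based numerical check below are illustrative and not taken from the post itself.)

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def kl_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """Closed-form KL(q || p) for univariate Gaussians q and p (in nats)."""
    return (np.log(sigma_p / sigma_q)
            + (sigma_q**2 + (mu_q - mu_p)**2) / (2 * sigma_p**2)
            - 0.5)

# Sanity check against numerical integration of q(x) * log(q(x) / p(x)).
mu_q, sigma_q, mu_p, sigma_p = 0.0, 1.0, 1.0, 2.0
q, p = norm(mu_q, sigma_q), norm(mu_p, sigma_p)
numeric, _ = quad(lambda x: q.pdf(x) * (q.logpdf(x) - p.logpdf(x)), -20, 20)
print(kl_gaussians(mu_q, sigma_q, mu_p, sigma_p), numeric)  # both ~0.4431
```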


Intuitive Explanation of the Kullback-Leibler Divergence

In this post I would like to build some intuition for the Kullback-Leibler Divergence, a measure of how different two probability distributions over the same random variable are. I’ll start with an intuitive explanation of entropy and then derive the Kullback-Leibler Divergence from it.…
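(A minimal numeric sketch of that entropy-based view, assuming discrete distributions; the example probabilities and names below are arbitrary and not taken from the post.)

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """Cross-entropy H(p, q): expected cost of encoding samples from p with a code built for q."""
    return -np.sum(p * np.log(q))

def kl(p, q):
    """KL(p || q) as the extra cost incurred by using q instead of p."""
    return cross_entropy(p, q) - entropy(p)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl(p, q))                    # ~0.025 nats
print(np.sum(p * np.log(p / q)))   # same value via the usual definition
```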
