Jensen-Shannon Divergence

#Information Theory #Divergence

The Jensen-Shannon divergence is a symmetric divergence between two distributions $P$ and $Q$,

$$ \operatorname{D}_{\text{JS}}(P \Vert Q) = \frac{1}{2} \left[ \operatorname{D}_{\text{KL}} \left(P \bigg\Vert \frac{P+Q}{2} \right) + \operatorname{D}_{\text{KL}} \left(Q \bigg\Vert \frac{P+Q}{2}\right) \right], $$

where $\operatorname{D}_{\text{KL}}$ is the Kullback–Leibler divergence, which measures the difference between two distributions.
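The definition above can be sketched numerically for discrete distributions. This is a minimal illustration (assuming NumPy and probability vectors over the same support; the function names are chosen for this example):

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D_KL(p || q) for discrete distributions.

    Terms where p == 0 contribute 0 by the usual 0 * log(0) = 0 convention.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL divergence to the mixture (p+q)/2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(js_divergence(p, q))
```

Unlike the KL divergence, this quantity is symmetric, $\operatorname{D}_{\text{JS}}(P \Vert Q) = \operatorname{D}_{\text{JS}}(Q \Vert P)$, and (with the natural logarithm) bounded by $\ln 2$; note the mixture $(P+Q)/2$ is nonzero wherever $P$ or $Q$ is, so the KL terms are always finite.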


Lei Ma (2021). 'Jensen-Shannon Divergence', Datumorphism, 09 April. Available at: https://datumorphism.leima.is/cards/information/jensen-shannon-divergence/.
