- Generative Adversarial Networks Projects
- Kailash Ahirwar
Kullback-Leibler divergence
Kullback-Leibler divergence (KL divergence), also known as relative entropy, is a method used to quantify the similarity between two probability distributions. It measures how one probability distribution p diverges from a second, expected probability distribution q.
The equation used to calculate the KL divergence between two probability distributions p(x) and q(x) is as follows:

$$D_{KL}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}$$

For continuous distributions, the sum is replaced by an integral over x.
The KL divergence reaches its minimum value of zero when p(x) is equal to q(x) at every point.
Due to the asymmetric nature of KL divergence (in general, the divergence of p from q is not equal to the divergence of q from p), it should not be used as a distance metric between two probability distributions.
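The following is a minimal sketch of the KL divergence for discrete distributions, written with NumPy; the function name `kl_divergence` and the example distributions `p` and `q` are illustrative choices, not taken from the book. It also demonstrates the asymmetry mentioned above.

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(p || q) = sum_x p(x) * log(p(x) / q(x)).

    Assumes p and q are 1-D arrays of probabilities over the same support,
    each summing to 1, with q(x) > 0 wherever p(x) > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only sum over points where p(x) > 0; those terms contribute 0 otherwise.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Two example distributions over three outcomes (hypothetical values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # D_KL(p || q)
print(kl_divergence(q, p))  # D_KL(q || p) -- generally a different value
print(kl_divergence(p, p))  # 0.0 when the two distributions are identical
```

Running this shows that D_KL(p || q) and D_KL(q || p) differ, which is exactly why the KL divergence is not a true distance metric.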