- Generative Adversarial Networks Projects
- Kailash Ahirwar
Kullback-Leibler divergence
Kullback-Leibler divergence (KL divergence), also known as relative entropy, is a method used to quantify the similarity between two probability distributions. It measures how one probability distribution p diverges from a second, expected probability distribution q.
The equation used to calculate the KL divergence between two probability distributions p(x) and q(x) is as follows:

$$D_{KL}(p \,\|\, q) = \int_{-\infty}^{\infty} p(x) \log\frac{p(x)}{q(x)}\,dx$$
The KL divergence is zero, its minimum value, when p(x) is equal to q(x) at every point.
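As a concrete illustration (a minimal sketch, not code from the book), the discrete form of the formula can be evaluated directly with NumPy; the function name kl_divergence and the example distributions below are arbitrary choices. The value is small but positive when p and q differ, and exactly zero when they are identical.

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) for two discrete distributions with strictly positive entries."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum of p(x) * log(p(x) / q(x)) over all outcomes x.
    return np.sum(p * np.log(p / q))

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.3, 0.5, 0.2])

print(kl_divergence(p, q))  # small positive value: p diverges slightly from q
print(kl_divergence(p, p))  # 0.0: the divergence vanishes when p(x) equals q(x) everywhere
```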
Because the KL divergence is asymmetric, that is, $D_{KL}(p \,\|\, q)$ is generally not equal to $D_{KL}(q \,\|\, p)$, it should not be used as a distance metric between two probability distributions.
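The asymmetry can be checked directly. The short sketch below (again illustrative, not code from the book) uses scipy.stats.entropy, which returns the relative entropy D_KL(pk || qk) when a second distribution qk is supplied; the distributions p and q are the same arbitrary examples as above.

```python
import numpy as np
from scipy.stats import entropy

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.3, 0.5, 0.2])

# entropy(pk, qk) computes the KL divergence D_KL(pk || qk).
print(entropy(p, q))  # D_KL(p || q)
print(entropy(q, p))  # D_KL(q || p): generally a different value, hence not a distance metric
```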