
Kullback-Leibler divergence

Kullback-Leibler divergence (KL divergence), also known as relative entropy, is a method used to quantify how similar two probability distributions are. It measures how one probability distribution p diverges from a second, expected probability distribution q.

The equation used to calculate the KL divergence between two probability distributions p(x) and q(x) is as follows:

D_{KL}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}

The KL divergence reaches its minimum value of zero when p(x) is equal to q(x) at every point.

KL divergence is asymmetric: D_KL(p || q) is not, in general, equal to D_KL(q || p). Because of this, it does not measure the distance between two probability distributions and should not be used as a distance metric.
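To make these two properties concrete, here is a minimal Python sketch (assuming NumPy is available; the distributions p and q and the helper kl_divergence are made up for this illustration) that computes the divergence for small discrete distributions and shows both the zero value when the distributions are identical and the asymmetry:

import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(p || q) for discrete distributions given as arrays of probabilities."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum p(x) * log(p(x) / q(x)) over the support of p; terms with p(x) == 0 contribute 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Two example distributions over the same three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # D_KL(p || q)
print(kl_divergence(q, p))  # D_KL(q || p) -- a different value, illustrating the asymmetry
print(kl_divergence(p, p))  # 0.0, since the two distributions are identical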
